Method and device for controlling display in game, electronic equipment and readable storage medium

Info

Publication number: CN118341080A
Application number: CN202410382652.XA
Authority: CN (China)
Legal status: Pending
Inventors: 关磊, 魏翰林, 胡戌涛, 陈涛
Assignee (current and original): Netease Hangzhou Network Co Ltd
Original language: Chinese (zh)
Abstract

The application discloses a method, a device, an electronic device and a readable storage medium for controlling display in a game. The method comprises: providing a movement control area on a first side of a graphical user interface and providing a viewing angle control area on a second side of the graphical user interface; in response to a triggering operation for the controlled virtual character to enter an automatic movement state, displaying a first viewing angle control on the first side of the graphical user interface; in response to a first sliding operation on the first viewing angle control, controlling a shooting direction of a virtual camera and a moving direction of the controlled virtual character according to the first sliding operation; and in response to a second sliding operation on the viewing angle control area, adjusting the shooting direction of the virtual camera according to the second sliding operation. The application makes it convenient for the user to smoothly execute complex control operations.

Description

Method and device for controlling display in game, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and apparatus for controlling display in a game, an electronic device, and a computer readable storage medium.
Background
In some action games, controlling the virtual character to enter an automatic movement state (such as an automatic running state) is an indispensable gameplay function. In the related art, when the player drags the virtual wheel upward with the left hand and a certain condition is met, an automatic movement mark appears; when the player slides to the automatic movement mark and stays there for a preset time, the virtual character is controlled to enter the automatic movement state. While the virtual character is in the running state, the player changes the orientation of the virtual character by sliding the screen with the right hand.
However, in the above related art, when the virtual character is in the automatic movement state, the shooting direction of the virtual camera, the orientation of the virtual character, and other operations such as attacking or changing the motion posture all need to be controlled by the right hand, and the right hand cannot handle two or more of these operations at the same time.
Disclosure of Invention
The present application provides a display control method, apparatus, electronic device, and computer-readable storage medium in a game, thereby overcoming, at least to some extent, the problem of unsmooth operation due to the limitations of the related art.
In a first aspect, an embodiment of the present application provides a display control method in a game, the method including:
providing a movement control area on a first side of a graphical user interface, and providing a viewing angle control area on a second side of the graphical user interface;
in response to a triggering operation for the controlled virtual character to enter an automatic movement state, displaying a first viewing angle control on the first side of the graphical user interface;
in response to a first sliding operation on the first viewing angle control, controlling a shooting direction of a virtual camera and a moving direction of the controlled virtual character according to the first sliding operation;
in response to a second sliding operation on the viewing angle control area, adjusting the shooting direction of the virtual camera according to the second sliding operation.
In a second aspect, an embodiment of the present application provides a display control apparatus in a game, the apparatus including:
a first display module, configured to provide a movement control area on a first side of a graphical user interface and provide a viewing angle control area on a second side of the graphical user interface;
a second display module, configured to display a first viewing angle control on the first side of the graphical user interface in response to a triggering operation for the controlled virtual character to enter an automatic movement state;
a control module, configured to control, in response to a first sliding operation on the first viewing angle control, a shooting direction of a virtual camera and a moving direction of the controlled virtual character according to the first sliding operation;
a processing module, configured to adjust, in response to a second sliding operation on the viewing angle control area, the shooting direction of the virtual camera according to the second sliding operation.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory and a processor, the memory and the processor coupled;
The memory is used for storing one or more computer instructions;
The processor is configured to execute the one or more computer instructions to implement the method for controlling display in a game according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium having stored thereon one or more computer instructions, wherein the instructions are executed by a processor to implement the method for controlling display in a game according to any one of the first aspects.
In a fifth aspect, an embodiment of the present application provides a computer program product, including a computer program, which when executed by a processor implements the display control method in a game according to any one of the first aspects above.
Compared with the prior art, the application has the following advantages:
The application provides a display control method in a game, including: providing a movement control area on a first side of a graphical user interface, and providing a viewing angle control area on a second side of the graphical user interface; in response to a triggering operation for the controlled virtual character to enter an automatic movement state, displaying a first viewing angle control on the first side of the graphical user interface; in response to a first sliding operation on the first viewing angle control, controlling a shooting direction of a virtual camera and a moving direction of the controlled virtual character according to the first sliding operation; and in response to a second sliding operation on the viewing angle control area, adjusting the shooting direction of the virtual camera according to the second sliding operation.
Compared with the prior art, in the application, in response to the controlled virtual character entering the automatic movement state, a first viewing angle control is additionally displayed on the first side of the graphical user interface. On the one hand, the user can synchronously adjust the shooting direction of the virtual camera and the moving direction of the controlled virtual character by performing the first sliding operation on the first viewing angle control, and can at the same time perform other operations such as an attack operation or a movement-changing operation, so that different operations are organically combined, the complexity of performing a series of different actions is reduced, and complex operations can be completed smoothly. On the other hand, the user can adjust the shooting direction of the virtual camera by performing the second sliding operation on the viewing angle control area displayed on the second side while the moving direction of the controlled virtual character remains unchanged, so that the player can more flexibly adjust the viewing angle to observe the surrounding environment without affecting the controlled virtual character continuing to move in its original direction.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of a method for controlling display in a game according to an embodiment of the application;
FIG. 2 is a diagram of a graphical user interface according to one embodiment of the present application;
FIG. 3 is a schematic diagram illustrating operation of a first view control according to an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating operation of a first view control according to another embodiment of the present application;
FIG. 5 is a schematic diagram illustrating adjustment of a shooting direction of a virtual camera through a viewing angle control area according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating controlling a shooting direction of a virtual camera according to a third sliding operation according to an embodiment of the present application;
FIG. 7 is a schematic diagram of triggering a controlled avatar to enter an auto-move state according to one embodiment of the present application;
FIG. 8 is a schematic diagram of a movement control area according to an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating a controlled avatar exiting a free-movement state according to one embodiment of the present application;
FIG. 10 is a schematic diagram of a display control device in a game according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
The application will be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the application are shown. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The described embodiments are only some, not all, of the embodiments of the application; all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the scope of the application.
It should be noted that in the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying any particular order or sequence. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances. Furthermore, in the description of the present application, the term "plurality" means two or more, unless otherwise indicated. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such a process, method, article, or apparatus.
In order to facilitate understanding of the technical solution of the present application, related concepts related to the present application will be described first.
Virtual scene: the virtual environment provided by the application program corresponding to the game when it runs on the electronic device, which can be displayed on the display screen of the terminal device for the user to view. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment, such as a fictitious game environment, or a virtual reality environment formed by superimposing a fictitious game environment on a real environment. The virtual scene may be a two-dimensional virtual scene or a three-dimensional virtual scene.
Virtual character: a game character played by the game user in the game. In one embodiment of the application, the character selected and controlled by the user may be referred to as the character representing the user's avatar. The user may control the selected virtual character to perform skills or operations such as walking, climbing, flipping, jumping, running, shooting and the like in the game virtual scene. This is merely an example, and the embodiments of the present application are not limited in any way.
In order to solve the problems of the related art described above, the present application provides a display control method in a game, an in-game display control apparatus corresponding to the method, an electronic device capable of implementing the in-game display control method, and a computer-readable storage medium. The following provides detailed descriptions of the above methods, apparatuses, electronic devices, and computer-readable storage media.
The display control method in the game provided by an embodiment of the application can run on a terminal device or a server. The terminal device may be a smart phone, a tablet computer, a notebook computer, a touch-screen device, a game console, or other terminal equipment, and may also run a client, where the client may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system composed of a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and artificial intelligence platforms.
When the display control method in the game runs on the server, the method can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes the server and a client device. The terminal device is any device having computing hardware capable of supporting and running a software application product corresponding to the game. The game software application product includes, but is not limited to, any of a first-person game application, a third-person game application, a single-player game application, and a multiplayer online tactical competitive game application. The types of the above games may include, but are not limited to, at least one of: a two-dimensional game application, a three-dimensional game application, a virtual reality game application, an augmented reality game application, and a mixed reality game application. The above is merely an example, and the embodiments of the present application are not limited in any way.
In an alternative embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking cloud games as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the running body of the game program and the body that presents the game picture are separated: the storage and running of the display control method in the game are completed on a cloud game server, and the function of the client device is to receive and send data and to present the game picture. For example, the client device may be a display device close to the user side with a data transmission function, such as a mobile terminal, a television, a computer, or a palmtop computer, while the information processing is performed by the cloud game server in the cloud. When playing the game, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game picture.
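By way of illustration only, the cloud-game data flow described above (the client forwards operation instructions, the cloud game server runs the game and encodes the picture, and the client decodes and presents it) might be sketched in TypeScript as follows. All names, such as CloudGameServer and ClientDevice, are hypothetical and do not form part of the disclosure; real video encoding and network transport are replaced by placeholders.

```typescript
// Hypothetical sketch of the cloud-game round trip: the client only forwards operation
// instructions and presents decoded pictures; the cloud game server runs the game logic.

interface OperationInstruction { kind: "slide" | "click"; x: number; y: number; }
interface EncodedFrame { payload: string; } // stands in for an encoded, compressed video frame

class CloudGameServer {
  // Runs the game according to the operation instruction and returns an encoded picture.
  handle(op: OperationInstruction): EncodedFrame {
    const picture = `frame after ${op.kind} at (${op.x}, ${op.y})`;
    return { payload: picture.split("").reverse().join("") }; // toy "encoding"
  }
}

class ClientDevice {
  constructor(private server: CloudGameServer) {}
  // Sends the player's operation, then decodes and "displays" the returned frame.
  sendOperation(op: OperationInstruction): void {
    const frame = this.server.handle(op);
    const decoded = frame.payload.split("").reverse().join(""); // toy "decoding"
    console.log("display:", decoded);
  }
}

new ClientDevice(new CloudGameServer()).sendOperation({ kind: "slide", x: 10, y: 0 });
```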
In an alternative embodiment, taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The manner in which the local terminal device provides the graphical user interface to the player may include a variety of ways; for example, the interface may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In the embodiment of the application, the player can observe the virtual scene from various perspectives and control the virtual character to play the game. The perspective refers to a shooting angle preset for a virtual camera in the game; the virtual scene is shot from the perspective of the virtual camera, and the shot game virtual scene is provided for the user to observe. The first-person perspective refers to the shooting perspective of a first virtual camera bound to the virtual character, the second-person perspective refers to the shooting perspective of a second virtual camera that shoots a target scene area, and the third-person perspective refers to the shooting perspective of a third virtual camera used for global shooting. The position of the first virtual camera changes with the position of the bound virtual character, the position of the second virtual camera changes with the position of the target scene area, and the third virtual camera, which shoots the entire virtual scene, does not need to change its position. Here, the virtual camera is a three-dimensional model around the virtual character in the virtual scene. When the first-person perspective is adopted, the virtual camera may be located near or at the head of the virtual character; when the third-person perspective is adopted, the virtual camera may be located behind the virtual character.
In an alternative embodiment, the first person view angle may be used for displaying, and the displayed virtual scene may only include the hand, the arm, or the prop held in the hand of the virtual character, so that the effect of observing the virtual scene through the view angle of the virtual character can be simulated.
In another alternative embodiment, a third-person perspective may be used for display. The third-person perspective is largely consistent with the first-person perspective, except that in the third-person perspective the virtual scene additionally displays the virtual character, facing away from the terminal screen, so that the player can see the actions of the virtual character he or she controls, its surrounding environment, and so on. The shooting direction of the virtual camera is the observation direction when the virtual scene is observed from the first-person or third-person perspective of the virtual character.
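Purely as an illustrative sketch (the names, offsets, and coordinate convention are assumptions rather than part of the disclosure), the placement of the virtual camera for the first-person and third-person perspectives described above could be expressed as:

```typescript
// Minimal sketch: position a virtual camera relative to the bound virtual character.
type Vec3 = { x: number; y: number; z: number };
type Perspective = "first-person" | "third-person";

interface VirtualCharacter { position: Vec3; headOffset: Vec3; }

// Assumed offsets: at or near the head for first person, behind the character for third person.
function cameraPosition(character: VirtualCharacter, perspective: Perspective): Vec3 {
  const p = character.position;
  if (perspective === "first-person") {
    return { x: p.x + character.headOffset.x, y: p.y + character.headOffset.y, z: p.z + character.headOffset.z };
  }
  return { x: p.x, y: p.y + 1.5, z: p.z - 3 }; // hypothetical "behind and above" offset
}

const hero: VirtualCharacter = { position: { x: 0, y: 0, z: 0 }, headOffset: { x: 0, y: 1.7, z: 0 } };
console.log(cameraPosition(hero, "third-person"));
```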
In the method for controlling the display in the game provided by the application, a graphical user interface can be provided through a terminal device for displaying the game interface, wherein the terminal device can be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
Next, referring to fig. 1, a method for controlling display in a game according to an embodiment of the present application will be described, and fig. 1 is a flowchart of a method for controlling display in a game according to an embodiment of the present application.
As shown in fig. 1, the display control method in the game includes steps S10 to S40:
S10, providing a movement control area on a first side of the graphical user interface and providing a viewing angle control area on a second side of the graphical user interface.
In the embodiment of the application, a part of the game scene is displayed on the graphical user interface, and a movement control area and a viewing angle control area are arranged on a layer above the displayed part of the game scene. The movement control area is arranged on a first side of the graphical user interface, and the viewing angle control area is arranged on a second side of the graphical user interface.
The first side and the second side of the graphical user interface may be the left and right areas of the graphical user interface, the right and left areas, the upper and lower areas, or the lower and upper areas, respectively. These are merely examples; the embodiment of the present application does not limit the specific regions of the graphical user interface in which the first side and the second side are located, as long as the player can perform the corresponding operations on the first side and the second side at the same time.
The movement control area is used to receive a touch operation initiated through a touch medium and to control the controlled virtual character to move. A virtual joystick control, a virtual wheel control, or another control that can control movement may be displayed in the movement control area; this is merely an example, and the embodiment of the present application is not limited in any way. The controlled virtual character is the virtual game character controlled by the current user through the held terminal device. The user may perform a movement control operation in the movement control area; the movement control operation may be, for example, a movement operation that controls the controlled virtual character to move in the game scene according to the movement trajectory or the click position of the user's operation in the movement control area.
The viewing angle control area is used to receive a touch operation initiated through the touch medium and to control the shooting direction of the virtual camera. The viewing angle control area may be located in a blank region on the second side of the graphical user interface, the blank region being a region of the second side of the graphical user interface in which no operable control is located. The shooting direction of the virtual camera refers to the direction in which the virtual camera shoots the game virtual scene, that is, the specific direction from which the virtual camera shoots the game virtual scene. Through the viewing angle control area, the player may change the shooting direction of the virtual camera in the game by means of a virtual control, by sliding on the screen, or through another control device, so as to observe the surrounding game virtual scene.
In the following, the provision of a movement control area and a viewing angle control area on the first side and the second side of the graphical user interface, respectively, is described with reference to fig. 2 and a specific example. Fig. 2 is a schematic diagram of a graphical user interface according to an embodiment of the present application.
As shown in fig. 2, a part of a game scene is displayed on the graphical user interface 200. A movement control area 201 is provided on a first side (e.g., the left side) of the graphical user interface, and the user triggers the controlled virtual character 301 to move by performing a movement control operation within the movement control area. Specifically, as shown in fig. 2, a movement control 204 is provided in the movement control area 201, and the user can perform a triggering operation on the movement control 204. A viewing angle control area 202 is provided on a second side (e.g., the right side) of the graphical user interface, and the user triggers a change of the shooting direction of the virtual camera by performing a triggering operation within the viewing angle control area 202. Specifically, the user can change the shooting direction of the virtual camera by performing a sliding operation with a finger in the viewing angle control area 202.
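The split of the graphical user interface into the two control areas of fig. 2, and the routing of a touch to the area it falls in, might be sketched as follows. This is a minimal illustration; the concrete resolution, the left/right assignment, and all identifiers are assumptions, not part of the disclosure.

```typescript
// Hypothetical sketch: hit-testing a touch point against the two control areas of fig. 2.
interface Rect { x: number; y: number; width: number; height: number; }
interface Touch { x: number; y: number; }

const movementControlArea: Rect = { x: 0, y: 540, width: 640, height: 540 };      // first side (left)
const viewingAngleControlArea: Rect = { x: 960, y: 0, width: 960, height: 1080 }; // second side (right)

function contains(r: Rect, t: Touch): boolean {
  return t.x >= r.x && t.x < r.x + r.width && t.y >= r.y && t.y < r.y + r.height;
}

function routeTouch(t: Touch): "movement" | "viewing-angle" | "none" {
  if (contains(movementControlArea, t)) return "movement";          // drives the controlled character
  if (contains(viewingAngleControlArea, t)) return "viewing-angle"; // drives the virtual camera
  return "none";
}

console.log(routeTouch({ x: 1500, y: 300 })); // "viewing-angle"
```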
S20, in response to a triggering operation for the controlled virtual character to enter the automatic movement state, displaying a first viewing angle control on the first side of the graphical user interface.
The controlled virtual character entering the automatic movement state refers to a state in which the controlled virtual character moves automatically in the game or virtual environment without direct control by the player. Common automatic movement states of the virtual character include, but are not limited to, an automatic sprint state, an automatic follow state, an automatic jogging state, and an automatic walking state; these are merely examples, and the embodiments of the present application are not limited in any way.
In the embodiment of the application, after the player controls the controlled virtual character to enter the automatic moving state, a first visual angle control is displayed on a first side of the graphical user interface in response to the controlled virtual character entering the automatic moving state. The first visual angle control is used for synchronously adjusting the shooting direction of the virtual camera and the moving direction of the controlled virtual character. After the controlled virtual character enters the automatic moving state, the player can synchronously adjust the shooting direction of the virtual camera and the moving direction of the controlled virtual character by executing triggering operation on a first visual angle control displayed on a first side of the graphical user interface.
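A minimal sketch of step S20, assuming a simple observer of the character's movement state (the identifiers are hypothetical and for illustration only):

```typescript
// Hypothetical sketch: display the first viewing angle control on the first side
// of the GUI as soon as the controlled character enters the automatic movement state.
type MoveState = "manual" | "auto";

class Hud {
  firstViewingAngleControlVisible = false;
  onMoveStateChanged(state: MoveState): void {
    // S20: the control is shown only while the character moves automatically.
    this.firstViewingAngleControlVisible = state === "auto";
  }
}

const hud = new Hud();
hud.onMoveStateChanged("auto");
console.log(hud.firstViewingAngleControlVisible); // true
```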
S30, in response to a first sliding operation on the first viewing angle control, controlling the shooting direction of the virtual camera and the moving direction of the controlled virtual character according to the first sliding operation.
When the terminal device is a touch screen device such as a mobile phone, a tablet computer, a game machine, etc., the first sliding operation may be a sliding operation of sliding the first view angle control on the touch screen by a finger of a user or other objects such as a stylus. The first sliding operation may be one of sliding operations such as up-sliding, down-sliding, left-sliding, right-sliding, or a combination of two or more of sliding operations such as up-sliding, down-sliding, left-sliding, right-sliding, etc. By way of example only, embodiments of the application are not limited in any way. When the user slides on the touch screen, the game system records the sliding track of the sliding operation.
For example, the first sliding operation may be to control the first view control to slide up a distance and then slide to the right a distance. For another example, the first sliding operation may be to control the first view control to slide a distance to the left and then to slide a distance to the right.
In the embodiment of the application, in response to a first sliding operation on the first viewing angle control, the shooting direction of the virtual camera and the moving direction of the controlled virtual character are controlled according to the first sliding operation. Specifically, taking the first sliding operation as an example of sliding to the left for a distance, the shooting direction of the virtual camera is controlled to turn to the left, and the moving direction of the controlled virtual character is synchronously controlled to turn to the left. By synchronizing the shooting direction of the virtual camera and the moving direction of the controlled virtual character, the user obtains a more uniform and consistent operation experience, and confusion is reduced.
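Step S30 might be sketched as follows: a single horizontal slide delta on the first viewing angle control turns the camera and the character's moving direction by the same angle. The sensitivity constant and all identifiers are assumptions used only for illustration.

```typescript
// Hypothetical sketch of step S30: one slide delta on the first viewing angle control
// turns both the virtual camera and the controlled character's moving direction by the
// same angle, keeping them synchronized.
interface GameState {
  cameraYawDeg: number;   // shooting direction of the virtual camera
  moveYawDeg: number;     // moving direction of the controlled character
}

const DEG_PER_PIXEL = 0.2; // assumed slide sensitivity

function onFirstViewControlSlide(state: GameState, dxPixels: number): GameState {
  const deltaDeg = dxPixels * DEG_PER_PIXEL;         // rightward slide gives a positive yaw change
  return {
    cameraYawDeg: (state.cameraYawDeg + deltaDeg) % 360,
    moveYawDeg: (state.moveYawDeg + deltaDeg) % 360, // kept in sync with the camera
  };
}

let state: GameState = { cameraYawDeg: 0, moveYawDeg: 0 };
state = onFirstViewControlSlide(state, 100); // slide 100 px to the right
console.log(state); // both yaws advance by 20 degrees
```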
Next, the effect of operating the first viewing angle control is described with reference to fig. 3 and a specific example. Fig. 3 is a schematic diagram illustrating operation of the first viewing angle control according to an embodiment of the present application.
As shown in fig. 3, the figure includes an interface (a), an interface (b), and an interface (c). As shown in interface (a), the controlled virtual character 301 is in a game scene. As shown in interface (b), the upper right corner of the graphical user interface displays an automatic movement state identifier 302, indicating that the controlled virtual character is in an automatic movement state, such as an automatic running state. In response to the controlled virtual character 301 entering the automatic movement state, a first viewing angle control 303 is displayed on the graphical user interface. As shown in interface (c), the user slides the first viewing angle control 303 to the right with a finger in the sliding direction shown in interface (c); the shooting direction of the virtual camera turns to the right, and the moving direction of the controlled virtual character synchronously turns to the right. Before the rightward sliding operation on the first viewing angle control 303, as shown in interface (b), the shooting direction of the virtual camera faces the virtual house 304, and the moving direction of the controlled virtual character 301 is away from the virtual house 304. After the rightward sliding operation on the first viewing angle control 303, as shown in interface (c), the shooting direction of the virtual camera faces the side of the virtual house 304 on which the virtual flower 306 is located, and the moving direction of the controlled virtual character 301 faces away from that side of the virtual house 304.
Next, the effect of operating the first viewing angle control is further described with reference to fig. 4 and a specific example. Fig. 4 is a schematic diagram illustrating operation of the first viewing angle control according to another embodiment of the present application.
As shown in fig. 4, the figure includes an interface (a) and an interface (b). As shown in interface (a), the controlled virtual character 301 faces the virtual house 304. In response to the controlled virtual character 301 entering the automatic movement state, a first viewing angle control 303 is additionally displayed on the first side of the graphical user interface, as shown in interface (a), and the automatic movement state identifier 302 is displayed in the upper right corner of the graphical user interface. As shown in interface (b), the user slides the first viewing angle control 303 to the right with the left index finger in the sliding direction shown in interface (b), while manipulating the controlled virtual character 301 to aim at the enemy virtual character 401 with a gun and performing a click operation on the attack control with the right index finger. Before the sliding operation on the first viewing angle control 303, the shooting direction of the virtual camera and the moving direction of the controlled virtual character 301 face the virtual tree 305. After the rightward sliding operation on the first viewing angle control 303, as shown in interface (b), the shooting direction of the virtual camera and the moving direction of the controlled virtual character 301 face away from the virtual tree 305, and the controlled virtual character 301 holds the gun and hits the enemy virtual character 401.
S40, in response to a second sliding operation on the viewing angle control area, adjusting the shooting direction of the virtual camera according to the second sliding operation.
When the terminal device is a touch screen device such as a mobile phone, a tablet computer, or a game console, the second sliding operation may be a sliding operation in which a user's finger or another object such as a stylus slides on the touch screen in the viewing angle control area. The second sliding operation may be one of sliding operations such as up-sliding, down-sliding, left-sliding, right-sliding, or a combination of two or more of sliding operations such as up-sliding, down-sliding, left-sliding, right-sliding, etc. By way of example only, embodiments of the application are not limited in any way. When the user slides on the touch screen, the game system records the sliding track of the sliding operation.
In the embodiment of the application, the player may also perform the second sliding operation in the viewing angle control area on the second side of the graphical user interface, so that the shooting direction of the virtual camera is adjusted independently according to the second sliding operation. It should be noted that when the player performs the second sliding operation in the viewing angle control area on the second side of the graphical user interface, the moving direction of the controlled virtual character is not changed. By performing the second sliding operation on the viewing angle control area, the shooting direction of the virtual camera can be adjusted independently and flexibly, so that the player can observe the surrounding environment, for example to find hidden roads, enemies, or treasures, without changing the moving direction of the controlled virtual character, thereby improving the user's game experience.
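For contrast with step S30, step S40 might be sketched as follows: the same kind of slide delta, applied in the viewing angle control area, rotates only the virtual camera while the moving direction of the controlled virtual character is left untouched. Identifiers and the sensitivity value are assumptions.

```typescript
// Hypothetical sketch of step S40: a slide in the viewing angle control area rotates
// only the virtual camera; the controlled character keeps its moving direction.
interface GameState {
  cameraYawDeg: number;
  moveYawDeg: number;
}

const DEG_PER_PIXEL = 0.2; // assumed sensitivity

function onViewingAngleAreaSlide(state: GameState, dxPixels: number): GameState {
  return {
    cameraYawDeg: (state.cameraYawDeg + dxPixels * DEG_PER_PIXEL) % 360,
    moveYawDeg: state.moveYawDeg, // unchanged: the character continues in its original direction
  };
}

console.log(onViewingAngleAreaSlide({ cameraYawDeg: 0, moveYawDeg: 90 }, 150));
// { cameraYawDeg: 30, moveYawDeg: 90 }
```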
Next, the second sliding operation performed in the viewing angle control area is described with reference to fig. 5 and a specific example. Fig. 5 is a schematic diagram illustrating adjustment of the shooting direction of the virtual camera through the viewing angle control area according to an embodiment of the present application.
As shown in fig. 5, the figure includes an interface (a) and an interface (b). As shown in interface (a), the viewing angle control area 501 is located on the second side of the graphical user interface. The user performs the second sliding operation with a finger in the viewing angle control area in the sliding direction shown in interface (a); the starting point and ending point of the sliding operation are shown in interface (a). Correspondingly, as shown in interface (a), before the second sliding operation, the shooting direction of the virtual camera 307 faces the virtual house 304. In response to completion of the second sliding operation, as shown in interface (b), the shooting direction of the virtual camera 307 faces the side of the virtual house 304 on which the virtual flower 306 is located. It should be noted that the moving direction of the controlled virtual character 301 is not changed (i.e., the controlled virtual character 301 always faces the front of the virtual house 304). As shown in interface (a), the moving direction of the controlled virtual character 301 faces the front of the virtual house 304, and, as shown in interface (b), in response to completion of the second sliding operation, the moving direction of the controlled virtual character 301 still faces the front of the virtual house 304.
The display control method in the game provided by the embodiment of the application provides a movement control area on the first side of the graphical user interface and provides a visual angle control area on the second side of the graphical user interface. And in response to a triggering operation of the controlled virtual character entering an automatic movement state, displaying a first visual angle control on a first side of the graphical user interface. In response to a first sliding operation for the first view angle control, a shooting direction of the virtual camera and a moving direction of the controlled virtual character are controlled according to the first sliding operation. In response to a second sliding operation for the viewing angle control area, a shooting direction of the virtual camera is adjusted according to the second sliding operation.
Compared with the prior art, in the application, in response to the controlled virtual character entering the automatic movement state, a first viewing angle control is additionally displayed on the first side of the graphical user interface. On the one hand, the user can synchronously adjust the shooting direction of the virtual camera and the moving direction of the controlled virtual character by performing the first sliding operation on the first viewing angle control, and can at the same time perform other operations such as an attack operation or a movement-changing operation, so that different operations are organically combined, the complexity of performing a series of different actions is reduced, and complex operations can be completed smoothly. On the other hand, the user can adjust the shooting direction of the virtual camera by performing the second sliding operation on the viewing angle control area displayed on the second side while the moving direction of the controlled virtual character remains unchanged, so that the player can more flexibly adjust the viewing angle to observe the surrounding environment without affecting the controlled virtual character continuing to move in its original direction.
On the basis of the above embodiments, the method for controlling display in a game provided by the embodiment of the present application is further described below.
In an optional implementation manner, the method for controlling display in a game provided by the embodiment of the present application further includes step S50:
S50, in response to an end instruction of the first sliding operation on the viewing angle control area, maintaining the adjusted shooting direction of the virtual camera and the moving direction of the controlled virtual character.
In the embodiment of the application, when the first sliding operation on the viewing angle control area ends, the user triggers the generation of an end instruction of the first sliding operation on the viewing angle control area. In response to the end instruction of the first sliding operation on the viewing angle control area, the adjusted shooting direction of the virtual camera and the moving direction of the controlled virtual character are maintained. For example, when the user's first sliding operation on the viewing angle control area ends, the adjusted shooting direction of the virtual camera is due north and the moving direction of the controlled virtual character is due east. Then, in response to the end instruction of the first sliding operation on the viewing angle control area, the shooting direction of the virtual camera is kept due north, and the moving direction of the controlled virtual character is kept due east.
It can be understood that the user performs the first sliding operation on the viewing angle control area to adjust the shooting direction of the virtual camera and the moving direction of the controlled virtual character, and continues the first sliding operation until the desired shooting direction of the virtual camera and moving direction of the controlled virtual character are reached. When the user ends the first sliding operation on the viewing angle control area, the game picture is still presented with the shooting direction of the virtual camera and the moving direction of the controlled virtual character that the user desires. In this way, the game picture can always be presented with the shooting direction of the virtual camera and the moving direction of the controlled virtual character desired by the user, without the user continuously touching the viewing angle control area, which brings great operational convenience to the user.
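Step S50 amounts to not resetting anything when the slide ends; a minimal sketch (identifiers assumed) could be:

```typescript
// Hypothetical sketch of step S50: when the first sliding operation ends, nothing is
// reset; the adjusted shooting direction and moving direction simply remain in effect.
interface Directions { cameraYawDeg: number; moveYawDeg: number; }

function onFirstSlideEnd(current: Directions): Directions {
  // No restore step: the state reached at the end of the slide is kept as-is,
  // so the picture stays in the directions the user slid to.
  return current;
}

console.log(onFirstSlideEnd({ cameraYawDeg: 0, moveYawDeg: 90 })); // unchanged
```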
In an optional implementation manner, the display control method in the game provided by the embodiment of the application further includes the following steps S601-S603:
S601, displaying a second viewing angle control on the second side of the graphical user interface.
S602, in response to a third sliding operation on the second viewing angle control, adjusting the shooting direction of the virtual camera according to the third sliding operation.
S603, in response to an end instruction of the third sliding operation on the second viewing angle control, controlling the shooting direction of the virtual camera.
Next, the above steps S601 to S603 will be described:
The second viewing angle control is used to adjust the shooting direction of the virtual camera, and in the interface settings the user can choose to display the second viewing angle control on the graphical user interface or not to display it (i.e., to turn off the display of the second viewing angle control).
In an embodiment of the application, a second perspective control is displayed on a second side of the graphical user interface.
When the terminal device is a touch screen device such as a mobile phone, a tablet computer, or a game console, the third sliding operation may be a sliding operation in which a user's finger or another object such as a stylus slides on the touch screen in the viewing angle control area. The third sliding operation may be one of sliding operations such as up-sliding, down-sliding, left-sliding, and right-sliding, or may be a combination of two or more of sliding operations such as up-sliding, down-sliding, left-sliding, and right-sliding. By way of example only, embodiments of the application are not limited in any way. When the user slides on the touch screen, the game system records the sliding track of the sliding operation.
In the embodiment of the application, in response to a third sliding operation on the second viewing angle control, the shooting direction of the virtual camera is controlled according to the third sliding operation. Specifically, taking the third sliding operation as an example of sliding to the right for a distance, the shooting direction of the virtual camera is controlled to turn to the right, so that the shooting direction of the virtual camera is adjusted independently according to the third sliding operation. It should be noted that the third sliding operation performed by the player on the second viewing angle control on the second side of the graphical user interface does not change the moving direction of the controlled virtual character. Through the third sliding operation on the second viewing angle control, the shooting direction of the virtual camera can be adjusted independently and flexibly so as to observe the surrounding environment without changing the moving direction of the controlled virtual character, thereby improving the user's game experience.
An optional implementation manner of "control shooting direction of virtual camera" in step S603 includes step S6031:
s6031, the shooting direction of the virtual camera before adjustment is restored.
In the embodiment of the application, the user stops the third sliding operation for the second visual angle control, and generates an ending instruction of the third sliding operation for the second visual angle control. And responding to an ending instruction of the third sliding operation for the second visual angle control, and restoring to the shooting direction of the virtual camera before adjustment. For example, the current shooting direction of the virtual camera is the north-plus direction, and the user controls the shooting direction of the virtual camera to be adjusted to the north-plus direction for the third sliding operation of the second view angle control. When an end instruction of the third sliding operation for the second view angle control is responded, the shooting direction of the virtual camera is restored from the northwest direction to the northwest direction.
In the embodiment of the application, the player is allowed to adjust the shooting direction of the virtual camera by performing the third sliding operation on the second visual angle control, so that the player can freely adjust the visual angle to observe the game world, the immersion and the realism of the game are enhanced, and the player can be deeply integrated into the game environment. And responding to an ending instruction of the third sliding operation for the second visual angle control, and restoring to the shooting direction of the virtual camera before adjustment. On the one hand, the consistency of the game picture can be maintained by restoring the shooting direction of the virtual camera before adjustment, so that the sudden change of the shooting direction of the virtual camera caused by the adjustment of the shooting direction of the virtual camera by a user is avoided, and a player feels more stable and comfortable. On the other hand, the shooting direction of the virtual camera before the adjustment is restored to be the shooting direction, so that the game controllability can be improved, players can better master and understand the visual angle change in the game, and the game skills and experience level of the players are improved.
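Steps S602 and S6031 might be sketched as follows: the shooting direction in effect before the third sliding operation is remembered at the start of the slide and restored when the end instruction arrives. All identifiers and the sensitivity value are assumptions.

```typescript
// Hypothetical sketch of steps S602/S6031: the pre-adjustment shooting direction is
// remembered when the third sliding operation starts, and restored when it ends.
class SecondViewControl {
  private yawBeforeAdjustDeg = 0;

  constructor(public cameraYawDeg: number) {}

  onThirdSlideStart(): void {
    this.yawBeforeAdjustDeg = this.cameraYawDeg;  // remember the original direction
  }
  onThirdSlideMove(dxPixels: number): void {
    this.cameraYawDeg += dxPixels * 0.2;          // assumed sensitivity; camera only
  }
  onThirdSlideEnd(): void {
    this.cameraYawDeg = this.yawBeforeAdjustDeg;  // S6031: restore the direction before adjustment
  }
}

const ctl = new SecondViewControl(0);  // e.g. due north
ctl.onThirdSlideStart();
ctl.onThirdSlideMove(-450);            // look away during the slide
ctl.onThirdSlideEnd();
console.log(ctl.cameraYawDeg);         // 0: back to the original direction
```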
In an alternative embodiment, another implementation of "controlling the shooting direction of the virtual camera" in step S603 includes steps S6032-S6034:
S6032, determining whether the third sliding operation satisfies a first preset condition.
S6033, if yes, maintaining the adjusted shooting direction of the virtual camera.
S6034, if not, restoring the shooting direction of the virtual camera to the direction before adjustment.
In the embodiment of the application, the shooting direction of the virtual camera is determined by judging whether the third sliding operation satisfies the first preset condition. When the third sliding operation satisfies the first preset condition, the adjusted shooting direction of the virtual camera is maintained. When the third sliding operation does not satisfy the first preset condition, the shooting direction of the virtual camera is restored to the direction before adjustment.
In an alternative embodiment, the first preset condition includes one or more of the following: the sliding distance of the third sliding operation is greater than or equal to a first distance threshold; the sliding duration of the third sliding operation is greater than or equal to a first duration threshold.
Next, the control of the shooting direction of the virtual camera according to whether the third sliding operation satisfies the first preset condition is described with reference to fig. 6 and a specific example. Fig. 6 is a schematic diagram of controlling the shooting direction of the virtual camera according to the third sliding operation according to an embodiment of the present application. It is assumed that the first preset condition is that the sliding duration of the third sliding operation is greater than or equal to the first duration threshold, and that the first duration threshold is 5 seconds.
As shown in fig. 6, the figure includes an interface (a), an interface (b), and an interface (c). As shown in interface (a), a second viewing angle control 601 is displayed on the second side of the graphical user interface, and the shooting direction of the virtual camera faces the right shoulder of the controlled virtual character 301 from the side. As shown in interface (b), the user performs a third sliding operation on the second viewing angle control 601 with a finger in the sliding direction shown in interface (b), and the total duration of the third sliding operation is 3 seconds (s). In response to the third sliding operation on the second viewing angle control, the shooting direction of the virtual camera is adjusted according to the third sliding operation; as shown in interface (b), the shooting direction of the virtual camera now faces the front of the controlled virtual character 301. Since the sliding duration of the third sliding operation (3 seconds) is less than the first duration threshold (5 seconds), that is, the third sliding operation does not satisfy the first preset condition, the shooting direction of the virtual camera is controlled to be restored to the direction before adjustment. As shown in interface (c), the shooting direction of the virtual camera is restored from the direction facing the front of the controlled virtual character 301 to the direction facing the right shoulder of the controlled virtual character 301 from the side.
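Steps S6032-S6034 might be sketched as follows. The thresholds, the identifiers, and the choice of combining the sub-conditions of the first preset condition with a logical "or" (one possible reading of "one or more of the following") are assumptions for illustration only.

```typescript
// Hypothetical sketch of steps S6032-S6034: keep the adjusted shooting direction only
// if the third sliding operation meets the first preset condition; otherwise restore it.
interface ThirdSlide { distancePx: number; durationSec: number; }

const FIRST_DISTANCE_THRESHOLD_PX = 200; // assumed first distance threshold
const FIRST_DURATION_THRESHOLD_SEC = 5;  // assumed first duration threshold

function meetsFirstPresetCondition(slide: ThirdSlide): boolean {
  return slide.distancePx >= FIRST_DISTANCE_THRESHOLD_PX
      || slide.durationSec >= FIRST_DURATION_THRESHOLD_SEC;
}

function onThirdSlideEnd(slide: ThirdSlide, adjustedYawDeg: number, yawBeforeAdjustDeg: number): number {
  // S6033: keep the adjusted direction; S6034: fall back to the direction before adjustment.
  return meetsFirstPresetCondition(slide) ? adjustedYawDeg : yawBeforeAdjustDeg;
}

// The fig. 6 example: a 3 s slide is shorter than the 5 s threshold, so the camera reverts.
console.log(onThirdSlideEnd({ distancePx: 120, durationSec: 3 }, 45, 0)); // 0
```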
In an alternative embodiment, step S602 "in response to the third sliding operation for the second view control, one possible implementation of adjusting the shooting direction of the virtual camera according to the third sliding operation" includes step S6021:
S6021, in response to a third sliding operation for the second view angle control, maintaining the moving direction of the controlled virtual character, and adjusting the shooting direction of the virtual camera according to the third sliding operation.
In the embodiment of the application, in the process that the user executes the third sliding operation on the second visual angle control, the moving direction of the controlled virtual character is kept unchanged, and the shooting direction of the virtual camera is adjusted according to the third sliding operation.
Here, maintaining the moving direction of the controlled virtual character means that the direction in which the character was originally moving is not changed: even while the shooting direction of the virtual camera is adjusted or other operations are performed, the controlled virtual character keeps moving in that direction. For example, if the moving direction of the controlled virtual character is due north before the user performs the third sliding operation on the second viewing angle control, then during the third sliding operation, and after it ends, the moving direction of the controlled virtual character remains due north.
The specific implementation manner of the step of adjusting the shooting direction of the virtual camera according to the third sliding operation may refer to the description and explanation of the step S603, and the same technical effects may be achieved, so that the description is omitted here for avoiding repetition.
In an alternative embodiment, step S40", in response to the second sliding operation for the viewing angle control area, one possible implementation of adjusting the shooting direction of the virtual camera according to the second sliding operation, includes step S401:
S401, in response to a second sliding operation for the view angle control area, maintaining a moving direction of the controlled virtual character, and adjusting a photographing direction of the virtual camera according to the second sliding operation.
In the embodiment of the application, while the user performs the second sliding operation in the viewing angle control area, the moving direction of the controlled virtual character is kept unchanged, and the shooting direction of the virtual camera is adjusted according to the second sliding operation.
The above description and explanation of the step of maintaining the moving direction of the controlled virtual character may refer to the above description and explanation of the step S6021, and the same technical effects can be achieved, and for avoiding repetition, the description is omitted here.
The specific implementation manner of the step of adjusting the shooting direction of the virtual camera according to the second sliding operation may refer to the description and explanation of the step S40, and the same technical effects may be achieved, so that the description is omitted here for avoiding repetition.
In an alternative implementation manner, in an embodiment of the present application, the triggering operation for triggering the controlled virtual character to enter the automatic movement state may be an operation that acts on the movement control area and satisfies the second preset condition.
In an alternative embodiment, the triggering operation is triggered by the touch panel of the current terminal device.
The second preset condition includes one or more of the following:
the starting position of the touch point of the triggering operation on the touch panel is located in the movement control area, and the touch point moves into a preset control;
the starting position of the touch point of the triggering operation on the touch panel is located in the movement control area, and the movement distance of the touch point is greater than or equal to a second distance threshold;
the starting position of the touch point of the triggering operation on the touch panel is located in the movement control area, and the movement duration of the touch point is greater than or equal to a second duration threshold;
the starting position of the touch point of the triggering operation on the touch panel is located in the movement control area, and the touch pressure of the touch point is greater than or equal to a preset pressure threshold.
Next, an implementation of the triggering operation for triggering the controlled virtual character to enter the automatic movement state is described with reference to fig. 7 and a specific example. Fig. 7 is a schematic diagram of triggering the controlled virtual character to enter the automatic movement state according to an embodiment of the present application.
As shown in fig. 7, the figure includes an interface (a) and an interface (b). As shown in interface (a), the starting position of the touch point of the triggering operation performed by the user's left index finger on the touch panel is located in the movement control 204 within the movement control area 201 (see the starting point of the triggering operation in interface (a)). The touch point then moves along the moving direction shown in interface (a); when it moves into the preset control, such as the automatic movement control 702, the controlled virtual character 701 is triggered to enter the automatic running state. As shown in interface (b), the controlled virtual character 701 runs automatically, and while the controlled virtual character 701 is in the automatic running state, the automatic movement state identifier 302 is displayed in the upper right corner of the graphical user interface.
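The second preset condition might be sketched as follows. The area layout, the thresholds, and all identifiers are assumptions; the sketch only illustrates that each variant first requires the touch to start inside the movement control area.

```typescript
// Hypothetical sketch of the second preset condition: any one of the listed checks, applied
// to a touch that starts inside the movement control area, triggers the automatic movement state.
interface Rect { x: number; y: number; width: number; height: number; }
interface TouchTrack {
  startX: number; startY: number;
  movedDistancePx: number;
  movedDurationSec: number;
  pressure: number;
  insidePresetControl: boolean; // e.g. the touch has moved onto the automatic movement control 702
}

const movementControlArea: Rect = { x: 0, y: 540, width: 640, height: 540 }; // assumed layout
const SECOND_DISTANCE_THRESHOLD_PX = 150;
const SECOND_DURATION_THRESHOLD_SEC = 1.0;
const PRESSURE_THRESHOLD = 0.8;

function startsInMovementArea(t: TouchTrack): boolean {
  return t.startX >= movementControlArea.x && t.startX < movementControlArea.x + movementControlArea.width
      && t.startY >= movementControlArea.y && t.startY < movementControlArea.y + movementControlArea.height;
}

function triggersAutoMove(t: TouchTrack): boolean {
  if (!startsInMovementArea(t)) return false; // every variant requires this starting position
  return t.insidePresetControl
      || t.movedDistancePx >= SECOND_DISTANCE_THRESHOLD_PX
      || t.movedDurationSec >= SECOND_DURATION_THRESHOLD_SEC
      || t.pressure >= PRESSURE_THRESHOLD;
}

console.log(triggersAutoMove({
  startX: 200, startY: 800, movedDistancePx: 40,
  movedDurationSec: 0.3, pressure: 0.2, insidePresetControl: true,
})); // true: the touch slid onto the preset (automatic movement) control
```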
In an optional implementation, the display control method in a game provided by the embodiment of the present application further includes the following steps S701-S702:
S701, when the controlled virtual character is in the automatic moving state and no first virtual character exists in the moving direction, in response to the controlled virtual character executing a first attack operation, controlling the controlled virtual character to stop the free moving state, where the first virtual character is a virtual character other than the controlled virtual character.
S702, in response to an instruction indicating that the first attack operation has finished executing, restoring the controlled virtual character to the automatic moving state.
In the embodiment of the present application, when the controlled virtual character is in the automatic moving state and no first virtual character exists in its moving direction, the controlled virtual character is controlled to stop the free moving state in response to the controlled virtual character executing a first attack operation. The controlled virtual character may revert to jogging or walking, and the first attack operation does not hit any virtual character, that is, the attack strikes only the air and the character gradually drops out of the running state. Then, in response to an instruction indicating that the first attack operation has finished executing, the controlled virtual character is controlled to return to the automatic moving state.
The first virtual character may be a virtual character other than the controlled virtual character, for example an enemy virtual character of the controlled virtual character, or a teammate of the controlled virtual character.
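The flow of steps S701-S702 can be illustrated with the following sketch, which pauses the free moving state for the duration of a whiffed first attack and then restores the automatic moving state; the class, the callback names, and the target check are assumptions used only for illustration.

```python
class AutoMoveController:
    """Illustrative controller for steps S701-S702 (all names are assumed)."""

    def __init__(self) -> None:
        self.auto_moving = False
        self._paused_for_attack = False

    def first_character_in_move_direction(self) -> bool:
        # Placeholder for the game's own check for a first virtual character
        # located in the controlled virtual character's moving direction.
        return False

    def on_first_attack_started(self) -> None:
        # S701: auto-moving with no first virtual character ahead ->
        # stop the free moving state (the character may drop back to jogging or walking).
        if self.auto_moving and not self.first_character_in_move_direction():
            self.auto_moving = False
            self._paused_for_attack = True

    def on_first_attack_finished(self) -> None:
        # S702: once the first attack operation has finished executing,
        # restore the automatic moving state.
        if self._paused_for_attack:
            self.auto_moving = True
            self._paused_for_attack = False
```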
In another optional implementation, the display control method in a game provided by the embodiment of the present application further includes steps S801-S802:
S801, when the controlled virtual character is in the automatic moving state and a first virtual character exists in the moving direction, in response to the controlled virtual character executing a second attack operation aimed at the first virtual character, judging whether the second attack operation hits the first virtual character, where the first virtual character is a virtual character other than the controlled virtual character.
S802, if the first virtual character is hit, the controlled virtual character is controlled to exit the automatic movement state.
The first virtual character is a virtual character other than the controlled virtual character, and may be, for example, an enemy virtual character of the controlled virtual character.
In the embodiment of the present application, when the controlled virtual character is in the automatic moving state and a first virtual character exists in the moving direction, in response to the controlled virtual character executing a second attack operation aimed at the first virtual character, it is judged whether the second attack operation hits the first virtual character, the first virtual character being a virtual character other than the controlled virtual character. If the first virtual character is hit, the controlled virtual character is controlled to exit the automatic moving state: once the first virtual character has been hit, the controlled virtual character no longer needs to chase it, so the controlled virtual character automatically exits the automatic moving state. The player therefore has to weigh whether to keep triggering the automatic moving state or to choose to attack, which increases the interactivity and enjoyment of the game.
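Continuing the assumed AutoMoveController sketched above, the sketch below illustrates steps S801-S802: when the second attack operation hits the first virtual character, the automatic moving state is exited; the parameter names and the way hit detection is reported are assumptions.

```python
def on_second_attack_resolved(controller: "AutoMoveController",
                              first_character_in_move_direction: bool,
                              hit_first_character: bool) -> None:
    """Illustrative handling of steps S801-S802 (names are assumed)."""
    # S801 only applies while auto-moving with a first virtual character in the moving direction.
    if not (controller.auto_moving and first_character_in_move_direction):
        return
    # S802: on a hit there is no longer any need to chase the target,
    # so the controlled virtual character exits the automatic moving state.
    if hit_first_character:
        controller.auto_moving = False
```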
Next, movement control performed in the movement control area is described by way of example with reference to fig. 8, which is a schematic diagram of the movement control area according to an embodiment of the present application.
As shown in fig. 8, the figure includes an interface (a) and an interface (b). In interface (a), an automatic movement control 702 is displayed on the left side of the graphical user interface, an automatic movement state identifier 302 is displayed in the upper right corner of the graphical user interface, and the controlled virtual character 701 is currently in the automatic moving state. As shown in interface (a), the user manipulates the controlled virtual character 701 to hold the flying rope 803 while the user's right index finger performs a triggering operation, such as a click, on the flying rope launching control 802. As shown in interface (b), the controlled virtual character 701 successfully hooks the first virtual character 801 with the flying rope 803. At the same time, the automatic movement control 702 and the automatic movement state identifier 302 are no longer displayed on the graphical user interface, i.e., the controlled virtual character 701 has exited the automatic moving state.
In an optional implementation, the display control method in a game provided by the embodiment of the present application further includes step S90:
S90, canceling display of the first visual angle control in response to the controlled virtual character exiting the automatic moving state.
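A minimal sketch of step S90 is given below; the ui object and its hide method are assumptions standing in for whatever interface layer the game uses, and AutoMoveController refers to the sketch above.

```python
def on_exit_auto_move(ui, controller: "AutoMoveController") -> None:
    """Illustrative step S90 (names are assumed): leaving the automatic moving state
    removes the first view angle control from the graphical user interface."""
    controller.auto_moving = False
    ui.hide("first_view_angle_control")              # assumed UI helper; cancels display of the control
    ui.hide("automatic_movement_state_identifier")   # also removed in the fig. 9 example
```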
Next, controlling the controlled virtual character to exit the automatic moving state is described by way of example with reference to fig. 9, which is a schematic diagram of controlling the controlled virtual character to exit the automatic moving state according to an embodiment of the present application.
As shown in fig. 9, the figure includes an interface (a) and an interface (b). In interface (a), an automatic movement control 702 and a first view control 303 are displayed on the left side of the graphical user interface, an automatic movement state identifier 302 is displayed in the upper right corner of the graphical user interface, and the controlled virtual character 701 is currently in the automatic moving state. As shown in interface (a), the user's left index finger performs a triggering operation, such as a click, on the automatic movement control 702. As shown in interface (b), the first view control 303, the automatic movement control 702, and the automatic movement state identifier 302 are no longer displayed on the graphical user interface, i.e., the controlled virtual character 701 has exited the automatic moving state.
The display control apparatus in a game provided by the present application is described below; the display control apparatus in a game described below and the display control method in a game described above may be referred to in correspondence with each other.
Fig. 10 is a schematic structural diagram of a display control device in a game according to an embodiment of the present application. As shown in fig. 10, the display control apparatus 1000 in the game includes: a first display module 1001, a second display module 1002, a control module 1003, and a processing module 1004.
The first display module is used for providing a movement control area on a first side of a graphical user interface and providing a viewing angle control area on a second side of the graphical user interface;
The second display module is used for responding to the triggering operation of the controlled virtual character entering the automatic moving state and displaying a first visual angle control on the first side of the graphical user interface;
The control module is used for responding to a first sliding operation for the first visual angle control, and controlling the shooting direction of the virtual camera and the moving direction of the controlled virtual character according to the first sliding operation;
The processing module is used for responding to a second sliding operation for the visual angle control area and adjusting the shooting direction of the virtual camera according to the second sliding operation.
Optionally, the control module is further configured to:
In response to an end instruction of the first sliding operation for the view angle control area, the adjusted photographing direction of the virtual camera and the movement direction of the controlled virtual character are maintained.
Optionally, the control module is further configured to:
displaying a second perspective control on the second side of the graphical user interface;
responding to a third sliding operation for the second visual angle control, and adjusting the shooting direction of the virtual camera according to the third sliding operation;
and controlling the shooting direction of the virtual camera in response to an ending instruction of a third sliding operation for the second visual angle control.
Optionally, the control module is specifically configured to:
And restoring to the shooting direction of the virtual camera before adjustment.
Optionally, the control module is specifically configured to:
judging whether the third sliding operation meets a first preset condition or not;
If yes, maintaining the shooting direction of the adjusted virtual camera;
If not, restoring to the shooting direction of the virtual camera before adjustment.
Optionally, the first preset condition includes one or more of the following (an illustrative end-of-slide check is sketched after this list):
Whether the sliding distance of the third sliding operation is greater than or equal to a first distance threshold;
and whether the sliding duration of the third sliding operation is greater than or equal to a first time duration threshold.
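For illustration only, the following sketch combines the end-of-slide decision above with the first preset condition; the thresholds, the saved shooting direction, and the Camera stand-in from the earlier sketch are all assumptions.

```python
def on_third_slide_end(camera: "Camera",
                       saved_yaw: float, saved_pitch: float,
                       slide_distance: float, slide_duration: float,
                       first_distance_threshold: float = 80.0,        # assumed, pixels
                       first_duration_threshold: float = 0.3) -> None:  # assumed, seconds
    """Illustrative end handling of a third sliding operation on the second view angle control.

    saved_yaw / saved_pitch hold the shooting direction recorded when the slide began.
    """
    meets_first_condition = (slide_distance >= first_distance_threshold
                             or slide_duration >= first_duration_threshold)
    if meets_first_condition:
        return                                   # keep the adjusted shooting direction
    # Otherwise restore the shooting direction the virtual camera had before adjustment.
    camera.yaw, camera.pitch = saved_yaw, saved_pitch
```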
Optionally, the adjusting, in response to a third sliding operation for the second visual angle control, the shooting direction of the virtual camera according to the third sliding operation includes:
In response to a third sliding operation for the second perspective control, a moving direction of the controlled virtual character is maintained, and a shooting direction of the virtual camera is adjusted according to the third sliding operation.
Optionally, the adjusting, in response to a second sliding operation for the viewing angle control area, a shooting direction of a virtual camera according to the second sliding operation includes:
In response to a second sliding operation for the view angle control area, a moving direction of the controlled virtual character is maintained, and a photographing direction of the virtual camera is adjusted according to the second sliding operation.
Optionally, the triggering operation is an operation that acts on the movement control area and satisfies a second preset condition.
Optionally, the triggering operation is triggered through a touch panel of the current terminal device;
The second preset condition includes one or more of the following:
the initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the touch point moves into a preset control;
the initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the movement distance of the touch point is larger than or equal to a second distance threshold;
The initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the movement time length of the touch point is more than or equal to a second time length threshold value;
And the initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the touch pressure of the touch point is greater than or equal to a preset pressure threshold.
Optionally, the method further comprises:
When the controlled virtual character is in the automatic moving state and no first virtual character exists in the moving direction, responding to the controlled virtual character to execute a first attack operation, controlling the controlled virtual character to stop the free moving state, wherein the first virtual character is a virtual character except the controlled virtual character;
and responding to the instruction of the first attack operation after the execution is finished, and restoring the controlled virtual role to an automatic moving state.
Optionally, the method further comprises:
Under the condition that the controlled virtual character is in the automatic moving state and a first virtual character exists in the moving direction, responding to the controlled virtual character to execute a second attack operation aiming at the first virtual character, judging whether the second attack operation hits the first virtual character or not, wherein the first virtual character is a virtual character except the controlled virtual character;
And if the first virtual character is hit, responding to the instruction that the second attack operation is executed, and controlling the controlled virtual character to exit the automatic moving state.
Optionally, the method further comprises:
And canceling to display the first visual angle control in response to the controlled virtual character exiting the automatic movement state.
The display control device in a game provided in this embodiment may be used to execute the technical solution of the embodiments of the display control method in a game; its implementation principle and technical effects are similar, and the details are not repeated here.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 11, the electronic device 1100 of this embodiment includes: a processor 1101 and a memory 1102, wherein:
the memory 1102 is configured to store computer-executable instructions; and
the processor 1101 is configured to execute the computer-executable instructions stored in the memory to implement the steps of the display control method in a game in the above embodiments. For details, reference may be made to the relevant description of the method embodiments above.
Alternatively, the memory 1102 may be separate or integrated with the processor 1101.
When the memory 1102 is provided separately, the electronic device further comprises a bus 1103 for connecting said memory 1102 with the processor 1101.
An embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions. When a processor executes the computer-executable instructions, the technical solution corresponding to the display control method in a game executed by the electronic device in any of the above embodiments is implemented.
An embodiment of the present application further provides a computer program product including a computer program, where the computer program is stored in a readable storage medium. At least one processor of the electronic device can read the computer program from the readable storage medium, and the at least one processor executes the computer program so that the electronic device executes the technical solution corresponding to the display control method in a game in any of the above embodiments.
While the application has been described in terms of preferred embodiments, it is not intended to be limiting, but rather, it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the application as defined by the appended claims.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The integrated modules, which are implemented in the form of software functional modules, may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device, etc.) or a processor to perform some of the steps of the methods according to the embodiments of the application.
It should be understood that the above processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the present application may be embodied directly as being executed by a hardware processor, or executed by a combination of hardware and software modules in a processor.
The memory may include a high-speed RAM and may further include a non-volatile memory (NVM), such as at least one magnetic disk memory; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, an optical disk, or the like.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The storage medium may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (16)

1. A display control method in a game, the method comprising:
Providing a movement control area on a first side of a graphical user interface and providing a viewing angle control area on a second side of the graphical user interface;
responding to the triggering operation of the controlled virtual character entering an automatic moving state, and displaying a first visual angle control on the first side of the graphical user interface;
Controlling a shooting direction of a virtual camera and a moving direction of the controlled virtual character according to a first sliding operation in response to the first sliding operation for the first view angle control;
In response to a second sliding operation for the viewing angle control region, a shooting direction of a virtual camera is adjusted according to the second sliding operation.
2. The method according to claim 1, wherein the method further comprises:
In response to an end instruction of the first sliding operation for the view angle control area, the adjusted photographing direction of the virtual camera and the movement direction of the controlled virtual character are maintained.
3. The method according to claim 1, wherein the method further comprises:
displaying a second perspective control on the second side of the graphical user interface;
responding to a third sliding operation for the second visual angle control, and adjusting the shooting direction of the virtual camera according to the third sliding operation;
and controlling the shooting direction of the virtual camera in response to an ending instruction of a third sliding operation for the second visual angle control.
4. A method according to claim 3, wherein said controlling the shooting direction of the virtual camera comprises:
And restoring to the shooting direction of the virtual camera before adjustment.
5. A method according to claim 3, wherein said controlling the shooting direction of the virtual camera comprises:
judging whether the third sliding operation meets a first preset condition or not;
If yes, maintaining the shooting direction of the adjusted virtual camera;
If not, restoring to the shooting direction of the virtual camera before adjustment.
6. The method of claim 5, wherein the first preset condition comprises one or more of:
Whether the sliding distance of the third sliding operation is greater than or equal to a first distance threshold;
and whether the sliding duration of the third sliding operation is greater than or equal to a first time duration threshold.
7. The method of claim 3, wherein the adjusting the shooting direction of the virtual camera according to the third sliding operation in response to the third sliding operation for the second perspective control comprises:
In response to a third sliding operation for the second perspective control, a moving direction of the controlled virtual character is maintained, and a shooting direction of the virtual camera is adjusted according to the third sliding operation.
8. The method according to claim 1, wherein the adjusting the shooting direction of the virtual camera according to the second sliding operation in response to the second sliding operation for the angle-of-view control area includes:
In response to a second sliding operation for the view angle control area, a moving direction of the controlled virtual character is maintained, and a photographing direction of the virtual camera is adjusted according to the second sliding operation.
9. The method according to claim 1, wherein the triggering operation is an operation that acts on the movement control area and satisfies a second preset condition.
10. The method according to claim 9, wherein the triggering operation is triggered by a touch panel of the current terminal device;
The second preset condition includes one or more of the following:
the initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the touch point moves into a preset control;
the initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the movement distance of the touch point is larger than or equal to a second distance threshold;
The initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the movement time length of the touch point is more than or equal to a second time length threshold value;
And the initial position of the touch point of the triggering operation on the touch panel is positioned in the movement control area, and the touch pressure of the touch point is greater than or equal to a preset pressure threshold.
11. The method according to claim 1, wherein the method further comprises:
When the controlled virtual character is in the automatic moving state and no first virtual character exists in the moving direction, responding to the controlled virtual character to execute a first attack operation, controlling the controlled virtual character to stop the free moving state, wherein the first virtual character is a virtual character except the controlled virtual character;
and responding to the instruction of the first attack operation after the execution is finished, and restoring the controlled virtual role to an automatic moving state.
12. The method according to claim 1, wherein the method further comprises:
Under the condition that the controlled virtual character is in the automatic moving state and a first virtual character exists in the moving direction, responding to the controlled virtual character to execute a second attack operation aiming at the first virtual character, judging whether the second attack operation hits the first virtual character or not, wherein the first virtual character is a virtual character except the controlled virtual character;
And if the first virtual character is hit, responding to the instruction that the second attack operation is executed, and exiting the automatic moving state by the controlled virtual character.
13. The method according to claim 12, wherein the method further comprises:
And canceling to display the first visual angle control in response to the controlled virtual character exiting the automatic movement state.
14. A display control apparatus in a game, the apparatus comprising:
A first display module for providing a movement control area on a first side of a graphical user interface and providing a viewing angle control area on a second side of the graphical user interface;
The second display module is used for responding to the triggering operation of the controlled virtual character entering the automatic moving state and displaying a first visual angle control on the first side of the graphical user interface;
the control module is used for responding to a first sliding operation for the first visual angle control, and controlling the shooting direction of the virtual camera and the moving direction of the controlled virtual character according to the first sliding operation;
and the processing module is used for responding to a second sliding operation for the visual angle control area and adjusting the shooting direction of the virtual camera according to the second sliding operation.
15. An electronic device, the electronic device comprising:
A processor; and
A memory for storing a data processing program, the electronic device being powered on and executing the program by the processor, to execute the display control method in a game according to any one of claims 1 to 13.
16. A computer-readable storage medium, characterized in that a data processing program is stored, the program being executed by a processor, to perform the display control method in a game as claimed in any one of claims 1 to 13.