CN113398564B - Virtual character control method, device, storage medium and computer equipment - Google Patents

Virtual character control method, device, storage medium and computer equipment

Info

Publication number
CN113398564B
CN113398564B (application CN202110786342.0A)
Authority
CN
China
Prior art keywords
control
function
target
virtual
control function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110786342.0A
Other languages
Chinese (zh)
Other versions
CN113398564A (en)
Inventor
李雪妹
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110786342.0A priority Critical patent/CN113398564B/en
Publication of CN113398564A publication Critical patent/CN113398564A/en
Application granted granted Critical
Publication of CN113398564B publication Critical patent/CN113398564B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 — Input arrangements for video game devices
    • A63F 13/21 — Input arrangements characterised by their sensors, purposes or types
    • A63F 13/214 — Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 — Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/23 — Input arrangements for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/50 — Controlling the output signals based on the game progress
    • A63F 13/52 — Controlling the output signals involving aspects of the displayed game scene
    • A63F 13/525 — Changing parameters of virtual cameras
    • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 — Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068 — Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075 — Input arrangements specially adapted to detect the point of contact using a touch screen
    • A63F 2300/30 — Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 — Details of the user interface

Abstract

The embodiments of the present application disclose a virtual character control method and device, a computer-readable storage medium, and a computer device. A graphical user interface is displayed, comprising at least part of a virtual scene, at least part of a virtual object located in the virtual scene, and a designated response area. When a character control operation is detected, the control position at which the operation acts on the graphical user interface is obtained. In response to the control position starting in the designated response area, the designated control function corresponding to that area is determined and the virtual object is controlled to realize it. In response to the control position then acting on a target function response area, the target control function corresponding to that area is determined and the virtual object is controlled to realize the designated control function and the target control function simultaneously. By determining the area on which the character control operation acts, the virtual object is controlled to realize multiple control functions, improving the efficiency of controlling the virtual character.

Description

Virtual character control method, device, storage medium and computer equipment
Technical Field
The present invention relates to the field of computer technology, and in particular to a virtual character control method, a virtual character control device, a computer-readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, more and more applications provide three-dimensional virtual environments, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, and multiplayer online battle arena (MOBA) games.
In the prior art, taking an FPS game as an example, when a user wants a virtual weapon to aim down sights (open the scope), lean (peek), and shoot, the user must tap the scope control, the lean control, and the fire control separately, requiring at least three taps on different controls.
During research and practice of the prior art, the inventor of the present application found that behavior control of virtual characters in the prior art is cumbersome and discontinuous, so the efficiency of controlling virtual characters is low.
Disclosure of Invention
The embodiments of the present application provide a virtual character control method and device, which can improve the efficiency of controlling virtual characters.
In order to solve the technical problems, the embodiment of the application provides the following technical scheme:
a virtual character control method comprising:
displaying a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene, at least part of a virtual object located in the virtual scene, and a designated response area; the designated response area is the response area corresponding to a designated control function, and the designated control function is a control function that, among a plurality of control functions, can control the virtual character simultaneously with at least one other control function;
when a character control operation is detected, obtaining the control position at which the character control operation acts on the graphical user interface, wherein the character control operation is used to control actions of the virtual object;
in response to the control position starting in the designated response area, determining the designated control function corresponding to the designated response area and controlling the virtual object to realize the designated control function;
and in response to the control position acting on a target function response area, determining the target control function corresponding to the target function response area and controlling the virtual object to realize the designated control function and the target control function simultaneously.
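The two response rules in the claim above can be sketched as a small touch-dispatch routine. This is a minimal illustrative sketch only, not the patented implementation; the `Rect` shape of the response areas, the `"aim"` function name, and the `Character` type are assumptions made for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Rect:
    """Axis-aligned response area in screen coordinates (assumed shape)."""
    x: float; y: float; w: float; h: float
    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

@dataclass
class Character:
    active: set = field(default_factory=set)  # control functions currently realized

def dispatch(char, start_pos, current_pos, designated_area, target_areas):
    """Apply the claim's two rules to one character control operation.

    designated_area: Rect for the designated control function (here "aim").
    target_areas: dict mapping target function name -> Rect.
    """
    char.active.clear()
    # Rule 1: the control position starts in the designated response area.
    if designated_area.contains(*start_pos):
        char.active.add("aim")                 # hypothetical designated function
        # Rule 2: the position now acts on a target function response area,
        # so that function is realized *simultaneously* with the designated one.
        for name, rect in target_areas.items():
            if rect.contains(*current_pos):
                char.active.add(name)
    return char.active
```

For example, a press that begins on the designated area and slides onto a lean control would yield both functions at once, whereas a press that begins elsewhere yields neither.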
A virtual character control apparatus comprising:
a first display module, configured to display a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene, at least part of a virtual object located in the virtual scene, and a designated response area; the designated response area is the response area corresponding to a designated control function, and the designated control function is a control function that, among a plurality of control functions, can control the virtual character simultaneously with at least one other control function;
an acquisition module, configured to obtain, when a character control operation is detected, the control position at which the character control operation acts on the graphical user interface, wherein the character control operation is used to control actions of the virtual object;
a first determining module, configured to, in response to the control position starting in the designated response area, determine the designated control function corresponding to the designated response area and control the virtual object to realize the designated control function;
and a second determining module, configured to, in response to the control position acting on a target function response area, determine the target control function corresponding to the target function response area and control the virtual object to realize the designated control function and the target control function simultaneously.
In some embodiments, the apparatus further comprises:
and a second display module, configured to display associated function controls around the designated response area, where the associated control functions corresponding to the associated function controls are associated with the designated control function so as to control the virtual character simultaneously with the designated control function.
In some embodiments, the second display module includes:
an acquisition sub-module, configured to obtain the duration for which the character control operation acts in the designated response area;
and a display sub-module, configured to display the associated function controls around the designated response area when the duration exceeds a preset duration.
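The duration check performed by these two sub-modules amounts to long-press detection. The sketch below is an assumed illustration; the 0.4 s threshold and the class and method names are not from the patent, which leaves the preset duration unspecified.

```python
import time

LONG_PRESS_SECONDS = 0.4  # assumed threshold; the patent leaves the value open

class DesignatedAreaButton:
    """Tracks how long a touch has dwelt on the designated response area."""
    def __init__(self, now=time.monotonic):
        self._now = now                       # injectable clock for testing
        self._pressed_at = None
        self.associated_controls_visible = False

    def press(self):
        """Character control operation begins in the designated response area."""
        self._pressed_at = self._now()

    def update(self):
        """Call every frame; reveal associated controls once the press
        has lasted longer than the preset duration."""
        if self._pressed_at is None:
            return
        if self._now() - self._pressed_at >= LONG_PRESS_SECONDS:
            self.associated_controls_visible = True

    def release(self):
        """Operation ends: hide the associated function controls again."""
        self._pressed_at = None
        self.associated_controls_visible = False
```

Injecting the clock keeps the dwell-time logic deterministic and testable without real waiting.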
In some embodiments, the second display module includes:
and an adjustment sub-module, configured to adjust the sight direction of the virtual character if a sight-direction adjustment operation for the designated response area is detected while the duration is less than the preset duration.
In some embodiments, each associated function control corresponds to an associated function response area, an overlap area exists between at least some of the associated function response areas, and the target function response area is composed of the associated function response areas;
the second determining module includes:
and a first determining sub-module, configured to, when the control position acts on the overlap area, determine the at least two first target associated control functions corresponding to the overlap area and determine them as target control functions.
In some embodiments, the second determining module further comprises:
and a second determining sub-module, configured to, when the control position acts on a non-overlap area, determine the second target associated control function corresponding to the non-overlap area and determine it as the target control function.
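Both sub-modules reduce to one hit-test: collect every associated function whose response area contains the control position. A point in an overlap region naturally yields two or more functions, and a point in a non-overlap region yields exactly one. This is an illustrative sketch under assumed rectangular areas and hypothetical function names.

```python
class Rect:
    """Axis-aligned associated function response area (assumed shape)."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h
    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def resolve_target_functions(point, areas):
    """Return every associated control function whose response area contains point.

    areas: dict mapping function name -> Rect.
    Two or more results correspond to the overlap-area case; a single result
    corresponds to the non-overlap case; an empty set means no target function.
    """
    px, py = point
    return {name for name, rect in areas.items() if rect.contains(px, py)}
```

Because the result is a set, the caller can realize all returned functions simultaneously with the designated control function without caring which case occurred.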
In some embodiments, the apparatus further comprises:
and a first prohibition module, configured to, when the character control operation is no longer detected, prohibit the virtual object from realizing the designated control function and the target control function, and hide the associated function controls.
In some embodiments, the apparatus further comprises:
and a second prohibition module, configured to, when the control position acts on the designated response area again, prohibit the virtual object from realizing the target control function and control the virtual character to realize the designated control function.
In some embodiments, the apparatus further comprises:
a third determining module, configured to, when the control position acts on a non-target function response area, determine the non-target control function corresponding to the non-target function response area, where the non-target function response area is the response area corresponding to a control function, in the set of control functions the virtual object can be controlled to realize, other than the designated control function and the associated control functions;
and a control module, configured to prohibit the virtual character from realizing the designated control function and the target control function and control the virtual character to realize the non-target control function.
A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the virtual character control method described above.
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the virtual character control method as described above when executing the program.
The embodiments of the present application display a graphical user interface comprising at least part of a virtual scene, at least part of a virtual object located in the virtual scene, and a designated response area, where the designated response area is the response area corresponding to a designated control function, and the designated control function is a control function that can control the virtual character simultaneously with at least one other of a plurality of control functions. When a character control operation is detected, the control position at which the operation acts on the graphical user interface is obtained, the character control operation being used to control actions of the virtual object. In response to the control position starting in the designated response area, the designated control function corresponding to that area is determined and the virtual object is controlled to realize it. In response to the control position acting on a target function response area, the target control function corresponding to that area is determined and the virtual object is controlled to realize the designated control function and the target control function simultaneously. In this way, the virtual object is controlled to realize multiple control functions by determining the area on which the character control operation acts, so behavior control of the virtual character is simple and continuous, and the efficiency of controlling the virtual character is improved.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1a is a schematic system diagram of a virtual character control system according to an embodiment of the present application.
Fig. 1b is a schematic flow chart of a virtual character control method according to an embodiment of the present application.
Fig. 1c is a schematic diagram of a user interface according to an embodiment of the present application.
Fig. 1d is a schematic diagram of a display associated functionality control according to an embodiment of the present application.
Fig. 1e is a schematic diagram of a functional control response area in a graphical user interface according to an embodiment of the present application.
Fig. 2a is a second flowchart of a virtual character control method according to an embodiment of the present application.
Fig. 2b is a first schematic diagram of a user performing a role control operation in a graphical user interface according to an embodiment of the present application.
Fig. 2c is a second schematic diagram of a user performing a role control operation in a graphical user interface according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a virtual character control device according to an embodiment of the present application.
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiments of the present application provide a virtual character control method and device, a storage medium, and a computer device. Specifically, the virtual character control method in the embodiments of the present application may be executed by a computer device, where the computer device may be a terminal, a server, or a similar device. The terminal may be a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), a personal digital assistant (PDA), or other terminal device, and may further run a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDNs), big data, and artificial intelligence platforms.
For example, when the virtual character control method is run on the terminal, the terminal device stores a game application and presents a part of a game scene in a game through the display component. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when the virtual character control method runs on a server, the game may be a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the running mode of a cloud game, the body that runs the game application is separated from the body that presents the game picture: the storage and execution of the virtual character control method are completed on a cloud game server, while picture presentation is completed at a cloud game client. The cloud game client is mainly used to receive and send game data and present game pictures; for example, it may be a display device with data transmission functions near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the device performing virtual character control is the cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns it to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game picture.
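The cloud-gaming split described above (client forwards instructions; server runs the game, encodes frames, and returns them) can be modeled as a toy round trip. Everything here is a hypothetical illustration: the class names, the `move_right` instruction, and the use of `encode()`/`decode()` as stand-ins for real video compression and network transport.

```python
class CloudGameServer:
    """Runs the game state and returns an 'encoded' picture per instruction."""
    def __init__(self):
        self.state = {"x": 0}

    def handle(self, instruction):
        if instruction == "move_right":
            self.state["x"] += 1
        picture = f"frame(x={self.state['x']})"
        return picture.encode()        # stands in for video encoding/compression

class CloudGameClient:
    """Thin client: sends operation instructions, decodes and presents pictures."""
    def __init__(self, server):
        self.server = server
        self.last_picture = None

    def send(self, instruction):
        encoded = self.server.handle(instruction)   # network hop elided
        self.last_picture = encoded.decode()        # stands in for video decoding
```

The point of the split is that the client never touches game logic; it only transports instructions one way and pictures the other.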
Referring to fig. 1a, fig. 1a is a schematic system diagram of a virtual character control device according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. Terminal 1000 held by a user may be connected to servers of different games through network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining input from a user through touch or slide operations performed at multiple points of one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different terminals 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 so as to be connected via an appropriate network and synchronized with each other to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 while different users play the multiplayer game online.
The embodiments of the present application provide a virtual character control method, which may be executed by a terminal or a server. The embodiments of the present application are described using an example in which the virtual character control method is executed by a terminal. The terminal comprises a display component and a processor, where the display component is used to present a graphical user interface and receive operation instructions generated by the user acting on the display component. When the user operates the graphical user interface through the display component, the graphical user interface can control local content of the terminal by responding to the received operation instructions, and can also control content of a peer server by responding to the received operation instructions. For example, the operation instructions generated by the user for the graphical user interface include an instruction to launch the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. The touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at multiple points on the screen. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, it controls different virtual objects in the game's graphical user interface to perform actions corresponding to that operation. For example, the game may be any of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, a first-person shooter (FPS) game, and the like.
Wherein the game may comprise a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by a user (or player) may be included in the virtual scene of the game. In addition, one or more obstacles, such as rails, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual object, e.g., to limit movement of the one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, energy, etc., to provide assistance to the player, provide virtual services, increase scores related to the player's performance, etc. In addition, the graphical user interface may also present one or more indicators to provide indication information to the player. For example, a game may include a player controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using an Artificial Intelligence (AI) algorithm, implementing a human-machine engagement mode. For example, virtual objects possess various skills or capabilities that a game player uses to achieve a goal. For example, the virtual object may possess one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by the player of the game using one of a plurality of preset touch operations with the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of the user.
It should be noted that, the system schematic diagram of the virtual character control system shown in fig. 1a is only an example, and the virtual character control system and the scenario described in the embodiments of the present application are for more clearly describing the technical solutions of the embodiments of the present application, and do not constitute a limitation to the technical solutions provided in the embodiments of the present application, and those skilled in the art can know that, with the evolution of the virtual character control system and the appearance of a new service scenario, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
In this embodiment, description will be made from the viewpoint of a virtual character control apparatus which can be integrated in a computer device having a storage unit and a microprocessor mounted thereon and having arithmetic capability.
Referring to fig. 1b, fig. 1b is a schematic flow chart of a virtual character control method according to an embodiment of the present application. The virtual character control method comprises the following steps:
in step 101, a graphical user interface is displayed, the graphical user interface comprising at least a portion of a virtual scene, at least a portion of a virtual object located in the virtual scene, and a designated response area.
When the application program runs on the terminal, a three-dimensional virtual environment is provided, which may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The environment picture displayed on the graphical user interface is the picture presented when the virtual object observes the three-dimensional virtual environment. The user controls the virtual object in the game scene through the terminal, and the virtual object observes the three-dimensional virtual environment through a camera model. Taking an FPS game as an example, in the first-person perspective the camera model is located at the head or neck of the first virtual object, and only the arms of the virtual character are displayed in the graphical user interface; in the third-person perspective the camera model is located behind the first virtual object, and the upper body of the virtual character is displayed in the graphical user interface. The graphical user interface is thus the environment picture obtained by observing the three-dimensional virtual environment through the camera model at a certain viewing angle. The designated response area is the response area corresponding to the designated control function, and the designated control function may be displayed in visual form in the graphical user interface. For example, a function control corresponding to the designated control function may be set in the graphical user interface, and the designated response area may be the display area corresponding to that function control, or may include that display area, which is not limited here. The designated control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions.
For example, the aiming control function can control the virtual character simultaneously with the left probe control function, so either of them may serve as a designated control function; by contrast, the leftward-walk and rightward-walk control functions cannot control the virtual character simultaneously, so neither of them may serve as a designated control function.
Specifically, referring to fig. 1c, fig. 1c is a schematic diagram of a user interface according to an embodiment of the present application. The user interface is presented on a screen of terminal 1000 and includes a virtual object 10 manipulated by the user; an aiming identifier 20 that indicates to the user the aiming position of a virtual weapon in the user interface; a cursor control 30 that indicates to the user the current direction information of the virtual object 10; a movement control 40 that controls movement of the virtual object 10 in the three-dimensional virtual environment; an aiming control 50 that can be used when the first virtual object 10 attacks; a map control 60 that indicates to the user the position of the first virtual object 10 in the three-dimensional virtual environment; an attack control 70 that controls an attack operation of the first virtual object 10 in the three-dimensional virtual environment; and the like. An indication control 31 is further disposed in the cursor control 30 and is used to indicate the orientation of the first virtual object 10 within the cursor control 30.
In step 102, when a character control operation is detected, a control position where the character control operation acts on the graphical user interface is acquired.
The character control operation is an operation performed by the user on the graphical user interface displayed in the terminal, such as a click operation or a slide operation. To determine which control function the user requires, the control position at which the character control operation acts in the graphical user interface must be determined.
Specifically, a two-dimensional coordinate system is established with the upper-left corner of the graphical user interface as the origin. Since the character control operation is a click operation or a slide operation, when the user performs the character control operation on the graphical user interface, the coordinates at which the operation acts can be determined from the voltage change of a circuit built into the terminal's display screen. Those coordinates are the control position at which the character control operation acts on the graphical user interface.
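The coordinate-to-control-position step above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the rectangle coordinates and area names are assumptions chosen for the example.

```python
# Hypothetical sketch of resolving a touch point in the GUI coordinate system
# described above (origin at the upper-left corner, x to the right, y down).
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: float  # left edge, in screen pixels
    y: float  # top edge
    w: float  # width
    h: float  # height

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

# Example response areas; the coordinates are made up for illustration.
AIM_AREA = Rect(x=1000, y=400, w=120, h=120)       # a designated response area
LEFT_PROBE_AREA = Rect(x=900, y=300, w=80, h=80)   # an associated response area

def control_position(touch_x: float, touch_y: float) -> tuple:
    """A touch event already arrives as (x, y) in the GUI coordinate
    system, so the control position is simply that coordinate pair."""
    return (touch_x, touch_y)

pos = control_position(1050, 450)
print(AIM_AREA.contains(*pos))  # True
```

A later hit test against each response area then decides which control function the operation addresses.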
In step 103, in response to the control position starting from the specified response area, determining a specified control function corresponding to the specified response area, and controlling the virtual object to implement the specified control function.
The control position starting from the designated response area means that the character control operation itself starts from the designated response area. The designated control function corresponding to the designated response area can therefore be determined, and the virtual object is controlled to implement it.
For example, when the control position starts from the response area corresponding to the aiming control, the virtual object is controlled to realize the aiming function.
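Once the control position is known, determining the designated control function reduces to a lookup keyed by the response area in which the operation started. A minimal sketch, with hypothetical area and function names:

```python
# Illustrative mapping from designated response areas to their designated
# control functions; both sets of names are assumptions for this example.
DESIGNATED_FUNCTIONS = {
    "aim_area": "aim",      # e.g. the response area of the aiming control
    "attack_area": "fire",
}

def on_operation_start(region_name: str):
    """Return the designated control function when a character control
    operation starts inside a designated response area, else None."""
    return DESIGNATED_FUNCTIONS.get(region_name)

print(on_operation_start("aim_area"))      # aim
print(on_operation_start("minimap_area"))  # None
```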
In some embodiments, after the step of controlling the virtual object to implement the specified control function, further comprising:
Displaying associated function controls around the designated response area, where the associated control functions corresponding to the associated function controls are associated with the designated control function, so that they can control the virtual character simultaneously with the designated control function.
As shown in fig. 1d, fig. 1d is a schematic diagram of displaying associated function controls according to an embodiment of the present application. After the virtual object is controlled to implement the designated control function, the associated function controls can be actively displayed around the designated response area; the associated control functions corresponding to these controls are associated with the designated control function so that they can control the virtual character simultaneously with it.
Specifically, as shown by the left probe control 80 and the right probe control 90 in fig. 1d, both are associated function controls of the aiming control 50; they prompt the user that, while controlling the virtual object to implement the aiming function, the virtual character can simultaneously be controlled to implement the left probe function or the right probe function.
In some embodiments, the step of displaying the associated functionality control around the designated response area includes:
(1) Acquiring the action duration of the character control operation within the designated response area;
(2) When the action duration is greater than a preset duration, displaying the associated function controls around the designated response area.
To determine whether the user needs to control the virtual object to implement several different control functions, a judgment can be made as follows: acquire the action duration of the character control operation within the designated response area; when the action duration is greater than the preset duration, display the associated function controls around the designated response area, prompting the user that the associated functions of the virtual character can be implemented through the character control operation. When the action duration is less than the preset duration, the user only needs the virtual character to implement the designated control function corresponding to the designated response area, and the associated function controls need not be displayed in the graphical user interface.
Specifically, the preset duration may be set to 1 second, 2 seconds, etc., which is not limited herein.
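The duration check above is a plain long-press threshold. A sketch under the 1-second example given in the text (the timestamps are illustrative):

```python
# Long-press check: associated controls are revealed only when the press
# inside the designated response area exceeds a preset duration.
PRESET_DURATION = 1.0  # seconds; the text's example value, not a fixed choice

def should_show_associated_controls(press_start: float, now: float) -> bool:
    """True when the character control operation has acted in the
    designated response area longer than the preset duration."""
    return (now - press_start) > PRESET_DURATION

print(should_show_associated_controls(press_start=10.0, now=11.5))  # True
print(should_show_associated_controls(press_start=10.0, now=10.4))  # False
```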
In some embodiments, the method further comprises:
and when the action duration is less than the preset duration, if a sight-direction adjustment operation for the designated response area is detected, adjusting the sight direction of the virtual character.
When aiming, the player needs to adjust the sight direction; therefore, while the action duration is less than the preset duration, a sight-direction adjustment operation can be performed within the designated response area to adjust the sight direction of the virtual character.
Specifically, the sight-direction adjustment operation may be a sliding operation within the designated response area, or the like, which is not limited herein.
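One common way to turn such a slide into a sight-direction change is to map the slide delta to camera yaw and pitch. The sensitivity factor and the yaw/pitch mapping below are assumptions for illustration, not taken from the patent:

```python
# Hedged sketch: a slide delta (dx, dy) in pixels is scaled into a camera
# rotation. SENSITIVITY is an illustrative tuning constant.
SENSITIVITY = 0.1  # degrees of rotation per pixel of slide

def adjust_sight(yaw: float, pitch: float, dx: float, dy: float) -> tuple:
    """Return the new (yaw, pitch); pitch is clamped so the camera
    cannot flip past vertical, yaw wraps around 360 degrees."""
    new_yaw = (yaw + dx * SENSITIVITY) % 360.0
    new_pitch = max(-89.0, min(89.0, pitch - dy * SENSITIVITY))
    return new_yaw, new_pitch

print(adjust_sight(0.0, 0.0, 50, -30))  # (5.0, 3.0)
```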
In step 104, in response to the control position acting on the target function response area, a target control function corresponding to the target function response area is determined, and the virtual object is controlled to simultaneously implement the designated control function and the target control function.
The target function response area is a response area corresponding to a certain target control function in the graphical user interface.
For example, when the control position acts on response area B and the control function corresponding to response area B is determined to be the left probe function, the virtual object is controlled to implement the left probe function while aiming.
In some embodiments, each associated function control corresponds to an associated function response area, an overlapping area exists between at least some of the associated function response areas, and the target function response area is composed of the associated function response areas;
the step of determining the target control function corresponding to the target function response area in response to the control position acting on the target function response area includes:
when the control position acts on the overlapping area, determining at least two first target associated control functions corresponding to the overlapping area, and determining the first target associated control functions as target control functions.
As shown in fig. 1e, fig. 1e is a schematic diagram of function-control response areas in a graphical user interface according to an embodiment of the present application. The associated function controls of the aiming control 50 are the attack control 70, the left probe control 80, and the right probe control 90. The response area corresponding to the attack control 70 is A, the response area corresponding to the left probe control 80 is B, and the response area corresponding to the right probe control 90 is C. An overlapping area exists between A and B, and between A and C. Therefore, to achieve quick control by the user, when the control position acts on an overlapping area, the at least two first target associated control functions corresponding to that overlapping area can be implemented simultaneously: the first target associated control functions are determined as target control functions, and the virtual object is controlled to implement the designated control function and the target control functions at the same time.
For example, when the control position acts on the overlapping area between A and B, the three control functions of aiming, left probe, and firing are implemented by the virtual object; when the control position acts on the overlapping area between A and C, the three control functions of aiming, right probe, and firing are implemented.
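The overlap logic above amounts to collecting every associated response area that contains the control position. A minimal sketch; the rectangles, positions, and function names are illustrative assumptions, only the A/B/C naming follows fig. 1e:

```python
# A = attack, B = left probe, C = right probe; A overlaps both B and C.
from dataclasses import dataclass

@dataclass(frozen=True)
class Area:
    name: str
    function: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

# Coordinates are made up: B overlaps A on the right, C overlaps A on the left.
AREAS = [
    Area("A", "fire",        100, 100, 100, 100),
    Area("B", "left_probe",  160, 100,  80, 100),
    Area("C", "right_probe",  20, 100,  80, 100),
]

def target_functions(px: float, py: float) -> list:
    """All associated control functions whose response area contains the
    control position; two or more results mean an overlapping area was hit."""
    return [a.function for a in AREAS if a.contains(px, py)]

print(target_functions(180, 150))  # ['fire', 'left_probe']  (overlap of A and B)
print(target_functions(230, 150))  # ['left_probe']          (non-overlapping part of B)
```

A point in the A-and-B overlap yields both first target associated control functions at once, matching the "aim + left probe + fire" example in the text.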
In some embodiments, the method further comprises:
and when the control position acts on the non-overlapping area, determining a second target associated control function corresponding to the non-overlapping area, and determining the second target associated control function as a target control function.
When the control position acts on a non-overlapping area, the user needs only a single control function to be implemented; therefore the second target associated control function corresponding to the non-overlapping area is determined and taken as the target control function.
For example, when the user acts on the part of response area C, corresponding to the right probe control, that does not overlap with A, the second target associated control function is the right probe, and the two control functions of aiming and right probe are implemented by the virtual object.
In some embodiments, after the step of controlling the virtual object to simultaneously implement the specified control function and the target control function, the method further includes:
and when the character control operation is no longer detected, prohibiting the control of the virtual object to implement the designated control function and the target control function, and hiding the associated function controls.
When the character control operation is no longer detected, it is judged that the user no longer needs the virtual object to implement any control function; controlling the virtual object to implement the designated control function and the target control function is therefore prohibited, and the associated function controls are hidden.
In some embodiments, after the step of controlling the virtual object to simultaneously implement the specified control function and the target control function, the method further includes:
and when the control position acts on the designated response area again, prohibiting the control of the virtual object to implement the target control function, and controlling the virtual character to implement the designated control function.
When the character control operation starts from the designated response area and its control position later returns to the designated response area, controlling the virtual object to implement the target control function is prohibited, and the virtual character is controlled to implement the designated control function.
For example, the control position corresponding to the character control operation starts from the response area corresponding to the aiming control, then acts on the response area B corresponding to the left probe control 80, and finally returns to the response area corresponding to the aiming control; the functions implemented by the virtual object are, in sequence: aiming; aiming with left probe; aiming.
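The aim / aim-plus-left-probe / aim sequence can be replayed as a small sketch. The area and function names are assumptions; the point is only that re-entering the designated response area drops the target function:

```python
# Replay a list of visited response areas and record the set of active
# control functions after each step of the sliding operation.
def active_functions(path: list) -> list:
    states = []
    for area in path:
        if area == "aim_area":
            # back in the designated response area: only the designated function
            states.append({"aim"})
        elif area == "left_probe_area":
            # designated function plus the target control function
            states.append({"aim", "left_probe"})
    return states

steps = active_functions(["aim_area", "left_probe_area", "aim_area"])
print(steps)  # three states: aim; aim + left probe; aim again
```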
In some embodiments, after the step of controlling the virtual object to simultaneously implement the specified control function and the target control function, the method further includes:
(1) When the control position acts on a non-target function response area, determining the non-target control function corresponding to that area, where the non-target function response area is the response area corresponding to a control function, in the set of control functions the virtual object can implement, other than the designated control function and the associated control functions;
(2) Prohibiting the virtual character from implementing the designated control function and the target control function, and controlling the virtual character to implement the non-target control function.
That is, if the control position acts on a non-target function response area, the non-target control function corresponding to that area is determined; the virtual character is then prohibited from implementing the designated control function and the target control function, and is controlled to implement the non-target control function.
Specifically, the set of control functions implemented by controlling the virtual object includes all control functions for the virtual object; a non-target control function is any control function in this set other than the designated control function and the associated control functions, and the non-target function response area is the response area corresponding to a non-target control function.
For example, the designated control function is the left probe, the target control function is shooting, the non-target function response area is B, and the corresponding non-target control function is jumping. Since jumping cannot control the virtual character simultaneously with the left probe, the virtual character is prohibited from implementing the left probe and shooting functions and is controlled to implement the single jump function.
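The non-target branch is a simple rule: areas belonging to the designated or associated functions extend the active set, anything else replaces it. A sketch with illustrative function names:

```python
# Illustrative classification of control functions; names are assumptions.
DESIGNATED = "left_probe"
ASSOCIATED = {"fire", "right_probe"}

def resolve(area_function: str, active: set) -> set:
    """Update the set of active control functions when the control position
    enters the response area of `area_function`."""
    if area_function == DESIGNATED or area_function in ASSOCIATED:
        return active | {area_function}
    # Non-target control function: the designated and target functions
    # are prohibited and only the single new function remains active.
    return {area_function}

state = {DESIGNATED, "fire"}    # left probe + shooting are active
state = resolve("jump", state)  # slide into the jump response area
print(state)  # {'jump'}
```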
As can be seen from the foregoing, the embodiments of the present application display a graphical user interface that includes at least part of a virtual scene, at least part of a virtual object located in the virtual scene, and a designated response area, where the designated response area is the response area corresponding to a designated control function, and the designated control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions. When a character control operation is detected, the control position at which the character control operation acts on the graphical user interface is acquired, the character control operation being used to control the actions of the virtual object. In response to the control position starting from the designated response area, the designated control function corresponding to the designated response area is determined and the virtual object is controlled to implement it. In response to the control position acting on a target function response area, the target control function corresponding to that area is determined, and the virtual object is controlled to implement the designated control function and the target control function simultaneously. By determining the area on which the character control operation acts, the virtual object is thus controlled to implement multiple control functions, making behavior control of the virtual character simple and continuous and improving the efficiency of controlling the virtual character.
The methods described in connection with the above embodiments are described in further detail below by way of example.
Referring to fig. 2a, fig. 2a is a schematic flow chart of a virtual character control method according to an embodiment of the present application. The flow illustrates implementing the control functions of the virtual object through a sliding operation by the user.
The method flow may include:
in step 201, a computer device displays a graphical user interface including at least a portion of a virtual scene, at least a portion of a virtual object located in the virtual scene, and a designated response area.
The three-dimensional virtual environment is provided while the application program runs on the terminal, and may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The environment picture displayed on the graphical user interface is the picture presented when the virtual object observes the three-dimensional virtual environment. The user controls the virtual object in the game scene through the terminal, and the virtual object observes the three-dimensional virtual environment through a camera model. Taking an FPS game as an example, in the first-person perspective the camera model is positioned at the head or neck of the first virtual object, and only the arm portion of the virtual character is displayed in the graphical user interface; in the third-person perspective the camera model is positioned behind the first virtual object, and only the upper-body portion of the virtual character is displayed. The graphical user interface is thus the environment picture presented by observing the three-dimensional virtual environment through the camera model at a certain viewing angle. The designated response area is the response area corresponding to the designated control function, and the designated control function may be displayed in visual form in the graphical user interface. For example, when a functional control corresponding to the designated control function is set in the graphical user interface, the designated response area may be the display area corresponding to that functional control, or may include the display area, and the like, which is not limited herein. The designated control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions.
For example, the aiming control function can control the virtual character simultaneously with the left probe control function, so either of them may serve as a designated control function; by contrast, the leftward-walk and rightward-walk control functions cannot control the virtual character simultaneously, so neither of them may serve as a designated control function.
Specifically, referring to fig. 1c, fig. 1c is a schematic diagram of a user interface according to an embodiment of the present application. The user interface is presented on a screen of terminal 1000 and includes a virtual object 10 manipulated by the user; an aiming identifier 20 that indicates to the user the aiming position of a virtual weapon in the user interface; a cursor control 30 that indicates to the user the current direction information of the virtual object 10; a movement control 40 that controls movement of the virtual object 10 in the three-dimensional virtual environment; an aiming control 50 that can be used when the first virtual object 10 attacks; a map control 60 that indicates to the user the position of the first virtual object 10 in the three-dimensional virtual environment; an attack control 70 that controls an attack operation of the first virtual object 10 in the three-dimensional virtual environment; and the like. An indication control 31 is further disposed in the cursor control 30 and is used to indicate the orientation of the first virtual object 10 within the cursor control 30.
In step 202, when a character control operation is detected by the computer device, a control position where the character control operation acts on the graphical user interface is acquired.
The character control operation is an operation performed by the user on the graphical user interface displayed in the terminal, such as a click operation or a slide operation. To determine which control function the user requires, the control position at which the character control operation acts in the graphical user interface must be determined.
Specifically, a two-dimensional coordinate system is established with the upper-left corner of the graphical user interface as the origin. Since the character control operation is a click operation or a slide operation, when the user performs the character control operation on the graphical user interface, the coordinates at which the operation acts can be determined from the voltage change of a circuit built into the terminal's display screen. Those coordinates are the control position at which the character control operation acts on the graphical user interface.
In step 203, when the control location starts at the specified response area, the computer device determines a specified control function corresponding to the specified response area, and controls the virtual object to implement the specified control function.
The control position starting from the designated response area means that the character control operation itself starts from the designated response area. The designated control function corresponding to the designated response area can therefore be determined, and the virtual object is controlled to implement it.
For example, when the control position starts from the response area corresponding to the aiming control, the virtual object is controlled to realize the aiming function.
In step 204, the computer device obtains a duration of action of the character control operation within the specified response area.
As shown in fig. 2b, fig. 2b is a first schematic diagram of a user performing a character control operation in a graphical user interface according to an embodiment of the present application. Fig. 2b takes the character control operation as a sliding operation that the user performs with a single finger on the graphical user interface. When the user's character control operation starts from the designated response area, the action duration of the operation within that area is acquired, such as the press duration of the sliding operation on the aiming control.
In step 205, when the action duration is greater than the preset duration, the computer device displays the associated function controls around the designated response area.
To determine whether the user needs to control the virtual object to implement several different control functions, a judgment can be made as follows: acquire the action duration of the character control operation within the designated response area; when the action duration is greater than the preset duration, display the associated function controls around the designated response area, prompting the user that the associated functions of the virtual character can be implemented through the character control operation. When the action duration is less than the preset duration, the user only needs the virtual character to implement the designated control function corresponding to the designated response area, and the associated function controls need not be displayed in the graphical user interface.
Specifically, the preset duration may be set to 1 second, 2 seconds, etc., which is not limited herein.
In step 206, when the control position is applied to the overlapping area, the computer device determines at least two first target-related control functions corresponding to the overlapping area, determines the first target-related control functions as target control functions, and controls the virtual object to simultaneously implement the designated control function and the target control function.
Each associated function control corresponds to an associated function response area, and an overlapping area exists between at least some of the associated function response areas.
Specifically, as shown in fig. 2c, fig. 2c is a second schematic diagram of a user performing a character control operation in a graphical user interface according to an embodiment of the present application. The associated function controls of the aiming control 50 are the attack control 70, the left probe control 80, and the right probe control 90. The response area corresponding to the attack control 70 is A, the response area corresponding to the left probe control 80 is B, and the response area corresponding to the right probe control 90 is C. An overlapping area exists between A and B, and between A and C. Therefore, to achieve quick control by the user, the at least two first target associated control functions corresponding to an overlapping area can be implemented simultaneously when the control position acts on that area.
For example, when the user's finger moves from the aiming control 50 to the overlapping area between A and B, the control functions corresponding to the overlapping area are determined to be the left probe and the attack; the control functions of the virtual character therefore change from aiming to aiming, left probe, and attack.
In step 207, when the character control operation is no longer detected, the computer device prohibits controlling the virtual object to implement the designated control function and the target control function, and hides the associated function controls.
When the character control operation is no longer detected, it is judged that the user no longer needs the virtual object to implement any control function; controlling the virtual object to implement the designated control function and the target control function is therefore prohibited, and the associated function controls are hidden.
In step 208, when the control position acts on the designated response area again, the computer device prohibits controlling the virtual object to implement the target control function and controls the virtual character to implement the designated control function.
When the character control operation starts from the designated response area and its control position later returns to the designated response area, controlling the virtual object to implement the target control function is prohibited, and the virtual character is controlled to implement the designated control function.
For example, the control position corresponding to the character control operation starts from the response area corresponding to the aiming control, then acts on the response area B corresponding to the left probe control 80, and finally returns to the response area corresponding to the aiming control; the functions implemented by the virtual object are, in sequence: aiming; aiming with left probe; aiming.
In step 209, when the control position acts on a non-target function response area, the computer device determines the non-target control function corresponding to that area.
That is, if the control position acts on a non-target function response area, the non-target control function corresponding to that area is determined.
Specifically, the set of control functions implemented by controlling the virtual object includes all control functions for the virtual object; a non-target control function is any control function in this set other than the designated control function and the associated control functions, and the non-target function response area is the response area corresponding to a non-target control function.
In step 210, the computer device prohibits the virtual character from implementing the designated control function and the target control function, and controls the virtual character to implement the non-target control function.
The virtual character is prohibited from implementing the designated control function and the target control function, and is controlled to implement the non-target control function.
For example, the designated control function is the left probe, the target control function is shooting, the non-target function response area is B, and the corresponding non-target control function is jumping. Since jumping cannot control the virtual character simultaneously with the left probe, the virtual character is prohibited from implementing the left probe and shooting functions and is controlled to implement the single jump function.
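Steps 201 to 210 can be combined into one end-to-end sketch of the sliding gesture. Everything below is illustrative: the class, area names, function names, and threshold are assumptions chosen to mirror the flow, not the patent's implementation.

```python
# State machine for the sliding-gesture flow: press in the designated
# response area, reveal associated controls after a long press, activate
# whatever functions lie under the finger, and clear on release.
PRESET_DURATION = 1.0  # seconds; the text's example threshold

class GestureFlow:
    def __init__(self):
        self.active = set()              # currently active control functions
        self.associated_visible = False  # are associated controls shown?

    def press(self, area: str, held_seconds: float):
        if area == "aim_area":                    # designated response area
            self.active = {"aim"}                 # steps 201-203
            if held_seconds > PRESET_DURATION:
                self.associated_visible = True    # step 205

    def slide_to(self, functions: list):
        # functions: every associated function under the current control
        # position; two at once means an overlapping area was hit (step 206)
        if functions:
            self.active = {"aim"} | set(functions)

    def release(self):
        # character control operation no longer detected (step 207)
        self.active = set()
        self.associated_visible = False

flow = GestureFlow()
flow.press("aim_area", held_seconds=1.5)
flow.slide_to(["fire", "left_probe"])   # overlap of A and B
print(sorted(flow.active))              # ['aim', 'fire', 'left_probe']
flow.release()
print(flow.active, flow.associated_visible)  # set() False
```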
As can be seen from the foregoing, the embodiments of the present application display a graphical user interface that includes at least part of a virtual scene, at least part of a virtual object located in the virtual scene, and a designated response area, where the designated response area is the response area corresponding to a designated control function, and the designated control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions. When a character control operation is detected, the control position at which the character control operation acts on the graphical user interface is acquired, the character control operation being used to control the actions of the virtual object. In response to the control position starting from the designated response area, the designated control function corresponding to the designated response area is determined and the virtual object is controlled to implement it. In response to the control position acting on a target function response area, the target control function corresponding to that area is determined, and the virtual object is controlled to implement the designated control function and the target control function simultaneously. By determining the area on which the character control operation acts, the virtual object is thus controlled to implement multiple control functions, making behavior control of the virtual character simple and continuous and improving the efficiency of controlling the virtual character.
To facilitate better implementation of the virtual character control method provided by the embodiments of the present application, an apparatus based on the virtual character control method is also provided. The terms used here have the same meanings as in the virtual character control method above; for specific implementation details, refer to the description of the method embodiments.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a virtual character control device according to an embodiment of the present application, where the virtual character control device may include a first display module 301, an obtaining module 302, a first determining module 303, a second determining module 304, and so on.
A first display module 301, configured to display a graphical user interface, where the graphical user interface includes at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area, where the specified response area is a response area corresponding to a specified control function, and the specified control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions;
an obtaining module 302, configured to obtain, when a character control operation is detected, a control position at which the character control operation acts on the graphical user interface, where the character control operation is used to control an action of the virtual object;
a first determining module 303, configured to determine, in response to the control position starting from the specified response area, the specified control function corresponding to the specified response area, and control the virtual object to implement the specified control function;
a second determining module 304, configured to determine, in response to the control position acting on a target function response area, a target control function corresponding to the target function response area, and control the virtual object to implement the specified control function and the target control function at the same time.
In some embodiments, the apparatus further comprises:
and a second display module, configured to display associated function controls around the specified response area, where the associated control functions corresponding to the associated function controls are associated with the specified control function, so as to control the virtual character simultaneously with the specified control function.
In some embodiments, the second display module includes:
an obtaining sub-module, configured to obtain the duration for which the character control operation acts in the specified response area;
and a display sub-module, configured to display the associated function controls around the specified response area when the duration is longer than a preset duration.
In some embodiments, the second display module includes:
and an adjusting sub-module, configured to adjust the line-of-sight direction of the virtual character if a line-of-sight adjustment operation for the specified response area is detected when the duration is shorter than the preset duration.
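The duration-based branch handled by these sub-modules can be sketched as follows; this is an illustrative assumption, and the threshold value and return labels are hypothetical:

```python
# Hypothetical sketch of the press-duration branch: a long press reveals the
# associated function controls, while a shorter press with movement is treated
# as a line-of-sight adjustment.

HOLD_THRESHOLD = 0.5  # the "preset duration" in seconds (illustrative value)

def handle_press(duration, drag_delta):
    """Decide which behavior a press in the specified response area triggers.

    duration   -- how long the operation has acted in the specified response area
    drag_delta -- (dx, dy) movement of the touch point, used as a line-of-sight
                  adjustment when the press is shorter than the threshold
    """
    if duration > HOLD_THRESHOLD:
        return ("show_associated_controls", None)
    if drag_delta != (0, 0):
        return ("adjust_line_of_sight", drag_delta)
    return ("none", None)
```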
In some embodiments, each associated function control corresponds to an associated function response area, there is an overlapping area between at least some of the associated function response areas, and the target function response area is composed of the associated function response areas;
the second determining module 304 includes:
and a first determining sub-module, configured to determine, when the control position acts on the overlapping area, at least two first target associated control functions corresponding to the overlapping area, and determine the first target associated control functions as target control functions.
In some embodiments, the second determining module 304 further includes:
and a second determining sub-module, configured to determine, when the control position acts on a non-overlapping area, the second target associated control function corresponding to the non-overlapping area, and determine the second target associated control function as the target control function.
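The overlap-based selection described by these sub-modules amounts to a hit test over the associated function response areas: a position inside an overlapping area selects two or more functions at once, while a position inside a non-overlapping area selects only one. A minimal sketch, with rectangles and function names as illustrative assumptions:

```python
# Hypothetical sketch of hit-testing associated function response areas.
# Returning two or more names means the control position fell in an
# overlapping area, so all of those functions become target control functions.

def hit_associated_areas(pos, areas):
    """areas: {function_name: (x, y, w, h)}; returns the list of hit functions."""
    hits = []
    px, py = pos
    for name, (x, y, w, h) in areas.items():
        if x <= px < x + w and y <= py < y + h:
            hits.append(name)
    return hits
```

For example, with "aim" at (0, 0, 60, 40) and "fire" at (40, 0, 60, 40), a touch at (50, 20) lands in the overlap and selects both, while (10, 20) selects only "aim".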
In some embodiments, the apparatus further comprises:
and a first prohibiting module, configured to, when the character control operation is no longer detected, prohibit the virtual object from implementing the specified control function and the target control function, and hide the associated function controls.
In some embodiments, the apparatus further comprises:
and a second prohibiting module, configured to, when the control position acts on the specified response area again, prohibit the virtual object from implementing the target control function, and control the virtual object to implement the specified control function.
In some embodiments, the apparatus further comprises:
a third determining module, configured to determine, when the control position acts on a non-target function response area, a non-target control function corresponding to the non-target function response area, where the non-target function response area is a response area corresponding to a control function, in the set of control functions the virtual object can be controlled to implement, other than the specified control function and the associated control functions;
and a control module, configured to prohibit the virtual object from implementing the specified control function and the target control function, and control the virtual object to implement the non-target control function.
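Taken together, the prohibiting and control modules above describe three cancellation rules: releasing the touch stops everything, returning to the specified response area drops only the target function, and touching a non-target function response area stops both and runs that function instead. A hypothetical sketch (event names and function labels are illustrative):

```python
# Hypothetical sketch of the cancellation rules described by the prohibiting
# and control modules. "active" is the set of functions the virtual object is
# currently implementing.

def next_active(active, event, target=None):
    """Return the new set of active control functions after an event."""
    if event == "touch_up":
        return set()              # character control operation no longer detected
    if event == "back_to_specified":
        return {"specified"}      # drop the target function, keep the specified one
    if event == "non_target_area":
        return {target}           # stop both and run the non-target function
    return active
```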
As can be seen from the foregoing, in the embodiment of the present application, the first display module 301 displays a graphical user interface, where the graphical user interface includes at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area; the specified response area is a response area corresponding to a specified control function, and the specified control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions. When a character control operation is detected, the obtaining module 302 obtains a control position at which the character control operation acts on the graphical user interface, the character control operation being used to control an action of the virtual object. The first determining module 303 determines, in response to the control position starting from the specified response area, the specified control function corresponding to the specified response area, and controls the virtual object to implement the specified control function. The second determining module 304 determines, in response to the control position acting on a target function response area, a target control function corresponding to the target function response area, and controls the virtual object to implement the specified control function and the target control function at the same time. In this way, by determining the area on which the character control operation acts, the virtual object is controlled to implement multiple control functions, so that behavior control of the virtual character is simple and continuous, and the efficiency of controlling the virtual character is improved.
For the specific implementation of the above operations, reference may be made to the previous embodiments; details are not repeated here.
Correspondingly, the embodiment of the present application further provides a computer device, which may be a terminal or a server. The terminal may be a terminal device such as a smartphone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA). As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device structure shown in the figure does not limit the computer device; the computer device may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The processor 401 is the control center of the computer device 400. It connects the various parts of the entire computer device 400 using various interfaces and lines, and performs the various functions of the computer device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and invoking data stored in the memory 402, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to the processes of one or more application programs into the memory 402, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
displaying a graphical user interface, wherein the graphical user interface comprises at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area, the specified response area being a response area corresponding to a specified control function, and the specified control function being a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions; when a character control operation is detected, obtaining a control position at which the character control operation acts on the graphical user interface, the character control operation being used to control an action of the virtual object; in response to the control position starting from the specified response area, determining the specified control function corresponding to the specified response area, and controlling the virtual object to implement the specified control function; and in response to the control position acting on a target function response area, determining a target control function corresponding to the target function response area, and controlling the virtual object to implement the specified control function and the target control function at the same time.
For the specific implementation of the above operations, reference may be made to the previous embodiments; details are not repeated here.
Optionally, as shown in fig. 4, the computer device 400 further includes: a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. Those skilled in the art will appreciate that the computer device structure shown in fig. 4 does not limit the computer device; the computer device may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which execute corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, and sends the touch-point coordinates to the processor 401, and it can also receive and execute commands sent from the processor 401. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel passes the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of the touch event.
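The pipeline just described (touch detection device → touch controller → processor → visual output) can be sketched as a simple event flow; the function names, the coordinate scaling, and the output strings are all illustrative assumptions:

```python
# Hypothetical sketch of the touch pipeline: the touch controller converts a
# raw detection signal into touch-point coordinates, and the processor maps
# the touch event type to a visual output on the display panel.

def touch_controller(raw_signal):
    """Convert a raw detection signal into touch-point coordinates."""
    return (raw_signal["col"] * 2, raw_signal["row"] * 2)  # illustrative scaling

def processor_dispatch(coords, event_type):
    """Return the visual output the processor would render for this event."""
    outputs = {
        "down": f"highlight control at {coords}",
        "move": f"drag indicator to {coords}",
        "up": "restore idle interface",
    }
    return outputs.get(event_type, "ignore")
```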
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display 403 may also implement an input function as part of the input unit 406.
In the embodiment of the application, the processor 401 executes the game application program to generate a graphical user interface on the touch display screen 403, where the virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuitry 404 may be used to transceive radio frequency signals to establish wireless communications with a network device or other computer device via wireless communications.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which the audio circuit 405 receives and converts into audio data. The audio data are then output to the processor 401 for processing and sent via the radio frequency circuit 404 to, for example, another computer device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earbud jack to provide communication between peripheral earbuds and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., and will not be described herein.
In the foregoing embodiments, the description of each embodiment has its own emphasis. For parts of an embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment displays a graphical user interface, where the graphical user interface includes at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area; the specified response area is a response area corresponding to a specified control function, and the specified control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions. When a character control operation is detected, a control position at which the character control operation acts on the graphical user interface is obtained, the character control operation being used to control an action of the virtual object. In response to the control position starting from the specified response area, the specified control function corresponding to the specified response area is determined, and the virtual object is controlled to implement the specified control function. In response to the control position acting on a target function response area, a target control function corresponding to the target function response area is determined, and the virtual object is controlled to implement the specified control function and the target control function at the same time. In this way, by determining the area on which the character control operation acts, the virtual object is controlled to implement multiple control functions, so that behavior control of the virtual character is simple and continuous, and the efficiency of controlling the virtual character is improved.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the above embodiments may be completed by instructions, or by instructions controlling relevant hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium in which a plurality of computer programs are stored, where the computer programs can be loaded by a processor to perform the steps in any of the virtual character control methods provided by the embodiments of the present application. For example, the computer program may perform the following steps:
displaying a graphical user interface, wherein the graphical user interface comprises at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area, the specified response area being a response area corresponding to a specified control function, and the specified control function being a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions; when a character control operation is detected, obtaining a control position at which the character control operation acts on the graphical user interface, the character control operation being used to control an action of the virtual object; in response to the control position starting from the specified response area, determining the specified control function corresponding to the specified response area, and controlling the virtual object to implement the specified control function; and in response to the control position acting on a target function response area, determining a target control function corresponding to the target function response area, and controlling the virtual object to implement the specified control function and the target control function at the same time.
For the specific implementation of the above operations, reference may be made to the previous embodiments; details are not repeated here.
Wherein the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
Since the computer program stored in the storage medium can execute the steps in any virtual character control method provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any virtual character control method provided in the embodiments of the present application; these are described in detail in the previous embodiments and are not repeated here.
The virtual character control method, apparatus, storage medium, and computer device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. A virtual character control method, comprising:
displaying a graphical user interface, wherein the graphical user interface comprises at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area, the specified response area is a response area corresponding to a specified control function, and the specified control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions;
when a character control operation is detected, obtaining a control position at which the character control operation acts on the graphical user interface, wherein the character control operation is used to control an action of the virtual object;
in response to the control position starting from the specified response area, determining the specified control function corresponding to the specified response area, and controlling the virtual object to implement the specified control function;
obtaining a duration for which the character control operation acts in the specified response area;
when the duration is longer than a preset duration, displaying associated function controls around the specified response area, wherein associated control functions corresponding to the associated function controls are used to be associated with the specified control function;
when the duration is shorter than the preset duration, if a line-of-sight adjustment operation for the specified response area is detected, adjusting a line-of-sight direction of the virtual character;
in the process of controlling the virtual object to implement the specified control function, in response to the control position acting on a target function response area, determining a target control function corresponding to the target function response area, and controlling the virtual object to implement the specified control function and the target control function at the same time.
2. The virtual character control method according to claim 1, wherein each associated function control corresponds to an associated function response area, there is an overlapping area between at least some of the associated function response areas, and the target function response area is composed of the associated function response areas;
the step of determining, in response to the control position acting on the target function response area, the target control function corresponding to the target function response area comprises:
when the control position acts on the overlapping area, determining at least two first target associated control functions corresponding to the overlapping area, and determining the first target associated control functions as target control functions.
3. The virtual character control method according to claim 2, wherein the method further comprises:
and when the control position acts on the non-overlapping area, determining a second target associated control function corresponding to the non-overlapping area, and determining the second target associated control function as a target control function.
4. The virtual character control method according to claim 1, further comprising, after the step of controlling the virtual object to simultaneously implement the specified control function and the target control function:
when the character control operation is not detected, prohibiting control of the virtual object to implement the specified control function and the target control function, and hiding the associated function controls.
5. The virtual character control method according to claim 1, further comprising, after the step of controlling the virtual object to simultaneously implement the specified control function and the target control function:
when the control position acts on the specified response area again, prohibiting control of the virtual object to implement the target control function, and controlling the virtual object to implement the specified control function.
6. The virtual character control method according to claim 1, further comprising, after the step of controlling the virtual object to simultaneously implement the specified control function and the target control function:
when the control position acts on a non-target function response area, determining a non-target control function corresponding to the non-target function response area, wherein the non-target function response area is a response area corresponding to a control function, in the set of control functions the virtual object can be controlled to implement, other than the specified control function and the associated control functions;
and prohibiting the virtual object from implementing the specified control function and the target control function, and controlling the virtual object to implement the non-target control function.
7. A virtual character control apparatus, comprising:
a first display module, configured to display a graphical user interface, wherein the graphical user interface comprises at least a part of a virtual scene, at least a part of a virtual object located in the virtual scene, and a specified response area, the specified response area is a response area corresponding to a specified control function, and the specified control function is a control function that controls the virtual character simultaneously with at least one other control function among a plurality of control functions;
an obtaining module, configured to obtain, when a character control operation is detected, a control position at which the character control operation acts on the graphical user interface, wherein the character control operation is used to control an action of the virtual object;
a first determining module, configured to determine, in response to the control position starting from the specified response area, the specified control function corresponding to the specified response area, and control the virtual object to implement the specified control function;
a second display module, comprising:
an obtaining sub-module, configured to obtain a duration for which the character control operation acts in the specified response area;
a display sub-module, configured to display associated function controls around the specified response area when the duration is longer than a preset duration, wherein associated control functions corresponding to the associated function controls are used to be associated with the specified control function;
an adjusting sub-module, configured to adjust a line-of-sight direction of the virtual character if a line-of-sight adjustment operation for the specified response area is detected when the duration is shorter than the preset duration;
and a second determining module, configured to determine, in the process of controlling the virtual object to implement the specified control function, in response to the control position acting on a target function response area, a target control function corresponding to the target function response area, and control the virtual object to implement the specified control function and the target control function at the same time.
8. A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps in the virtual character control method of any one of claims 1 to 6.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the virtual character control method as claimed in any one of claims 1 to 6 when the program is executed.
CN202110786342.0A 2021-07-12 2021-07-12 Virtual character control method, device, storage medium and computer equipment Active CN113398564B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110786342.0A CN113398564B (en) 2021-07-12 2021-07-12 Virtual character control method, device, storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN113398564A CN113398564A (en) 2021-09-17
CN113398564B true CN113398564B (en) 2024-02-13

Family

ID=77686144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110786342.0A Active CN113398564B (en) 2021-07-12 2021-07-12 Virtual character control method, device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN113398564B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007260194A (en) * 2006-03-29 2007-10-11 Konami Digital Entertainment:Kk Game program, game device and game control method
KR101834986B1 (en) * 2017-08-28 2018-03-07 주식회사 솔트랩 Game system and method supporting disappearance processing
CN108553891A (en) * 2018-04-27 2018-09-21 腾讯科技(深圳)有限公司 Object method of sight and device, storage medium and electronic device
CN108771863A (en) * 2018-06-11 2018-11-09 网易(杭州)网络有限公司 The control method and device of shooting game
CN109847370A (en) * 2019-03-26 2019-06-07 网易(杭州)网络有限公司 Control method, device, equipment and the storage medium of shooting game
CN110559662A (en) * 2019-09-12 2019-12-13 腾讯科技(深圳)有限公司 Visual angle switching method, device, terminal and medium in virtual environment
CN110639203A (en) * 2019-09-29 2020-01-03 网易(杭州)网络有限公司 Control response method and device in game
CN111921194A (en) * 2020-08-26 2020-11-13 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and storage medium
CN112933591A (en) * 2021-03-15 2021-06-11 网易(杭州)网络有限公司 Method and device for controlling game virtual character, storage medium and electronic equipment
CN113082718A (en) * 2021-04-19 2021-07-09 网易(杭州)网络有限公司 Game operation method, device, terminal and storage medium


Also Published As

Publication number Publication date
CN113398564A (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113082707B (en) Virtual object prompting method and device, storage medium and computer equipment
CN114522423A (en) Virtual object control method and device, storage medium and computer equipment
CN115193064A (en) Virtual object control method and device, storage medium and computer equipment
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN115212572A (en) Control method and device of game props, computer equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN113398590B (en) Sound processing method, device, computer equipment and storage medium
CN115225926B (en) Game live broadcast picture processing method, device, computer equipment and storage medium
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN116999825A (en) Game control method, game control device, computer equipment and storage medium
CN116474367A (en) Virtual lens control method and device, storage medium and computer equipment
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN116139484A (en) Game function control method, game function control device, storage medium and computer equipment
CN116999835A (en) Game control method, game control device, computer equipment and storage medium
CN117482523A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN116920392A (en) Game information prompting method, device, computer equipment and storage medium
CN116870472A (en) Game view angle switching method and device, computer equipment and storage medium
CN116271791A (en) Game control method, game control device, computer equipment and storage medium
CN116351059A (en) Non-player character control method, device, computer equipment and storage medium
CN116328301A (en) Information prompting method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant