CN111389003A - Game character control method, apparatus, device and computer-readable storage medium - Google Patents

Game character control method, apparatus, device and computer-readable storage medium

Info

Publication number
CN111389003A
Authority
CN
China
Prior art keywords
mapping
touch operation
menu
touch
interface
Prior art date
Legal status
Granted
Application number
CN202010181539.7A
Other languages
Chinese (zh)
Other versions
CN111389003B (en)
Inventor
吴伟迪
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010181539.7A priority Critical patent/CN111389003B/en
Publication of CN111389003A publication Critical patent/CN111389003A/en
Application granted granted Critical
Publication of CN111389003B publication Critical patent/CN111389003B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices by mapping the input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data by mapping control signals received from the input arrangement into game commands

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a game character control method, apparatus, device, and computer-readable storage medium. A graphical user interface is provided through an electronic device, the graphical user interface comprising at least one menu control. The method comprises: in response to a first touch operation acting on a menu control in the graphical user interface, displaying the menu interface corresponding to the menu control together with a game character; and in response to a second touch operation applied to the menu interface, controlling the game character to execute a virtual action corresponding to the second touch operation. With the scheme provided by the present disclosure, when the user performs a second touch operation on the menu interface, the displayed game character executes a virtual action corresponding to that operation. The game character can thus be controlled to give feedback corresponding to the second touch operation, so that the user's control over the character is not interrupted while the menu interface is open, which further improves the user's sense of immersion during the game.

Description

Game character control method, apparatus, device and computer-readable storage medium
Technical Field
The present disclosure relates to mobile terminal technologies, and in particular, to a game character control method, apparatus, device, and computer-readable storage medium.
Background
At present, mobile terminals have become everyday tools. Besides ordinary communication functions, they are widely used for entertainment.
A mobile terminal can run various kinds of game software that the user operates for entertainment. In some character-based games, when the user operates a menu interface in the game, the user's operations are decoupled from the actions of the game character, so the user's sense of immersion is low. How to improve the user's sense of immersion during the game is therefore a technical problem to be solved.
Disclosure of Invention
The present disclosure provides a game character control method, apparatus, device, and computer-readable storage medium, to solve the prior-art problem that the user's sense of immersion during the game is low.
A first aspect of the present disclosure provides a game character control method, in which a graphical user interface is provided through an electronic device, the graphical user interface including at least one menu control. The method includes:
in response to a first touch operation acting on the menu control in the graphical user interface, displaying the menu interface corresponding to the menu control and a game character; and
in response to a second touch operation applied to the menu interface, controlling the game character to execute a virtual action corresponding to the second touch operation.
Another aspect of the present disclosure provides a game character control apparatus applied to an electronic device, a graphical user interface being provided through the electronic device, the graphical user interface including at least one menu control. The apparatus includes:
a display module, configured to display, in response to a first touch operation acting on the menu control in the graphical user interface, the menu interface corresponding to the menu control and a game character; and
a control module, configured to control, in response to a second touch operation applied to the menu interface, the game character to execute a virtual action corresponding to the second touch operation.
Still another aspect of the present disclosure provides a game character control device, including:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the game character control method according to the first aspect.
It is still another aspect of the present disclosure to provide a computer-readable storage medium having stored thereon a computer program to be executed by a processor to implement the game character control method according to the first aspect described above.
The game character control method, apparatus, device, and computer-readable storage medium provided by the present disclosure have the following technical effects:
a graphical user interface is provided through an electronic device, the graphical user interface including at least one menu control; in response to a first touch operation acting on a menu control in the graphical user interface, the menu interface corresponding to the menu control and a game character are displayed; and in response to a second touch operation applied to the menu interface, the game character is controlled to execute a virtual action corresponding to the second touch operation. When the user performs a second touch operation on the menu interface, the displayed game character executes a virtual action corresponding to that operation. The game character can thus be controlled to give feedback corresponding to the second touch operation, so that the user's control over the character is not interrupted while the menu interface is open, which further improves the user's sense of immersion during the game.
Drawings
FIG. 1 is an interface diagram shown in an exemplary embodiment;
FIG. 2 is a flowchart of a game character control method shown in an exemplary embodiment of the present application;
FIG. 3 is a first interface diagram shown in an exemplary embodiment of the present application;
FIG. 4 is a flowchart of a game character control method shown in another exemplary embodiment of the present application;
FIG. 5 is a second interface diagram shown in an exemplary embodiment of the present application;
FIG. 6 is a third interface diagram shown in an exemplary embodiment of the present application;
FIG. 7 is a fourth interface diagram shown in an exemplary embodiment of the present application;
FIG. 8 is a fifth interface diagram shown in an exemplary embodiment of the present application;
FIG. 9 is a sixth interface diagram shown in an exemplary embodiment of the present application;
FIG. 10 is a seventh interface diagram shown in an exemplary embodiment of the present application;
FIG. 11 is a block diagram of a game character control apparatus shown in an exemplary embodiment of the present application;
FIG. 12 is a block diagram of a game character control apparatus shown in another exemplary embodiment of the present application;
FIG. 13 is a block diagram of a game character control device shown in an exemplary embodiment of the present application.
Detailed Description
FIG. 1 is an interface diagram illustrating an exemplary embodiment.
As shown in fig. 1, the electronic device may be provided with game software. By running the game software and providing a graphical user interface, the electronic device presents game screens to the user and provides interaction controls for controlling the game.
The graphical user interface may include at least one interaction control, such as a movement control for controlling the movement of a game character, a skill control for controlling a game character to release skills, a setting control for making game settings, and a plurality of menu controls 11 for presenting different menu interfaces within the game. The user may operate a menu control 11 so that the electronic device displays the menu interface corresponding to that menu control 11.
With this scheme, when the user performs a first touch operation on a menu control in the graphical user interface, the electronic device is triggered to display the menu interface corresponding to that menu control together with a game character in the game; when the user then performs a second touch operation on the menu interface, the game character executes a virtual action corresponding to that operation. The game character can thus be controlled to give feedback corresponding to the second touch operation, so that the user's control over the character is not interrupted in the menu interface, and the user's sense of immersion during the game is improved.
Fig. 2 is a flowchart illustrating a game character control method according to an exemplary embodiment of the present application.
As shown in fig. 2, the method for controlling a game character according to the present embodiment includes:
step 201, responding to a first touch operation acting on a menu control in a graphical user interface, and displaying the menu interface corresponding to the menu control and a game role.
The method provided by the embodiment may be executed by an electronic device, which may be, for example, a mobile terminal, such as a smart phone, a tablet computer, and the like. The electronic equipment is provided with a touch screen, and a user can issue an instruction to the electronic equipment in a mode of touching the touch screen.
The electronic equipment can provide a graphical user interface, and the graphical user interface comprises at least one menu control. For example, reference may be made to the graphical user interface shown in fig. 1.
Specifically, a user may perform a first touch operation on any menu control to send an instruction for opening a menu interface to the electronic device, for example, the user may click any menu control, so that the electronic device displays the corresponding menu interface.
Furthermore, the electronic device can acquire a first touch operation executed by a user on the menu control in the user interface through the touch screen, respond to the first touch operation, display the menu interface corresponding to the menu control, and simultaneously display a game role.
Fig. 3 is a first interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 3, the electronic device may display a menu interface and a game character in response to the first touch operation.
In practice, the game character may be the character the user is currently playing. For example, if the user's account has three game characters and the user selects the first one to log in to the game, the electronic device displays an image of that first character when displaying the menu interface corresponding to the menu control.
In one embodiment, the layer on which the menu interface is located may be to the left or right of the layer on which the game character is located. Taking a backpack control as an example of the menu control, the menu interface corresponding to the backpack control is an item interface: an item display area may be arranged on the right side of the menu interface to display the virtual items owned by the game character, while the game character is displayed to the left of the item area.
Step 202: in response to a second touch operation applied to the menu interface, control the game character to execute a virtual action corresponding to the second touch operation.
While the electronic device displays the menu interface, the user can operate within it. Specifically, a second touch operation may be performed on the menu interface, for example clicking an item in the menu interface, or sliding across the menu interface so that the electronic device displays the next page of the menu.
Specifically, after acquiring the user's second touch operation on the menu interface, the electronic device controls the displayed game character to execute a virtual action corresponding to that operation.
For example, when the user performs a click operation in the menu interface, the game character's finger can be controlled to press down toward a corresponding point of the interface, so that the game character performs a virtual action corresponding to the user's second touch operation.
As another example, when the user performs a sliding operation in the menu interface, the game character can be controlled to perform a corresponding sliding motion along the sliding trajectory, so that the game character performs a virtual action corresponding to the user's touch operation.
Thus, even while the electronic device displays the menu interface, the user can still control the game character's actions, and the virtual action made by the character corresponds to the second touch operation, which improves the user's sense of immersion.
In the game character control method provided by this application, a graphical user interface is provided through an electronic device, the graphical user interface including at least one menu control. The method includes: in response to a first touch operation acting on a menu control in the graphical user interface, displaying the menu interface corresponding to the menu control and a game character; and in response to a second touch operation applied to the menu interface, controlling the game character to execute a virtual action corresponding to the second touch operation. When the user performs a second touch operation on the menu interface, the displayed game character executes a virtual action corresponding to that operation. The game character can thus be controlled to give feedback corresponding to the second touch operation, so that the user's control over the character is not interrupted in the menu interface, and the user's sense of immersion during the game is improved.
Fig. 4 is a flowchart illustrating a game character control method according to an exemplary embodiment of the present application.
As shown in fig. 4, the game character control method provided by the present application includes:
step 401, responding to a first touch operation applied to a menu control in a graphical user interface, and displaying the menu interface corresponding to the menu control and a game role.
The method provided by the embodiment may be executed by an electronic device, which may be, for example, a mobile terminal, such as a smart phone, a tablet computer, and the like. The electronic equipment is provided with a touch screen, and a user can issue an instruction to the electronic equipment in a mode of touching the touch screen.
Fig. 5 is a second interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 5, the menu interface may include item information, equipment information, and skill information. The item information may cover items the game character obtains in the game, such as quest rewards and items picked up during play. Items may be recovery-type consumables, such as health-restoring and mana-restoring potions; equipment, such as hats, shoes, and clothing; or weapons, such as staves and bows. The equipment information may cover the equipment the game character has equipped and replaceable equipment, such as the currently held weapon and the currently equipped breastplate, accessories, and shoes. The skill information may cover, for example, the skills the game character can learn, the skills already learned, and the levels of the learned skills.
The user can operate objects in the menu interface. For example, taking an equipment control as the menu control and the equipment interface as its corresponding menu interface, the user can click a piece of equipment in the equipment interface to equip it on the game character or to replace a weapon.
Alternatively, the game character may be displayed on the left or right side of the menu interface. The figure only schematically shows the character displayed on the left side; it may equally be displayed on the right.
Specifically, when the game character is displayed in response to the first touch operation, it may be displayed in a preset state.
The preset state of the game character may be configured in advance, such as a default standing pose. Preset states may also be configured for different character types: in response to the first touch operation, the character's type is acquired first, and the character is then displayed in the preset state corresponding to that type.
The implementation and principle of step 401 are similar to those of step 201, and are not described again.
Step 402: in response to a second touch operation applied to the menu interface, acquire the touch point position of the second touch operation.
The user may perform a second touch operation in the menu interface, for example clicking an arbitrary position, clicking an item in the menu interface, or sliding within the menu interface. The electronic device responds to the second touch operation by determining its touch point position.
Specifically, if the second touch operation is a click operation, the position where the click is performed is used directly as the touch point position. If it is a sliding operation, the track the user traces across the interface is used as the touch point position. That is, the touch point position may consist of a single point or of a plurality of points.
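The click/slide distinction in step 402 can be sketched as follows. This is a minimal, hypothetical illustration (the patent specifies no code), and all function and parameter names are invented for the example.

```python
# Hypothetical sketch of acquiring the touch point position of a second
# touch operation: a click yields a single point, while a slide yields
# every sampled point along its track. Names are illustrative only.

def touch_point_position(operation_type, sampled_points):
    """Return the touch point position(s) for a second touch operation.

    sampled_points: list of (x, y) screen coordinates reported by the
    touch screen while the finger is down.
    """
    if operation_type == "click":
        # A click operation is represented by its single touch position.
        return sampled_points[:1]
    if operation_type == "slide":
        # A sliding operation is represented by the whole traced track.
        return list(sampled_points)
    raise ValueError("unsupported second touch operation: %r" % operation_type)
```

For a click sampled at several nearby positions, only the first position is kept; for a slide, the full list of sampled positions forms the track.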
Step 403: determine, in the virtual mapping area, the mapping position corresponding to the touch point position.
Further, in the method provided by this application, when the electronic device responds to the first touch operation acting on the menu control and displays the menu interface and the game character, the graphical user interface may further include a virtual mapping area corresponding to the menu interface.
Fig. 6 is a third interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 6, the graphical user interface may include a virtual mapping area 61 corresponding to the menu interface; for example, the aspect ratio of the virtual mapping area 61 is the same as that of the menu interface.
The virtual mapping area 61 may be provided at the position where the game character is located, for example over the upper half of the character, or centered on the character.
In the graphical user interface, both the game character and the menu interface can be displayed directly, while the virtual mapping area can be hidden, i.e. invisible to the user.
Specifically, when the user performs a second touch operation on the menu interface, the electronic device acquires the touch point position of that operation in response, and then determines the corresponding mapping position in the virtual mapping area. For example, when the user operates in the upper-right corner of the menu interface, a mapping position is determined in the upper-right corner of the virtual mapping area.
Further, when the user performs the second touch operation in the menu interface, an instruction is sent to the electronic device by touching its screen. On sensing the second touch operation, the electronic device can determine the relative position of its touch point in the graphical user interface; for example, taking the upper-left corner as the origin, the relative position may be expressed as the touch point's proportional position along the horizontal axis and along the vertical axis.
In practical application, the mapping position is determined in the virtual mapping area according to the relative position of the second touch operation in the graphical user interface. For example, if the relative position is in the upper-left corner, the corresponding upper-left position is determined in the virtual mapping area. As another example, if the touch point lies at 30% of the width along the horizontal axis and 70% of the height along the vertical axis, the mapping position is the point in the virtual mapping area at 30% of its width and 70% of its height.
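The relative-position mapping described above can be sketched as follows, assuming both the interface and the virtual mapping area are axis-aligned rectangles with the upper-left corner as origin. All names are hypothetical; the patent does not prescribe an implementation.

```python
def map_to_virtual_area(touch, interface_size, area_origin, area_size):
    """Map a touch point in the graphical user interface to the point at the
    same relative position inside the virtual mapping area."""
    tx, ty = touch
    iw, ih = interface_size
    # Relative position: fractions of the interface width and height.
    rx, ry = tx / iw, ty / ih
    ox, oy = area_origin
    aw, ah = area_size
    # Apply the same fractions inside the virtual mapping area.
    return (ox + rx * aw, oy + ry * ah)
```

For the 30%/70% example above, a touch at (300, 700) in a 1000 x 1000 interface maps to (30.0, 70.0) in a 100 x 100 virtual mapping area anchored at the origin.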
And step 404, controlling the game character to execute the virtual action according to the mapping position.
In practical application, the game character is controlled to execute a virtual action according to the determined mapping position. For example, the character's hand may be controlled to point toward the mapping position, or to move to the mapping position, so that the character responds to the user's second touch operation.
Based on the method provided by this embodiment, the game character can be controlled to give different feedback for different second touch operations.
For example, when the user performs a click operation in the menu interface, a mapping position is determined in the virtual mapping area according to the position of the click, and the game character's finger is controlled to point toward that mapping position, so that the character makes an action corresponding to the user's second touch operation and the user's sense of immersion is improved.
As another example, when the user performs a sliding operation in the menu interface, a corresponding track is determined in the virtual mapping area according to the sliding trajectory, and the game character is controlled to perform a corresponding sliding motion along that track, so that the character makes an action corresponding to the user's second touch operation.
Step 405: in response to the end of the second touch operation, control the game character to restore the preset state.
Optionally, in this embodiment the second touch operation may be considered ended when its touch point leaves the graphical user interface; for example, when the second touch operation is a click, the operation is considered ended once the user lifts the finger.
The electronic device responds to the end of the second touch operation by controlling the game character to restore the preset state. For example, once the user lifts the finger, the game character returns to the preset state, e.g. a standing posture with both hands hanging down.
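Steps 404 and 405 together amount to a small state machine: the character leaves its preset state while the second touch operation is in progress and restores it when the operation ends. A minimal sketch, with class and method names invented for illustration:

```python
class GameCharacter:
    """Minimal model of a displayed game character's pose (hypothetical)."""

    PRESET_STATE = "standing, hands down"  # default preset state (step 401)

    def __init__(self):
        self.state = self.PRESET_STATE

    def on_second_touch(self, mapping_position):
        # Step 404: execute a virtual action toward the mapping position.
        self.state = ("pointing at", mapping_position)

    def on_touch_end(self):
        # Step 405: the touch point left the screen; restore the preset state.
        self.state = self.PRESET_STATE
```

A click would thus move the character into a pointing pose and, on finger lift, back to the standing pose.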
Optionally, the method provided in this embodiment may further include:
step 406, adjusting the size of the menu interface in response to the second touch operation applied to the menu interface.
The execution timing of steps 406 and 402 and 404 is not limited.
The following uses a specific example to describe how the method provided by this application controls a game character to execute a virtual action.
When the second touch operation is a click operation, the game character is controlled to point toward the mapping position.
Specifically, if the second touch operation is a click operation, the electronic device determines the touch point position of the operation in the graphical user interface, for example determining a coordinate (X, Y) from the touched pixel, and then determines the mapping position corresponding to that touch point position in the virtual mapping area.
Fig. 7 is a fourth interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 7, the user may click on location a in the user interface, 71 being the mapping area. The mapping location 72 may be determined in the mapping region.
In order to improve the substituting feeling of the user, when the user performs the click operation in the menu interface, the electronic device can control the game character to perform the corresponding click operation. Specifically, the game character can be controlled to point to the determined mapping position, so that the game character responds to the second touch operation of the user.
Since the mapping position is determined according to the touch point position of the second touch operation, when the user performs a click operation, the action of the game character pointing to the mapping position can simulate the click operation of the user, so that the game character is consistent with the behavior of the user.
Specifically, the target element corresponding to the click operation in the menu interface may be selected according to the click operation. The target element refers to any element in the menu interface corresponding to the clicking operation. For example, if the user clicks a weapon in the menu interface, the weapon is the target element, and the electronic device may select the weapon in response to the user clicking.
Furthermore, the electronic device may highlight the target element corresponding to the click operation, or outline it with a selection frame, so that the target element is in the selected state.
If the second touch operation is a sliding operation, the method further includes:
determining a sliding track in a menu interface according to the position of a touch point of the sliding operation; and determining a mapping track in the virtual mapping area according to the mapping position.
Optionally, the hand of the game character can be controlled to move along the mapping track.
In the method provided by this embodiment, the second touch operation may be a sliding operation applied to the menu interface, and the user may execute the sliding operation in the menu interface, so that the electronic device senses the touch operation and controls the game character to perform a corresponding virtual action based on the sliding operation.
Specifically, the electronic device may determine the sliding track of the sliding operation in the menu interface. For example, a plurality of touch positions, such as (X1, Y1), (X2, Y2) … (Xn, Yn), may be determined from the touched pixels and connected to form the sliding track. The sliding track comprises the touch points corresponding to it.
Further, the electronic device may determine a mapping position in the virtual mapping area according to the touch points generated by the sliding operation, for example, each touch point may correspond to a mapping position. The way of determining the mapping position according to the touch point may be similar to the embodiment shown in fig. 4.
In actual application, the mapping track can be determined in the virtual mapping area according to the determined mapping position. For example, the mapping locations may be concatenated to obtain a mapping track.
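The point-by-point construction of the mapping track described above can be sketched as follows. This reuses the same relative-position rule as before and is an illustrative assumption (rectangles as (left, top, width, height) tuples, hypothetical function name), not an excerpt from the patent:

```python
def map_track(points, menu_rect, map_rect):
    """Map each touch point of a sliding track into the virtual
    mapping area; connecting the results yields the mapping track."""
    mx, my, mw, mh = menu_rect
    ax, ay, aw, ah = map_rect
    return [(ax + (x - mx) / mw * aw, ay + (y - my) / mh * ah)
            for x, y in points]

# Touch points (X1, Y1) ... (Xn, Yn) sampled along the slide.
track = [(100, 100), (200, 150), (300, 100)]
mapped = map_track(track, (100, 100, 400, 200), (600, 100, 200, 100))
print(mapped)  # [(600.0, 100.0), (650.0, 125.0), (700.0, 100.0)]
```

Since every sampled touch point is mapped with the same rule, the connected mapping track preserves the shape of the user's sliding track, as described for fig. 8.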
Fig. 8 is a fifth interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 8, the user may perform a sliding operation in the menu interface; the sliding track may be, for example, a curve L1, and the electronic device may obtain the coordinates (x1, y1), (x2, y2) … (xn, yn) of the touch point positions included in L1.
The mapping track 82 may be determined in the mapping area; the mapping track 82 may conform to the shape of the sliding track L1 and may be obtained by connecting the mapping positions.
Optionally, the relative position of the mapping track 82 in the virtual mapping area 81 is consistent with the relative position of the sliding track in the menu interface.
In order to enable the user to control the game character while the menu interface is displayed, when the user performs a sliding operation in the menu interface, the electronic device may control the game character to perform a corresponding sliding action. Specifically, the game character can be controlled to move along the mapping track in the virtual mapping area, so that the game character responds to the second touch operation.
Optionally, in an implementation, the electronic device may further adjust the size of the menu interface according to a sliding track of the sliding operation.
In this way, even when the electronic device displays a menu interface, the user's control of the game character is not interrupted, and the resulting feedback can enhance the user's sense of immersion, further improving the user experience. The sliding operation applied to the menu interface may be set in advance as an operation for adjusting the size of the menu interface; when the second touch operation is a sliding operation, the electronic device may adjust the size of the menu interface in response to it. For example, when the user slides rightward in a designated area of the menu interface, the electronic device may enlarge the menu interface, and when the user slides leftward in the designated area, the electronic device may reduce it.
Specifically, after the user releases the second touch operation, the menu interface may retain the size it had immediately before the release.
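The direction-based resizing rule above could look like the following sketch. The scale step and the minimum/maximum widths are assumptions chosen for illustration; the patent only fixes the direction-to-action mapping (right enlarges, left shrinks):

```python
def resize_menu(width, dx, scale_step=1.2, min_w=200, max_w=800):
    """Adjust the menu width from a horizontal slide: sliding right
    (dx > 0) enlarges the menu, sliding left (dx < 0) shrinks it.
    The result is clamped to an assumed [min_w, max_w] range."""
    if dx > 0:
        width *= scale_step
    elif dx < 0:
        width /= scale_step
    return max(min_w, min(max_w, width))

print(resize_menu(400, dx=50))   # 480.0 (enlarged)
print(resize_menu(400, dx=-50))  # ~333.3 (reduced)
```

Calling it once per recognised slide keeps the last computed size after the touch is released, matching the "retain the size before release" behaviour described above.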
Optionally, when the second touch operation is a sliding operation, the embodiment further provides another method for controlling the game character to execute a virtual action.
The second touch operation is a sliding operation, and an initial touch point position and a final touch point position corresponding to the sliding operation are determined in the menu interface; and determining a mapping starting point and a mapping end point in the virtual mapping area according to the initial touch point position and the final touch point position.
Optionally, the hand of the game character can be controlled to move from the mapping starting point to the mapping end point according to a preset track.
In the method provided by this embodiment, the second touch operation may be a sliding operation applied to the menu interface, and the user may execute the sliding operation in the menu interface, so that the electronic device senses the touch operation and controls the game character to perform a corresponding virtual action based on the sliding operation.
Specifically, the electronic device may determine the initial touch point position and the final touch point position corresponding to the sliding operation in the menu interface. For example, the two touch points can be determined from the start point and the end point of the sliding operation: the start point of the sliding operation may be taken as the initial touch point position and the end point as the final touch point position, and the coordinates (Xs, Ys) and (Xe, Ye) of the two points may be acquired.
Further, a mapping start point and a mapping end point corresponding to the initial touch point position and the final touch point position, respectively, may be determined in the virtual mapping area. For example, the relative start point position and the relative end point position of the sliding start point and the sliding end point in the menu interface may be determined, and the mapping start point and the mapping end point may be determined in the virtual mapping area according to the relative start point position and the relative end point position, so that the relative position of the mapping start point in the virtual mapping area coincides with the relative position of the sliding start point in the menu interface, and the relative position of the mapping end point in the virtual mapping area coincides with the relative position of the sliding end point in the menu interface.
Fig. 9 is a sixth interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 9, the user may perform a sliding operation in the menu interface; the sliding track may be, for example, a curve L1. The electronic device may obtain the coordinates (Xs, Ys) of the initial touch point position of L1 and the coordinates (Xe, Ye) of the final touch point position of L1.
Fig. 9 shows a virtual map area 91. A mapping start point 92 corresponding to the initial touch point position and a mapping end point 93 corresponding to the final touch point position may be determined in the virtual mapping area.
In order to enable the user to control the game character in the menu interface, when the user performs a sliding operation in the menu interface, the electronic device may control the game character to perform a corresponding sliding action. Specifically, the game character can be controlled to move from the determined mapping start point to the mapping end point along the preset track. Compared with the above embodiment, the method provided by this embodiment only needs to determine two positions in the virtual mapping area for the game character to perform the sliding action corresponding to the user's operation, which reduces the amount of data processing.
The preset track may be set according to requirements, and may be, for example, a curve located between the mapping start point and the mapping end point, or a straight line connecting the mapping start point and the mapping end point.
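The simplest preset track — a straight line between the mapping start point and the mapping end point — can be generated by linear interpolation. The step count and the choice of a straight line over a curve are implementation assumptions, not fixed by the patent:

```python
def preset_track(start, end, steps=4):
    """Generate a straight-line preset track from the mapping start
    point to the mapping end point, sampled at steps+1 positions."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
            for i in range(steps + 1)]

print(preset_track((600, 100), (700, 150), steps=2))
# [(600.0, 100.0), (650.0, 125.0), (700.0, 150.0)]
```

A curved preset track could be produced the same way by interpolating along, for example, a Bézier curve instead of a line; only the two endpoints need to come from the mapping step.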
Optionally, when the relative positions of the mapping start point and the mapping end point in the virtual mapping area are consistent with the relative positions of the initial touch point position and the final touch point position in the menu interface, when the user performs a sliding operation, the game character may also simulate the sliding operation at the corresponding position, so that the game character is consistent with the behavior of the user.
Optionally, in this embodiment, the electronic device may further adjust the size of the menu interface according to the initial touch point position and the final touch point position.
The sliding operation applied to the menu interface may be set in advance as an operation for adjusting the size of the menu interface, and when the second touch operation is a sliding operation, the electronic device may adjust the size of the menu interface in response to the second touch operation. For example, when the user performs a rightward sliding operation in a designated area of the menu interface, the electronic device may enlarge the size of the menu interface. When the user performs a leftward sliding operation in the designated area of the menu interface, the electronic device may reduce the size of the menu interface.
Specifically, the electronic device may adjust the size of the menu interface according to the initial touch point position and the final touch point position in the sliding operation process. For example, the size of the menu interface may be adjusted according to the distance between the initial touch point position and the final touch point position.
After the user releases the second touch operation, the menu interface may retain the size it had immediately before the release.
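Adjusting the menu size from the distance between the initial and final touch point positions, as just described, might be sketched as follows. The linear scaling rule and the reference distance `px_per_scale` are assumptions for illustration:

```python
import math

def resize_by_drag(base_size, start, end, px_per_scale=200):
    """Scale the menu according to the distance between the initial
    touch point position and the final touch point position.

    base_size    -- (width, height) of the menu before the slide
    start, end   -- (Xs, Ys) and (Xe, Ye) touch point positions
    px_per_scale -- assumed drag distance that doubles the size
    """
    w, h = base_size
    d = math.dist(start, end)        # Euclidean drag distance
    factor = 1 + d / px_per_scale
    return (w * factor, h * factor)

print(resize_by_drag((400, 200), (100, 100), (220, 260)))  # (800.0, 400.0)
```

A drag of 200 px doubles the menu here; tying the factor to drag distance means a longer slide produces a larger adjustment, and the final size is simply whatever was last computed when the touch is released.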
The second touch operation is a multi-point touch operation, a first touch point position and a second touch point position of the multi-point touch operation are obtained, and a first mapping position corresponding to the first touch point position and a second mapping position corresponding to the second touch point position are determined in the virtual mapping area.
Two touch operation points with the farthest distance can be determined in the multi-point touch operation and are respectively used as a first touch point position and a second touch point position.
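Selecting the two farthest-apart points of a multi-point touch can be done by a pairwise comparison, which is cheap for the handful of simultaneous touch points a screen reports. This is an illustrative sketch with a hypothetical function name:

```python
import math
from itertools import combinations

def farthest_pair(points):
    """Pick the two touch points with the greatest separation from a
    multi-point touch operation; these serve as the first and second
    touch point positions."""
    return max(combinations(points, 2),
               key=lambda pair: math.dist(pair[0], pair[1]))

touches = [(100, 100), (150, 120), (300, 250)]
print(farthest_pair(touches))  # ((100, 100), (300, 250))
```

The O(n²) comparison is acceptable because n is the number of fingers on the screen; the two selected positions are then mapped into the virtual mapping area exactly as single touch points are.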
For the two determined touch operation points, corresponding mapping positions can be respectively determined in the virtual mapping area. Optionally, the electronic device may further control the two hands of the game character to point to the first mapping position and the second mapping position respectively.
The distance between the two hands of the game character may also be controlled in accordance with a change in the distance between the first mapped position and the second mapped position. The change in the distance between the first mapped position and the second mapped position corresponds to the actual multi-touch operation performed by the user, thereby enabling the game character to respond to the second touch operation of the user.
A target element may be displayed in the menu interface between the two hands of the game character, and the size of the target element may be adjusted according to the distance between the two hands. The target element may be an element selected by the electronic device in response to a click operation on the menu interface, for example equipment, a weapon, or an item. When the distance between the game character's hands increases, the target element can be correspondingly enlarged; when the distance decreases, the target element can be correspondingly reduced.
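Scaling the target element with the hand separation, as described above, can be reduced to a proportion. The linear proportionality and the reference distance are assumptions; the patent only requires that the element grows as the hands move apart and shrinks as they come together:

```python
def scale_target(base_size, ref_dist, hand_dist):
    """Scale the selected target element in proportion to the distance
    between the game character's hands.

    base_size -- element size at the assumed reference hand distance
    ref_dist  -- hand distance at which the element has base_size
    hand_dist -- current distance between the two hands
    """
    return base_size * hand_dist / ref_dist

print(scale_target(100, ref_dist=50, hand_dist=100))  # 200.0 (enlarged)
print(scale_target(100, ref_dist=50, hand_dist=25))   # 50.0 (reduced)
```

Because the hand distance itself tracks the distance between the first and second mapping positions, the element's size ends up following the user's pinch-out or pinch-in gesture.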
The user can perform a multi-touch operation in the menu interface, for example, an operation in which a plurality of fingers are gradually separated, so that the electronic device senses the touch operation and controls the game character to perform a corresponding virtual action based on the multi-touch operation.
Fig. 10 is a seventh interface diagram illustration shown in an exemplary embodiment of the present application.
As shown in fig. 10, the present embodiment is schematically illustrated by a two-finger operation on the menu interface.
The user can use two fingers to operate in the menu interface, and specifically, the two fingers can be gradually separated, for example, the two fingers of the user slide along the direction of the arrow shown in the figure.
After the electronic device senses the user's operation, it may obtain the first touch point position and the second touch point position corresponding to the two-finger touch operation. If the two fingers gradually separate, the two touch point positions change accordingly.
Thereafter, a first mapping location and a second mapping location corresponding to the operation may be determined in the virtual mapping region. When the positions of the first touch point and the second touch point are changed, the determined first mapping position and the second mapping position are also changed.
When the electronic equipment displays the menu interface, the electronic equipment can sense multi-point touch operation made by a user and determine a first mapping position and a second mapping position. The distance between the first mapping location and the second mapping location may vary according to the actual operation of the user.
For example, the region 1001 in fig. 10 is determined as a virtual mapping region, the 1002 position in fig. 10 is determined as a first mapping position, and the 1003 position is determined as a second mapping position.
In practical application, the electronic device can control the two hands of the game character to move to the first mapping position and the second mapping position respectively, and perform an action of unfolding the arms. For example, the left hand of the control game character points to a first mapped location and the right hand of the control game character points to a second mapped location.
Optionally, the currently selected target element may be further obtained, and when the left hand of the game character points to the first mapping position and the right hand of the game character is controlled to point to the second mapping position, the target element may be further displayed between the two hands of the game character, so that the effect that the two hands of the game character hold the target element is displayed in the menu interface. And the display effect of the target element can be adjusted according to the actual touch operation of the user. Therefore, when the user performs multi-touch operation, the electronic equipment can control the game role to hold the target element.
Fig. 11 is a block diagram of a game character control apparatus according to an exemplary embodiment of the present application.
As shown in fig. 11, the game character control apparatus provided in this embodiment is applied to an electronic device, and provides a graphical user interface through the electronic device, where the graphical user interface includes at least one menu control, and the apparatus includes:
the display module 111 is configured to respond to a first touch operation acting on the menu control in the graphical user interface, and display a menu interface corresponding to the menu control and a game character;
the control module 112 is configured to respond to a second touch operation applied to the menu interface, and control the game character to execute a virtual action corresponding to the second touch operation.
The game character control device provided by this embodiment is applied to an electronic device, and a graphical user interface including at least one menu control is provided through the electronic device. The device comprises: a display module, configured to respond to a first touch operation acting on the menu control in the graphical user interface and display the menu interface corresponding to the menu control together with a game character; and a control module, configured to respond to a second touch operation acting on the menu interface and control the game character to execute a virtual action corresponding to the second touch operation. With the control device provided by this embodiment, when the user performs the second touch operation on the menu interface, the displayed game character executes the virtual action corresponding to that operation. Because the device makes the game character give feedback corresponding to the second touch operation, the user's control of the character is not interrupted while the menu interface is displayed, which further enhances the user's sense of immersion during the game.
The specific principle and implementation of the game character control device provided in this embodiment are similar to those of the embodiment shown in fig. 2, and are not described here again.
Fig. 12 is a block diagram illustrating a game character control apparatus according to another exemplary embodiment of the present invention.
As shown in fig. 12, based on the above embodiment, in the game character control apparatus provided in this embodiment, optionally, the graphical user interface includes a virtual mapping area corresponding to the menu interface;
the control module 112 is specifically configured to:
responding to a second touch operation acting on the menu interface, and acquiring a touch point position of the second touch operation;
determining a mapping position corresponding to the touch point position in the virtual mapping area;
and controlling the game character to execute the virtual action according to the mapping position.
Optionally, the menu interface includes any one of the following information:
item information, equipment information, and skill information.
Optionally, the game character is displayed on the left side or the right side of the menu interface.
Optionally, the display module 111 is specifically configured to:
and responding to the first touch operation, and displaying the game role according to a preset state.
Optionally, the second touch operation is a click operation, and the control module 112 is specifically configured to:
and controlling the game character to point to the mapping position.
Optionally, the apparatus further includes a selecting module 113 configured to:
and according to the clicking operation, selecting a target element corresponding to the clicking operation in the menu interface.
Optionally, the second touch operation is a sliding operation, and the control module 112 is specifically configured to:
determining a sliding track in the menu interface according to the position of the touch point of the sliding operation;
and determining a mapping track in the virtual mapping area according to the mapping position.
Optionally, the control module 112 is specifically configured to:
and controlling the hand of the game character to move along the mapping track.
Optionally, the apparatus further comprises an adjusting module 114, configured to:
and adjusting the size of the menu interface according to the sliding track of the sliding operation.
Optionally, the second touch operation is a sliding operation, and the control module 112 is specifically configured to:
determining an initial touch point position and a final touch point position of the sliding operation in the menu interface;
and determining a mapping starting point and a mapping end point in the virtual mapping area according to the initial touch point position and the final touch point position.
Optionally, the control module 112 is specifically configured to:
and controlling the hand of the game character to move from the mapping start point to the mapping end point along a preset track.
Optionally, the adjusting module 114 is further configured to: and adjusting the size of the menu interface according to the initial touch point position and the final touch point position.
Optionally, if the second touch operation is a multi-touch operation, the control module 112 is specifically configured to:
acquiring a first touch point position and a second touch point position of the multi-point touch operation;
and determining a first mapping position corresponding to the first touch point position and a second mapping position corresponding to the second touch point position in the virtual mapping area.
Optionally, the control module 112 is specifically configured to:
and controlling two hands of the game character to point to the first mapping position and the second mapping position respectively.
Optionally, the control module 112 is further configured to:
controlling a distance between the two hands of the game character according to a change in the distance between the first mapped position and the second mapped position.
Optionally, the control module 112 is further configured to:
displaying a target element in a menu interface between two hands of the game character;
and adjusting the size of the target element according to the distance between the two hands.
Optionally, the control module 112 is further configured to:
and responding to the end of the second touch operation, and controlling the game character to restore the preset state.
The specific principle and implementation of the game character control device provided in this embodiment are similar to those of the embodiments shown in fig. 3 to 10, and are not described herein again.
Fig. 13 is a block diagram illustrating a game character control apparatus according to an exemplary embodiment of the present invention.
As shown in fig. 13, the game character control apparatus provided in the present embodiment includes:
a memory 1301;
a processor 1302; and
a computer program;
wherein the computer program is stored in the memory 1301 and configured to be executed by the processor 1302 to implement any of the game character control methods described above.
The present embodiments also provide a computer-readable storage medium, having stored thereon a computer program,
the computer program is executed by a processor to implement any of the game character control methods described above.
The present embodiment also provides a computer program including a program code that executes any one of the game character control methods described above when the computer program is run by a computer.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (21)

1. A method for controlling a game character, wherein a graphical user interface is provided by an electronic device, the graphical user interface comprising at least one menu control, the method comprising:
responding to a first touch operation acting on the menu control in the graphical user interface, and displaying the menu interface corresponding to the menu control and a game character;
and responding to a second touch operation acting on the menu interface, and controlling the game character to execute a virtual action corresponding to the second touch operation.
2. The method of claim 1, wherein the graphical user interface comprises a virtual mapping area corresponding to the menu interface;
the responding to a second touch operation acted on the menu interface, and controlling the game role to execute a virtual action corresponding to the second touch operation comprises the following steps:
responding to a second touch operation acting on the menu interface, and acquiring a touch point position of the second touch operation;
determining a mapping position corresponding to the touch point position in the virtual mapping area;
and controlling the game character to execute the virtual action according to the mapping position.
3. The method of claim 1, wherein the menu interface comprises any one of the following information:
item information, equipment information, and skill information.
4. The method of claim 1, wherein the game character is displayed on the left or right side of the menu interface.
5. The method of claim 1, wherein the game character is displayed according to a preset state in response to the first touch operation.
6. The method of claim 2, wherein the second touch operation is a click operation, and wherein the controlling the game character to perform the virtual action according to the mapped position comprises:
and controlling the game character to point to the mapping position.
7. The method of claim 6, further comprising:
and according to the clicking operation, selecting a target element corresponding to the clicking operation in the menu interface.
8. The method of claim 2, wherein the second touch operation is a slide operation; the method further comprises the following steps:
determining a sliding track in the menu interface according to the position of the touch point of the sliding operation;
and determining a mapping track in the virtual mapping area according to the mapping position.
9. The method of claim 8, wherein said controlling the game character to perform the virtual action according to the mapped position comprises:
and controlling the hand of the game character to move along the mapping track.
10. The method of claim 8, further comprising:
and adjusting the size of the menu interface according to the sliding track of the sliding operation.
11. The method of claim 2, wherein the second touch operation is a slide operation;
the responding to a second touch operation acting on the menu interface, and acquiring the position of the touch point of the second touch operation, including:
determining an initial touch point position and a final touch point position of the sliding operation in the menu interface;
the determining a mapping position corresponding to the touch point position in the virtual mapping area includes:
and determining a mapping starting point and a mapping end point in the virtual mapping area according to the initial touch point position and the final touch point position.
12. The method of claim 11, wherein said controlling the game character to perform the virtual action according to the mapped position comprises:
and controlling the hand of the game character to move from the mapping start point to the mapping end point along a preset track.
13. The method of claim 11, further comprising:
and adjusting the size of the menu interface according to the initial touch point position and the final touch point position.
14. The method of claim 2, wherein the second touch operation is a multi-touch operation;
the obtaining of the touch point position of the second touch operation includes:
acquiring a first touch point position and a second touch point position of the multi-point touch operation;
the determining a mapping position corresponding to the touch point position in the virtual mapping area includes:
and determining a first mapping position corresponding to the first touch point position and a second mapping position corresponding to the second touch point position in the virtual mapping area.
15. The method of claim 14, wherein said controlling the game character to perform the virtual action according to the mapped position comprises:
and controlling two hands of the game character to point to the first mapping position and the second mapping position respectively.
16. The method of claim 15, wherein said controlling the game character to perform the virtual action according to the mapped position comprises:
controlling a distance between the two hands of the game character according to a change in the distance between the first mapped position and the second mapped position.
17. The method of claim 16, further comprising:
displaying a target element in a menu interface between two hands of the game character;
and adjusting the size of the target element according to the distance between the two hands.
18. The method of claim 5, further comprising:
and responding to the end of the second touch operation, and controlling the game character to restore the preset state.
19. An apparatus for controlling a game character, applied to an electronic device, through which a graphical user interface is provided, the graphical user interface including at least one menu control, the apparatus comprising:
the display module is used for responding to a first touch operation acting on the menu control in the graphical user interface and displaying the menu interface corresponding to the menu control and a game character;
and the control module is used for responding to a second touch operation acting on the menu interface and controlling the game character to execute a virtual action corresponding to the second touch operation.
20. A game character control apparatus, comprising:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any of claims 1-11.
21. A computer-readable storage medium, having stored thereon a computer program,
the computer program is executed by a processor to implement the method according to any of claims 1-11.
CN202010181539.7A 2020-03-16 2020-03-16 Game role control method, device, equipment and computer readable storage medium Active CN111389003B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010181539.7A CN111389003B (en) 2020-03-16 2020-03-16 Game role control method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111389003A true CN111389003A (en) 2020-07-10
CN111389003B CN111389003B (en) 2023-04-18

Family

ID=71416323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010181539.7A Active CN111389003B (en) 2020-03-16 2020-03-16 Game role control method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111389003B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112619147A (en) * 2021-01-04 2021-04-09 网易(杭州)网络有限公司 Game equipment replacing method and device and terminal device
CN113198177A (en) * 2021-05-10 2021-08-03 网易(杭州)网络有限公司 Game control display method and device, computer storage medium and electronic equipment
CN113648649A (en) * 2021-08-23 2021-11-16 网易(杭州)网络有限公司 Game interface control method and device, computer readable medium and terminal equipment
WO2022247318A1 (en) * 2021-05-26 2022-12-01 网易(杭州)网络有限公司 Game interface display method and apparatus, and device and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150065241A1 (en) * 2013-08-27 2015-03-05 Zynga Inc. Gaming systems and methods for facilitating item grouping and group actions
CN108776544A (en) * 2018-06-04 2018-11-09 网易(杭州)网络有限公司 Exchange method and device, storage medium, electronic equipment in augmented reality
CN110038296A (en) * 2019-03-05 2019-07-23 努比亚技术有限公司 A kind of game control method, terminal and computer readable storage medium
CN110841291A (en) * 2019-11-19 2020-02-28 网易(杭州)网络有限公司 Method and device for interacting shortcut messages in game and electronic equipment


Also Published As

Publication number Publication date
CN111389003B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN111389003B (en) Game role control method, device, equipment and computer readable storage medium
JP7377328B2 (en) Systems and methods for controlling technological processes
US11112856B2 (en) Transition between virtual and augmented reality
EP2854973B1 (en) Graphical user interface for a gaming system
WO2019057150A1 (en) Information exchange method and apparatus, storage medium and electronic apparatus
US20100164946A1 (en) Method and Apparatus for Enhancing Control of an Avatar in a Three Dimensional Computer-Generated Virtual Environment
US11266904B2 (en) Game system, game control device, and information storage medium
CN112684970B (en) Adaptive display method and device of virtual scene, electronic equipment and storage medium
WO2019166005A1 (en) Smart terminal, sensing control method therefor, and apparatus having storage function
CN110404257B (en) Formation control method and device, computer equipment and storage medium
US20230405452A1 (en) Method for controlling game display, non-transitory computer-readable storage medium and electronic device
CN113440848B (en) In-game information marking method and device and electronic device
CN112180841A (en) Man-machine interaction method, device, equipment and storage medium
JP5767371B1 (en) Game program for controlling display of objects placed on a virtual space plane
CN113318430A (en) Virtual character posture adjusting method and device, processor and electronic device
CN113926187A (en) Object control method and device in virtual scene and terminal equipment
CN113440835A (en) Control method and device of virtual unit, processor and electronic device
KR101962464B1 (en) Gesture recognition apparatus for functional control
KR20120062053A (en) Touch screen control how the character of the virtual pet
US11531448B1 (en) Hand control interfaces and methods in virtual reality environments
CN113986079B (en) Virtual button setting method and device, storage medium and electronic equipment
WO2023002907A1 (en) Information processing system, program, and information processing method
CN118012265A (en) Man-machine interaction method, device, equipment and medium
JP2022107152A (en) program
CN116027957A (en) Interaction control method and device, wearable device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant