CN113694530A - Virtual character movement control method and device, electronic equipment and storage medium


Info

Publication number
CN113694530A
CN113694530A
Authority
CN
China
Prior art keywords
user
cursor
thumbnail map
virtual character
dragging
Prior art date
Legal status
Pending
Application number
CN202111015668.XA
Other languages
Chinese (zh)
Inventor
钟杰
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111015668.XA priority Critical patent/CN113694530A/en
Publication of CN113694530A publication Critical patent/CN113694530A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537: Controlling the output signals using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378: Controlling the output signals using indicators for displaying an additional top view, e.g. radar screens or maps
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/822: Strategy games; Role-playing games
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068: Input arrangements being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/80: Features specially adapted for executing a specific type of game
    • A63F2300/807: Role playing or strategy games

Abstract

The present disclosure provides a virtual character movement control method, a virtual character movement control apparatus, an electronic device, and a computer-readable storage medium, and relates to the field of information technology. The method is applied to a touch terminal presenting a graphical user interface and comprises the following steps: activating a thumbnail map in response to a user selecting one or more first virtual characters and dragging on the graphical user interface by more than a predetermined distance; generating, at a first position on the thumbnail map, a cursor corresponding to the current position of the first virtual characters; in response to the user continuing to drag on the thumbnail map, moving the cursor from the first position along the user's drag trajectory, wherein the cursor's movement distance is reduced by a preset proportion relative to the length of the drag trajectory; and, in response to the user releasing the drag when the cursor has moved to a second position on the thumbnail map, moving the one or more first virtual characters to a first target position corresponding to the second position. The present disclosure can improve the user's immersion.

Description

Virtual character movement control method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of information technology, and in particular to a virtual character movement control method, a virtual character movement control apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of information technology, more and more users meet their personal entertainment needs through terminal devices, such as smartphones, capable of installing and running applications. How to ensure and improve users' immersion during entertainment is a problem that developers continually work to address.
Taking a game running on a touch terminal as an example, a real-time strategy (RTS) game frequently requires moving individual virtual characters to different destinations, and those destinations often lie outside the current view of the game scene. The user must then swipe the touch terminal's screen multiple times to find the destination, so the user cannot remain immersed in one game scene for long and must instead perform frequent, rapid operations to switch scenes repeatedly. This easily causes physiological fatigue and discomfort and greatly reduces the user's immersion.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the disclosed embodiments is to provide a virtual character movement control method, a virtual character movement control apparatus, an electronic device, and a computer-readable storage medium, thereby improving the user's immersion at least to some extent while reducing the user's fatigue.
According to one aspect of the present disclosure, a virtual character movement control method is provided, which is applied to a touch terminal presenting a graphical user interface; the method comprises the following steps:
activating a thumbnail map in response to a user selecting one or more first virtual characters and dragging on the graphical user interface by more than a predetermined distance;
generating, at a first position on the thumbnail map, a cursor corresponding to the current position of the one or more first virtual characters;
in response to the user continuing to drag on the thumbnail map, moving the cursor from the first position along the user's drag trajectory, wherein the cursor's movement distance is reduced by a preset proportion relative to the length of the drag trajectory;
in response to the user releasing the drag when the cursor has moved to a second position on the thumbnail map, moving the one or more first virtual characters to a first target position corresponding to the second position.
In an exemplary embodiment of the present disclosure, the value of the preset proportion is positively correlated with the size of the thumbnail map.
In an exemplary embodiment of the present disclosure, the method further comprises: after the user releases the drag, generating at least one first control on the thumbnail map; and, in response to the user touching one of the first controls, causing the one or more first virtual characters to perform the corresponding action upon reaching the first target position.
In an exemplary embodiment of the present disclosure, the method further comprises: after the user releases the drag, generating a second control on the graphical user interface; and, in response to the user selecting one or more second virtual characters and touching the second control, moving the one or more second virtual characters to the first target position.
In an exemplary embodiment of the present disclosure, the method further comprises: after the user releases the drag, generating a third control on the graphical user interface; in response to the user touching the third control, activating the thumbnail map and displaying the cursor on the thumbnail map; and, in response to the user dragging the cursor and releasing the drag at a third position on the thumbnail map, moving the one or more first virtual characters to a second target position corresponding to the third position.
In an exemplary embodiment of the present disclosure, moving the one or more first virtual characters to the first target position corresponding to the second position comprises: causing the one or more first virtual characters to move to the first target position automatically; or causing the one or more first virtual characters to move to the first target position along the drag trajectory.
In an exemplary embodiment of the present disclosure, the method further comprises: adjusting the transparency of the thumbnail map in response to a touch command from the user.
According to an aspect of the present disclosure, there is provided a virtual character movement control apparatus applied to a touch terminal presenting a graphical user interface, including:
a map activation module configured to activate a thumbnail map in response to a user selecting one or more first virtual characters and dragging on the graphical user interface by more than a predetermined distance;
a processing module configured to generate, at a first position on the thumbnail map, a cursor corresponding to the current position of the one or more first virtual characters, and, in response to the user continuing to drag on the thumbnail map, to move the cursor from the first position along the user's drag trajectory, wherein the cursor's movement distance is reduced by a preset proportion relative to the length of the drag trajectory;
and a movement control module configured to move the one or more first virtual characters to a first target position corresponding to a second position on the thumbnail map, in response to the user releasing the drag when the cursor has moved to that second position.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
in the virtual character movement control method provided by this example embodiment, on one hand, by selecting virtual characters and dragging to activate the thumbnail map, the user can conveniently and quickly set the characters' movement destination on the thumbnail map without leaving the current game scene to search for it; the user can therefore enjoy a long immersive experience within one game scene, improving immersion. On the other hand, because the cursor's movement distance is reduced in proportion to the length of the drag trajectory, the precision of positioning the movement destination is improved, avoiding the possibility that a positioning deviation forces the user to switch game scenes to find the destination, and thus preserving the user's immersive experience of the game. On yet another hand, frequent, rapid operations for repeatedly switching game scenes are avoided, so the physiological fatigue and discomfort caused by those operations and by scene flicker are reduced at least to some extent, improving the user's experience and immersion.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a view schematically showing an interface of a virtual character movement control method according to the related art;
fig. 2 schematically shows a flowchart of a virtual character movement control method according to one embodiment of the present disclosure;
FIG. 3 schematically illustrates an interface diagram for activating a thumbnail map according to one embodiment of the present disclosure;
FIG. 4 schematically illustrates an interface diagram of a process for movement of a cursor over a thumbnail map according to one embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram of a first control, according to one embodiment of the present disclosure;
FIG. 6 illustrates a schematic diagram of a second control and a third control, according to one embodiment of the present disclosure;
FIG. 7 shows a schematic diagram of a drag trajectory according to one embodiment of the present disclosure;
fig. 8 schematically shows a block diagram of a virtual character movement control apparatus according to one embodiment of the present disclosure;
FIG. 9 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the related art of virtual character movement control, as shown in fig. 1, in a real-time strategy (RTS) game running on a touch terminal, for example, when friendly virtual characters need to be moved, the user selects one or more friendly virtual characters 101 in the game scene and then, by touch within that same scene, sets a movement destination or selects an enemy virtual character 102 on which an action is to be performed; in response to the user's touch command, the friendly virtual characters 101 move toward the set destination or the enemy virtual character 102. However, when the destination or enemy virtual character 102 lies outside the current game scene, that is, outside its field of view, the user must drag on the screen to move the camera in order to find the desired destination or enemy character. Because such games often involve controlling the movement of large numbers of virtual characters, in many batches, this control scheme forces the user to drag the game scene back and forth frequently and rapidly, sometimes swiping the screen many times to find a destination. The user therefore cannot remain immersed in one game scene for long but must repeatedly switch scenes with frequent, rapid operations; the heavy operational load and the flicker caused by repeated scene switching easily cause physiological fatigue and discomfort and seriously degrade the user's immersion.
The technical solution of the embodiment of the present disclosure is explained in detail below:
the embodiment of the example provides a virtual character movement control method, which is applied to a touch terminal presenting a graphical user interface. Referring to fig. 2, the virtual character movement control method may include the steps of:
step S210, responding to one or more first virtual characters selected by a user and dragging the first virtual characters on a graphical user interface for a distance greater than a preset distance, and activating a thumbnail map;
step S220, generating a cursor corresponding to the current positions of the one or more first virtual characters at a first position on the thumbnail map;
step S230, responding to the continuous dragging of the user on the thumbnail map, and enabling the cursor to move from the first position along a dragging track of the user, wherein the moving distance of the cursor is reduced according to a preset proportion relative to the length of the dragging track;
step 240, in response to the user releasing the dragging when the cursor moves to the second position on the thumbnail map, moving the one or more first virtual characters to the first target position corresponding to the second position.
In the virtual character movement control method provided by this example embodiment, on one hand, by selecting virtual characters and dragging to activate the thumbnail map, the user can conveniently and quickly set the characters' movement destination on the thumbnail map without leaving the current game scene to search for it; the user can therefore enjoy a long immersive experience within one game scene, improving immersion. On the other hand, because the cursor's movement distance is reduced in proportion to the length of the drag trajectory, the precision of positioning the movement destination is improved, avoiding the possibility that a positioning deviation forces the user to switch game scenes to find the destination, and thus preserving the user's immersive experience of the game. On yet another hand, frequent, rapid operations for repeatedly switching game scenes are avoided, so the physiological fatigue and discomfort caused by those operations and by scene flicker are reduced at least to some extent, improving the user's experience and immersion.
Next, the above steps are explained in more detail.
In step S210, in response to the user selecting one or more first virtual characters and dragging more than a predetermined distance on the graphical user interface, the thumbnail map is activated.
In the present exemplary embodiment, as shown in fig. 3, when virtual characters need to be moved, the user may select one or more first virtual characters 301 on the game interface, for example 4 armored infantry 301, and drag them by touch while they are selected; when the touch terminal detects that the user's drag distance L on the graphical user interface exceeds a predetermined distance, the thumbnail map 302 may be activated. A real-time strategy game often displays on its interface a "minimap" that schematically and roughly shows the overall map; the thumbnail map 302 may be, for example, an enlarged map view that clearly shows the details of the overall map and may be rendered on the uppermost game layer. The predetermined distance may be, for example, a length of 200 pixels, and may be set to another length according to the actual situation or the size of the touch terminal's screen, or preset by the user, which is not particularly limited in the present exemplary embodiment.
In one example, when the thumbnail map 302 is activated, the position at which it is displayed may be determined from the position currently pressed by the user's finger. For example, as shown in fig. 3, the user selects 4 armored infantry 301 and drags up and to the right; when the touch terminal detects that the drag distance L exceeds 200 pixels, the thumbnail map 302 is activated, and rendering of the minimap may, for example, be stopped. Upon activation, the touch terminal may detect the coordinates of the point on the touch screen pressed by the user's finger and, based on those coordinates, place the thumbnail map 302 so that the map as a whole sits near the upper-right corner of the game interface with the finger's current position near the map's left edge. Similarly, when the user selects one or more first virtual characters and drags downward by more than 200 pixels, the thumbnail map 302 may be placed so that its lower edge aligns with the lower edge of the game interface and the finger's current position is near the map's upper edge or center. The display position of the thumbnail map may also be adapted to the size of the touch screen or to actual requirements, which is not particularly limited in the present exemplary embodiment.
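As a rough illustration (not part of the patent text), the activation condition of step S210 might be sketched as follows. The function and variable names are hypothetical, and the 200-pixel threshold is taken from the example in the text:

```python
import math

PREDETERMINED_DISTANCE = 200  # pixels, per the example in the text

def drag_distance(start, current):
    """Euclidean length of the drag from touch-down to the current point."""
    return math.hypot(current[0] - start[0], current[1] - start[1])

def should_activate_thumbnail(start, current, selected_characters):
    """Step S210: activate the thumbnail map once at least one first
    virtual character is selected and the drag exceeds the predetermined
    distance."""
    return bool(selected_characters) and \
        drag_distance(start, current) > PREDETERMINED_DISTANCE
```

For example, a drag from (0, 0) to (150, 150) (about 212 pixels) with characters selected would activate the map, while the same drag with nothing selected would not.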
In step S220, a cursor corresponding to the current position of the one or more first virtual characters is generated at a first position on the thumbnail map.
In the present exemplary embodiment, as shown in fig. 3, a cursor 303 may be generated on the thumbnail map 302 at or near the position currently pressed by the user's finger, so that the first position of the cursor 303 on the thumbnail map 302 corresponds to the current position of the one or more first virtual characters 301 in the overall game scene. The cursor 303 indicates, in real time, the current position of the first virtual characters 301 selected by the user, letting the user know their relative position in the overall game scene for subsequent operations. The cursor 303 may be rendered as a mark with a specific geometric shape, such as a circle, triangle, cross, or sword, or with a specific animation effect, such as flashing or rippling, and may further have a pointing portion that indicates in real time the moving direction of a virtual character in motion; this example embodiment is not particularly limited in these respects.
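The correspondence in step S220 between a position in the overall game scene and the cursor's first position on the thumbnail map can be sketched as a simple linear mapping. This is an illustrative assumption, not the patent's specified implementation:

```python
def world_to_thumbnail(world_pos, world_size, thumb_origin, thumb_size):
    """Map a position in the overall game scene to the corresponding
    point on the thumbnail map (step S220).

    world_pos    -- (x, y) of the first virtual characters in the scene
    world_size   -- (width, height) of the overall game scene
    thumb_origin -- top-left corner of the thumbnail map on screen
    thumb_size   -- (width, height) of the thumbnail map on screen
    """
    wx, wy = world_pos
    ww, wh = world_size
    ox, oy = thumb_origin
    tw, th = thumb_size
    # Scale each world coordinate into the thumbnail's coordinate frame.
    return (ox + wx / ww * tw, oy + wy / wh * th)
```

A character at the center-left of a 1000 x 1000 scene would thus appear at the center-left of a 200 x 200 thumbnail.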
In step S230, in response to the user continuing to drag on the thumbnail map, the cursor is moved from the first position along a drag track of the user, wherein a movement distance of the cursor is reduced by a preset proportion relative to a length of the drag track.
In the present example embodiment, after the thumbnail map is activated and the corresponding cursor generated, the user may keep the finger pressed, i.e., not release the drag, and continue dragging within the area of the thumbnail map. After detecting this drag, the touch terminal may move the cursor along the trajectory of the user's finger. As shown in fig. 4, the cursor 403 is generated at an initial first position near the lower-left corner of the thumbnail map 402; upon detecting the finger dragging toward the upper right, the touch terminal may move the cursor 403 in the same direction along the finger's drag trajectory. In the example shown in fig. 4, the length of the finger's drag trajectory is l1 and the distance the cursor 403 correspondingly moves along it is l2, where l1 and l2 may satisfy:
l2=a×l1;
where a is a preset scale factor whose value may, for example, satisfy 0 < a < 1. That is, the cursor 403's movement distance l2 may be reduced by a preset proportion relative to the length l1 of the drag trajectory. Because the thumbnail map 402 usually occupies only part of the graphical user interface rather than all of it, if a finger drag of length l1 across the thumbnail map moved the cursor by the same length l1, the cursor could "flutter" back and forth near the target point, especially when only a short movement is needed, making accurate positioning impossible. Setting the scale factor to a value smaller than 1 effectively suppresses this fluttering, so the user can position the desired target point quickly and accurately, avoiding the possibility that a positioning deviation forces the user to switch game scenes to find the destination, and thus preserving the user's immersion.
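The scaled movement l2 = a x l1 of step S230 can be sketched per input frame as follows; the function name and the choice of a = 0.5 are illustrative, not prescribed by the patent:

```python
SCALE_FACTOR = 0.5  # preset scale factor a, chosen so that 0 < a < 1

def move_cursor(cursor_pos, finger_delta, a=SCALE_FACTOR):
    """Step S230: advance the cursor along the finger's drag, but scaled
    down (l2 = a * l1) to suppress "fluttering" near the target point.

    cursor_pos   -- current (x, y) of the cursor on the thumbnail map
    finger_delta -- (dx, dy) of the finger since the last touch event
    """
    dx, dy = finger_delta
    # Each component of the finger's motion is shrunk by the same factor,
    # so the cursor's path keeps the drag's direction but at a times length.
    return (cursor_pos[0] + a * dx, cursor_pos[1] + a * dy)
```

A 40-pixel finger motion thus moves the cursor only 20 pixels, halving overshoot around the intended target.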
In one example, the value of the preset proportion, i.e., the scale factor a, may be positively correlated with the size of the thumbnail map. For example, when the thumbnail map 402 occupies 1/5 of the game interface, the scale factor a may be set to 0.5; when it occupies 1/3 of the game interface, a may be set to 0.8. In other words, with the size of the game interface fixed, the larger the proportion the thumbnail map 402 occupies, the larger a may be, so that the cursor 403's movement distance l2 comes closer to the drag trajectory's length l1. In this way, the value of the preset proportion can be adjusted flexibly to the thumbnail map's actual in-game size, allowing accurate cursor positioning while also reducing, to some extent, the impression that the cursor lags sluggishly behind the finger, improving the tactile feel and further preserving the user's immersion.
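One way to realize this positive correlation (purely an assumption; the patent only gives the two example pairs 1/5 -> 0.5 and 1/3 -> 0.8) is to interpolate linearly between those pairs:

```python
def scale_factor_for(map_fraction):
    """Pick the scale factor a from the fraction of the game interface
    the thumbnail map occupies, linearly interpolating between the two
    example pairs given in the text: 1/5 -> 0.5 and 1/3 -> 0.8.
    The interpolation (and the clamping outside that range) is an
    illustrative assumption, not part of the patent."""
    lo_f, lo_a = 1 / 5, 0.5
    hi_f, hi_a = 1 / 3, 0.8
    t = (map_fraction - lo_f) / (hi_f - lo_f)
    t = min(max(t, 0.0), 1.0)  # clamp outside the documented range
    return lo_a + t * (hi_a - lo_a)
```

A thumbnail occupying a quarter of the interface would then get a factor between 0.5 and 0.8, preserving the monotonic relationship described above.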
In step S240, in response to the user releasing the drag when the cursor is moved to the second position on the thumbnail map, the one or more first virtual characters are moved to the first target position corresponding to the second position.
In the present exemplary embodiment, as shown in fig. 4, the cursor 403 moves along the dragging track of the finger from a first position near the lower left corner of the thumbnail map 402 to, for example, a second position near the center of the thumbnail map 402. If the touch terminal detects at this moment that the user lifts the finger, i.e., ends and releases the drag, the touch terminal may set the first target position in the overall game scene corresponding to the second position as the moving destination and control the one or more first virtual characters selected by the user to move to the first target position. In this way, the user can quickly set any first target position in the overall game scene as a moving destination without leaving the current game scene or repeatedly dragging and switching game scenes to find the destination, which preserves the user's immersive game experience.
In one example, upon detecting that the user releases the drag, the touch terminal may generate at least one first control on the thumbnail map near the position where the drag is released, i.e., where the user's finger is lifted. In the example shown in fig. 5, three first controls 504 may be generated near the release position; these may be implemented as virtual buttons with action prompt text, for example "attack", "patrol", and "camp" respectively. The user may touch one of the first controls as needed to issue an action instruction to the first virtual characters, so that they perform the corresponding action after completing the movement. In a real-time strategy game, the user often controls a virtual character not merely to move it, but to have it perform a game-related action after moving: for example, attacking an enemy after reaching the moving destination, or patrolling the vicinity of the destination to search for signs of enemy activity. Therefore, in the example shown in fig. 5, after completing the drag to set the moving destination, the user may touch one of the three generated first controls 504, for example "attack", so that the four selected first virtual characters 501, immediately upon reaching the moving destination, automatically attack nearby enemy virtual characters in response to the "attack" instruction issued in advance by the user.
Through the above example, after setting a moving destination for the selected virtual characters by means of the thumbnail map 502, the user can immediately issue the action instruction to be performed after the move. The user therefore need neither monitor the progress of the movement in real time nor switch to the game scene where the virtual characters are located to issue an action instruction after they arrive, which preserves the user's continuous immersive experience in the current game interface and helps further improve the sense of immersion. The number of first controls generated in this example, and the action instructions they correspond to, may be set according to actual needs, and this example embodiment is not particularly limited in this respect.
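The "queue an action, execute on arrival" mechanism above can be sketched as follows (a toy model; the class and function names are assumptions, and real movement and combat logic are elided):

```python
class FirstVirtualCharacter:
    """Toy character that remembers an action to run after reaching its destination."""

    def __init__(self, name):
        self.name = name
        self.destination = None
        self.queued_action = None


def release_drag_with_action(characters, target, action):
    # Releasing the drag sets the destination; touching a first control
    # (e.g. "attack") queues the action each character performs on arrival.
    for c in characters:
        c.destination = target
        c.queued_action = action


def on_arrival(character):
    # Called when the character reaches its destination: consume the queued action.
    action, character.queued_action = character.queued_action, None
    return action
```

A character thus carries its post-move order with it, so the user never has to revisit the scene to issue it.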
In one example, as shown in fig. 6, upon detecting that the user releases the drag, i.e., finishes setting the moving destination of the first virtual characters, the touch terminal may generate a second control 606 on the graphical user interface of the game. The second control 606 may be implemented, for example, as a virtual button carrying instruction prompt text such as "repeat movement". In a real-time strategy game, the user often needs to move different virtual characters to the same destination in batches, for example dispatching troops that arrive at the battlefield one after another to attack an enemy base in turn. Although the control method described above (selecting virtual characters, dragging to activate the thumbnail map, and releasing the drag to set the destination) can quickly set a moving destination for each subsequent batch, it involves a considerable amount of operation, and operation errors may cause the destinations set in successive operations to differ, so that the virtual characters deviate from the user's instructions and the user experience suffers. In this case, the touch terminal may cache the destination of the previous movement performed through the thumbnail map 602 (the first target position) and render a second control 606 associated with that destination. The user may then select one or more second virtual characters 605, in this example three knights 605, and touch the second control 606 to issue the "repeat movement" instruction shown in the prompt text. Upon detecting the "repeat movement" instruction, the touch terminal may control the three knights 605 to move to the destination of the previous movement, i.e., the first target position.
Through the above example, the user's operation load in the game can be further reduced: when movement instructions need to be issued to successive batches of virtual characters, the user need not repeatedly perform the operations that activate the thumbnail map, but can move different virtual characters to the same destination with a single touch, thereby reducing operation fatigue and helping improve the sense of immersion in the game.
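The caching behind the "repeat movement" control can be sketched as follows (a minimal sketch; the controller class and the dict-based character representation are assumptions of this illustration):

```python
class MoveController:
    """Caches the last destination set via the thumbnail map so that a later
    batch of characters can be sent there with one touch ("repeat movement")."""

    def __init__(self):
        self.last_target = None  # cached first target position

    def move_to(self, characters, target):
        self.last_target = target
        for c in characters:
            c["dest"] = target

    def repeat_move(self, characters):
        # Reuse the cached destination for a new batch of second virtual characters.
        if self.last_target is None:
            raise ValueError("no previous destination to repeat")
        self.move_to(characters, self.last_target)
```

The second control simply calls `repeat_move` for whatever characters are currently selected.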
In one example, as shown in fig. 6, after detecting that the user releases the drag, i.e., finishes setting the moving destination of the first virtual characters, the touch terminal may generate a third control 607 on the graphical user interface of the game. The third control 607 may be implemented, for example, as a virtual button carrying a virtual character indication identifier such as "first formation". In a real-time strategy game, because the game state changes rapidly, the user often needs to adjust a movement instruction that has already been issued; or, operating quickly, the user may occasionally mis-operate and fail to set the destination at the desired position. In such a case, suppose the user has finished setting the moving destination for the first virtual characters, which belong to a first batch; the user may then touch the third control 607 according to its indication identifier, for example "first formation" corresponding to the first batch. In response to the touch, the touch terminal may activate the thumbnail map 602 and display a cursor 603 indicating the real-time position of the first virtual characters on the thumbnail map 602; in the example shown in fig. 6, the cursor 603 shows that this real-time position is near the center of the thumbnail map 602. The user may then touch the cursor 603 and drag it to select a new destination for the first virtual characters. In the example shown in fig. 6, the user may drag the cursor 603 from its current position near the center of the thumbnail map 602 to a third position near the right edge of the thumbnail map 602 and lift the finger to release the drag. Upon detecting the release, the touch terminal may set the position in the overall game scene corresponding to the third position as a second target position, replace the first target position of the first virtual characters with the second target position, and immediately control the first virtual characters to move toward the second target position.
Through the above example, the user can conveniently and quickly activate the thumbnail map and adjust the moving destination of a virtual character by dragging the cursor. When the moving destination needs to be adjusted on the fly or a mis-operation has occurred, the user need not drag through the game scene to find the moving virtual character, but can change the destination directly on the thumbnail map. This improves operation efficiency and reduces operation load, thereby reducing operation fatigue and helping improve the sense of immersion in the game.
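The retargeting step can be sketched together with the mapping from a thumbnail-map position to the corresponding position in the overall game scene (the linear mapping and all names here are assumptions of this sketch; the patent does not specify the transform):

```python
def thumb_to_world(pos, map_size, world_size):
    """Map a point on the thumbnail map to the corresponding point in the
    overall game scene, assuming a simple proportional (linear) mapping."""
    return (pos[0] / map_size[0] * world_size[0],
            pos[1] / map_size[1] * world_size[1])


def retarget(characters, third_pos, map_size, world_size):
    # Releasing the drag at the third position replaces each character's
    # current destination with the corresponding second target position.
    new_target = thumb_to_world(third_pos, map_size, world_size)
    for c in characters:
        c["destination"] = new_target
    return new_target
```

Because the destination is overwritten in place, the characters redirect immediately without the user locating them in the scene.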
Further, in the above example, the number of third controls 607 that may be generated can be preset; for example, at most three third controls 607 may exist, and once this maximum is reached, the oldest third control 607 may be removed when a new one is generated. In addition, a third control 607 currently displayed on the graphical user interface may be removed in response to a user operation; for example, the user may touch a third control 607 and drag it toward the outside of the graphical user interface, whereupon the touch terminal may stop rendering that control and remove the map information associated with it from the cache. The maximum number of third controls and the removal operation may be set according to actual conditions and requirements, and this example is not particularly limited in this respect.
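The bounded, oldest-first behavior of the third controls can be sketched with a fixed-length deque (the class and labels are illustrative assumptions):

```python
from collections import deque


class ThirdControlBar:
    """Holds at most `max_controls` third controls; when full, appending a new
    control silently drops the oldest one (deque's maxlen behavior)."""

    def __init__(self, max_controls=3):
        self.controls = deque(maxlen=max_controls)

    def add(self, label, cached_target):
        self.controls.append((label, cached_target))

    def remove(self, label):
        # e.g. triggered by dragging the control off the graphical user interface
        self.controls = deque((c for c in self.controls if c[0] != label),
                              maxlen=self.controls.maxlen)
```

`deque(maxlen=3)` gives the "remove the oldest when a new one is generated" rule for free.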
In one example, as shown in fig. 7, while the user's finger drags over the thumbnail map, a dragging track 708 or 709 of the finger may be displayed in real time on the thumbnail map 702; the dragging track may also be hidden according to actual needs. After the user releases the drag to set the moving destination of the first virtual character 701, the touch terminal may control the first virtual character 701 to move toward that destination in, for example, either of the following two ways.
The first way may be: the moving route of the first virtual character 701 is unrelated to the dragging track 708 of the user's finger. The user may, for example, drag in a straight line on the thumbnail map 702 from a start position near its left edge to an end position near its right edge; after the drag is released, the touch terminal may record only the end position and set the position in the overall game scene corresponding to the end position as the moving destination of the first virtual character 701. The touch terminal may then compute a moving route using an obstacle-avoiding pathfinding algorithm such as A*, and move the first virtual character 701 toward the destination along the shortest route that avoids obstacles in the overall game scene; this route is computed by the touch terminal from the overall game scene and is independent of the dragging track 708 of the user's finger. Besides the A* algorithm, the moving route of the first virtual character 701 may also be computed by other pathfinding algorithms such as an artificial potential field algorithm or an ant colony algorithm, which this example does not particularly limit.
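The first way's route computation can be illustrated with a minimal grid-based A* sketch (the grid representation, 4-connectivity, and function names are assumptions for this illustration; the patent does not specify the scene's data structures):

```python
import heapq


def a_star(grid, start, goal):
    """Shortest 4-connected path on a grid; grid[y][x] == 1 marks an obstacle.
    Returns the path as a list of (x, y) cells, or None if unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), start)]
    came_from, g = {}, {start: 0}
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), nxt))
    return None
```

Only the release position feeds into `goal`; the dragging track plays no role in this first way.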
The second way may be: during the drag, the touch terminal records the dragging track 709 of the user's finger in real time. In one possible implementation, the touch terminal may record the track 709 as a plurality of path nodes 710; fig. 7 shows, by way of example, the dragging track 709 formed by connecting the path nodes 710 to one another. After the user drags from the start position to the end position along the track 709 and releases the drag, the touch terminal further records the end position and sets the corresponding position in the overall game scene as the moving destination of the first virtual character 701. The touch terminal may then control the first virtual character 701 to move along the dragging track 709, i.e., toward the destination by moving from one path node 710 to the next.
In this manner, when the user needs to set the moving route personally, for example to avoid enemy units along the way, the moving route can be hand-drawn on the thumbnail map via the dragging track, so that the virtual character moves along the drawn route; when no particular route is required, the route can be computed by the touch terminal. The moving route of the virtual character can thus be determined according to actual needs, so that the user need not operate frequently within the game scene to set stage-by-stage routes for the virtual character. This further reduces the user's operation load, allows a prolonged immersive experience in the current game scene, reduces fatigue, and enhances the immersive experience.
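The second way's path-node recording can be sketched as follows (a minimal sketch; the distance threshold used to thin the track into discrete path nodes is an assumption of this illustration):

```python
def record_node(path_nodes, point, min_step=5.0):
    """Append the finger's current point as a new path node only once it has
    moved at least `min_step` from the last recorded node."""
    if not path_nodes:
        path_nodes.append(point)
        return
    lx, ly = path_nodes[-1]
    if ((point[0] - lx) ** 2 + (point[1] - ly) ** 2) ** 0.5 >= min_step:
        path_nodes.append(point)


def follow(path_nodes):
    # The character moves from one path node to the next, in recorded order.
    yield from path_nodes
```

Releasing the drag appends the end position, and the character then consumes the waypoints via `follow`.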
In one example, the touch terminal may adjust the transparency of the thumbnail map according to a touch command of the user. Because the thumbnail map occupies a portion of the graphical user interface when activated, part of the game interface may be occluded, which can prevent the user from viewing the game interface normally during fast-paced play. Accordingly, as shown in fig. 7, the touch terminal may provide on the graphical user interface a virtual adjustment slider 711 for the transparency of the thumbnail map 702. The user may touch and drag the slider 711 to select a suitable transparency as needed; in response to the touch command, the touch terminal adjusts the rendering parameters to increase or decrease the transparency of the thumbnail map 702 so that the user can observe the part of the game interface obscured by it.
Through the above example, the user can flexibly adjust the transparency of the thumbnail map as needed, so that the occlusion caused by the thumbnail map affects the user's field of view to at least a reduced extent, providing a good experience.
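The slider-to-transparency mapping can be sketched as a simple clamp-and-interpolate (the alpha range is an assumption of this sketch; the patent only says the rendering parameters are adjusted per the touch command):

```python
def slider_to_alpha(slider_value, min_alpha=0.2, max_alpha=1.0):
    """Map a slider position in [0, 1] to the thumbnail map's rendering alpha,
    keeping a floor so the map never disappears entirely."""
    s = max(0.0, min(1.0, slider_value))
    return min_alpha + s * (max_alpha - min_alpha)
```

The renderer then draws the thumbnail map with this alpha, letting the obscured game interface show through.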
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, in the present exemplary embodiment, a virtual character movement control apparatus is also provided, and the virtual character movement control apparatus may be applied to a touch terminal presenting a graphical user interface. Referring to fig. 8, the virtual character movement control apparatus 800 may include a map activation module 810, a processing module 820, and a movement control module 830, wherein:
the map activation module 810 may be configured to activate a thumbnail map in response to a user selecting one or more first virtual characters and dragging more than a predetermined distance across the graphical user interface;
the processing module 820 may be configured to generate a cursor corresponding to the current location of the one or more first virtual characters at a first location on the thumbnail map; and, in response to the user's continued dragging on the thumbnail map, move the cursor from the first position along the user's dragging track, wherein the moving distance of the cursor is reduced by a preset proportion relative to the length of the dragging track;
the movement control module 830 may be configured to cause the one or more first virtual characters to move to a first target location corresponding to a second location on the thumbnail map in response to the user releasing the drag when the cursor is moved to the second location.
In an exemplary embodiment of the present disclosure, a value of the preset scale is positively correlated with a size of the thumbnail map.
In an exemplary embodiment of the disclosure, the processing module 820 may be further configured to generate at least one first control on the thumbnail map after the user releases the drag; the movement control module 830 may be further configured to cause the one or more first virtual characters to perform corresponding actions upon reaching the first target position in response to the user touching one of the first controls.
In an exemplary embodiment of the disclosure, the processing module 820 may be further configured to generate a second control on the graphical user interface after the user releases the drag; the movement control module 830 may be further configured to cause one or more second virtual characters to move to the first target position in response to the user selecting the one or more second virtual characters and touching the second control.
In an exemplary embodiment of the disclosure, the processing module 820 may be further configured to generate a third control on the graphical user interface after the user releases the drag; the map activation module 810 may be further configured to activate the thumbnail map and display the cursor on the thumbnail map in response to a user touching the third control; the movement control module 830 may be further configured to move the one or more first virtual characters to a second target location corresponding to a third location on the thumbnail map in response to the user dragging the cursor and releasing the dragging at the third location.
In an exemplary embodiment of the present disclosure, the movement control module 830 may be further configured to cause the one or more first virtual characters to move to the first target position in an automatic way-finding manner; or to cause the one or more first virtual characters to move to the first target position according to the dragging track.
In an exemplary embodiment of the present disclosure, the map activation module 810 may be further configured to adjust a transparency of the thumbnail map in response to a touch command of a user.
The details of each module or unit in the virtual character movement control device are already described in detail in the corresponding virtual character movement control method, and therefore, the details are not described herein again.
FIG. 9 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 900 of the electronic device shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 9, the computer system 900 includes a Central Processing Unit (CPU)901 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for system operation are also stored. The CPU 901, ROM 902, and RAM 903 are connected to each other via a bus 904. An input/output (I/O) interface 905 is also connected to bus 904.
The following components are connected to the I/O interface 905: an input portion 906 including a keyboard, a mouse, and the like; an output portion 907 including a display such as a Cathode Ray Tube (CRT) display, a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to the I/O interface 905 as necessary. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 910 as necessary, so that a computer program read out therefrom is mounted into the storage section 908 as necessary.
In particular, the processes described above with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 909, and/or installed from the removable medium 911. The computer program executes various functions defined in the method and apparatus of the present application when executed by the Central Processing Unit (CPU) 901.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments above.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A virtual character movement control method is applied to a touch terminal presenting a graphical user interface, and is characterized by comprising the following steps:
activating a thumbnail map in response to the user selecting one or more first virtual characters and dragging more than a predetermined distance on the graphical user interface;
generating a cursor corresponding to a current location of the one or more first virtual characters at a first location on the thumbnail map;
in response to continued dragging by the user on the thumbnail map, moving the cursor from the first position along a dragging track of the user, wherein the moving distance of the cursor is reduced by a preset proportion relative to the length of the dragging track;
in response to the user releasing the drag when the cursor has moved to a second position on the thumbnail map, moving the one or more first virtual characters to a first target position corresponding to the second position.
2. The virtual character movement control method according to claim 1, wherein the value of the preset scale is positively correlated with the size of the thumbnail map.
3. The virtual character movement control method according to claim 1, characterized in that the method further comprises:
after the user releases the dragging, generating at least one first control on the thumbnail map;
in response to the user touching one of the first controls, causing the one or more first virtual characters to perform a corresponding action upon reaching the first target position.
4. The virtual character movement control method according to claim 1, characterized in that the method further comprises:
after the user releases the drag, generating a second control on the graphical user interface;
in response to the user selecting one or more second virtual characters and touching the second control, moving the one or more second virtual characters to the first target position.
5. The virtual character movement control method according to claim 1, characterized in that the method further comprises:
after the user releases the drag, generating a third control on the graphical user interface;
in response to a user touching and pressing the third control, activating the thumbnail map and displaying the cursor on the thumbnail map;
in response to a user dragging the cursor and releasing the dragging at a third location on the thumbnail map, moving the one or more first avatars to second target locations corresponding to the third location.
6. The virtual character movement control method according to claim 1, wherein the moving the one or more first virtual characters to a first target position corresponding to the second position includes:
causing the one or more first virtual characters to move to the first target position in an automatic way-finding manner; or
causing the one or more first virtual characters to move to the first target position according to the dragging track.
7. The virtual character movement control method according to any one of claims 1 to 6, characterized in that the method further comprises:
in response to a touch command of the user, adjusting the transparency of the thumbnail map.
8. A virtual character movement control device is applied to a touch terminal presenting a graphical user interface, and is characterized by comprising:
a map activation module for activating a thumbnail map in response to a user selecting one or more first virtual characters and dragging more than a predetermined distance on a graphical user interface;
a processing module to generate a cursor corresponding to a current location of the one or more first virtual characters at a first location on the thumbnail map; and responding to the continuous dragging of the user on the thumbnail map, and moving the cursor from the first position along the dragging track of the user, wherein the moving distance of the cursor is reduced according to a preset proportion relative to the length of the dragging track;
and the movement control module is used for responding to the release of the dragging of the user when the cursor moves to the second position on the thumbnail map, so that the one or more first virtual characters move to the first target position corresponding to the second position.
9. An electronic device, comprising:
a memory; and
a processor coupled to the memory, the processor configured to perform the virtual character movement control method of any of claims 1-7 based on instructions stored in the memory.
10. A computer-readable storage medium on which a program is stored, the program, when executed by a processor, implementing the virtual character movement control method according to any one of claims 1 to 7.
CN202111015668.XA 2021-08-31 2021-08-31 Virtual character movement control method and device, electronic equipment and storage medium Pending CN113694530A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111015668.XA CN113694530A (en) 2021-08-31 2021-08-31 Virtual character movement control method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113694530A true CN113694530A (en) 2021-11-26

Family

ID=78658252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111015668.XA Pending CN113694530A (en) 2021-08-31 2021-08-31 Virtual character movement control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113694530A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090247300A1 (en) * 2008-03-27 2009-10-01 Sony Computer Entertainment Inc. Device and method to control game where characters are moved on map
CN109976650A (en) * 2019-01-25 2019-07-05 网易(杭州)网络有限公司 Man-machine interaction method, device and electronic equipment
WO2019153824A1 (en) * 2018-02-09 2019-08-15 腾讯科技(深圳)有限公司 Virtual object control method, device, computer apparatus, and storage medium
CN112535866A (en) * 2020-12-17 2021-03-23 网易(杭州)网络有限公司 Method and device for processing virtual object in game and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114296609A (en) * 2022-03-09 2022-04-08 广州三七极耀网络科技有限公司 Interface processing method and device, electronic equipment and storage medium
CN114296609B (en) * 2022-03-09 2022-05-31 广州三七极耀网络科技有限公司 Interface processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination