CN112619124A - Control method and device for game object movement and electronic equipment

Control method and device for game object movement and electronic equipment

Info

Publication number
CN112619124A
Authority
CN
China
Prior art keywords
virtual object
game
control
dragging operation
map area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011599449.6A
Other languages
Chinese (zh)
Inventor
杨晓城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202011599449.6A
Publication of CN112619124A
Legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 - Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a method, an apparatus and an electronic device for controlling the movement of a game object, relating to the technical field of human-computer interaction. The method includes: controlling a first virtual object to move in the game scene in response to a first drag operation on the movement control; in response to the end of the first drag operation, determining, according to the end position of the first drag operation, whether to control the first virtual object to enter an automatic moving state; and if so, determining a target position of the first virtual object according to the end position and controlling the first virtual object to move automatically to the target position. The invention can effectively reduce the operation burden of the player and thereby improve the game experience.

Description

Control method and device for game object movement and electronic equipment
Technical Field
The invention relates to the technical field of human-computer interaction, and in particular to a method, an apparatus and an electronic device for controlling the movement of a game object.
Background
At present, a virtual movement wheel is usually provided in the game screen, and a virtual object moves only while the movement wheel is being touched; once the touch on the movement wheel stops, the virtual object stops moving as well. A player therefore cannot perform other operations while controlling the movement of the virtual object, which restricts the player's actions, increases the operation burden and degrades the game experience.
Disclosure of Invention
In view of the above, the present invention provides a method, an apparatus and an electronic device for controlling movement of a game object, which can effectively reduce the operation burden of a player and thereby improve the game experience.
In a first aspect, an embodiment of the present invention provides a method for controlling movement of a game object, where a game screen is displayed on a graphical user interface of a touch terminal, the game screen at least partially contains a game scene of the game, the game scene includes at least one virtual object, and the graphical user interface further includes a virtual movement control. The method includes: controlling a first virtual object to move in the game scene in response to a first drag operation on the movement control; in response to the end of the first drag operation, determining, according to the end position of the first drag operation, whether to control the first virtual object to enter an automatic moving state; and if so, determining a target position of the first virtual object according to the end position and controlling the first virtual object to move automatically to the target position.
In one embodiment, the graphical user interface further includes an avatar area of a second virtual object, where the second virtual object is a teammate and/or an enemy of the first virtual object; the step of determining, according to the end position of the first drag operation, whether to control the first virtual object to enter the automatic moving state includes: determining whether the end position of the first drag operation is located in the avatar area; and if so, determining to control the first virtual object to enter the automatic moving state.
In one embodiment, the method further includes: if the touch point moves into the avatar area during the first drag operation, controlling the avatar area to make a designated response.
In one embodiment, the step of determining the target position of the first virtual object according to the end position includes: determining, as the target position of the first virtual object, the position in the game scene of the second virtual object corresponding to the avatar area in which the end position is located.
In one embodiment, the method further includes: if the first virtual object has moved to the target position and the second virtual object is a teammate of the first virtual object, controlling the first virtual object to follow the second virtual object corresponding to the avatar area in which the end position is located.
In one embodiment, the graphical user interface further includes a map area of a first preset size; the step of determining, according to the end position of the first drag operation, whether to control the first virtual object to enter the automatic moving state includes: determining whether the end position of the first drag operation is located in the map area; and if so, determining to control the first virtual object to enter the automatic moving state.
In one embodiment, before the step of determining, according to the end position of the first drag operation, whether to control the first virtual object to enter the automatic moving state, the method further includes: if the touch point moves into the map area of the first preset size during the first drag operation, determining whether the time for which the touch point stays in the map area of the first preset size satisfies a preset time condition; and if so, setting the map area of the first preset size as a map area of a second preset size, where the second preset size is larger than the first preset size.
In one embodiment, the method further includes: when the first drag operation ends, restoring the map area of the second preset size to the map area of the first preset size.
In one embodiment, the step of determining the target position of the first virtual object according to the end position includes: determining, according to the map area, a scene position in the game scene corresponding to the end position, and determining the scene position as the target position of the first virtual object.
In one embodiment, the step of controlling the first virtual object to move automatically to the target position includes: calculating, according to the current position and the target position of the first virtual object, a movement path along which the first virtual object moves automatically; and controlling the first virtual object to move automatically to the target position along the movement path.
In one embodiment, the method further includes: displaying the movement path through the map area; and/or displaying, through the map area, a thumbnail path corresponding to the movement path.
In one embodiment, the method further includes: controlling the first virtual object to exit the automatic moving state in response to a second drag operation on the movement control.
In a second aspect, an embodiment of the present invention further provides a device for controlling movement of a game object, where a game screen is displayed on a graphical user interface of a touch terminal, the game screen at least partially contains a game scene of the game, the game scene includes at least one virtual object, and the graphical user interface further includes a virtual movement control. The device includes: a first moving module, configured to control a first virtual object to move in the game scene in response to a first drag operation on the movement control; a determining module, configured to determine, in response to the end of the first drag operation and according to the end position when the first drag operation ends, whether to control the first virtual object to enter an automatic moving state; and a second moving module, configured to, when the determination result of the determining module is yes, determine the target position of the first virtual object according to the end position and control the first virtual object to move automatically to the target position.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a processor and a memory; the memory stores a computer program which, when executed by the processor, performs the method of any one of the implementations provided in the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer storage medium for storing computer software instructions for use in any one of the methods provided in the first aspect.
According to the method, the device and the electronic device for controlling the movement of a game object provided by the embodiments of the present invention, the first virtual object can be controlled to move in the game scene in response to the first drag operation on the movement control, and when the first drag operation ends, if it is determined according to the end position that the first virtual object should be controlled to enter the automatic moving state, the target position of the first virtual object is determined according to the end position and the first virtual object is controlled to move automatically to the target position. In this way, whether the first virtual object enters the automatic moving state is decided from the position at which the first drag ends, so that the first virtual object can move to the target position automatically without the player continuously operating the movement control; this effectively reduces the operation burden of the player and thereby improves the game experience.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic flow chart of a method for controlling movement of a game object according to an embodiment of the present invention;
FIG. 2 is a diagram of a game screen according to an embodiment of the present invention;
FIG. 3 is a diagram of another game screen according to an embodiment of the present invention;
FIG. 4 is a diagram of another game screen according to an embodiment of the present invention;
FIG. 5 is a diagram of another game screen according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a control device for controlling movement of a game object according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the embodiments, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, the movement function can only be realized by continuously touching the virtual movement wheel in the game screen: if the finger leaves the movement wheel, the virtual object stops moving, so the movement of the virtual character is interrupted whenever the finger leaves the movement wheel to perform other operations, which makes the operation rather restrictive. In view of this, the present invention provides a method, an apparatus and an electronic device for controlling the movement of a game object, which can effectively reduce the operation burden of the player and thereby improve the game experience.
To facilitate understanding of the present embodiment, the method for controlling movement of a game object disclosed in the embodiment of the present invention is first described in detail. The method is applied to a touch terminal: a game screen is displayed on a graphical user interface of the touch terminal, the game screen at least partially contains a game scene of the game, the game scene includes at least one virtual object, and the graphical user interface further includes a virtual movement control. Referring to the flow diagram of the method for controlling movement of a game object shown in Fig. 1, the method mainly includes the following steps S102 to S106:
Step S102: controlling the first virtual object to move in the game scene in response to the first drag operation on the movement control. The first drag operation can be understood as a sliding operation performed by the player on the movement control, and the first virtual object is the game character operated by the player. In one embodiment, the movement control is in an initial state when it is not being operated; when the player drags the movement control with a finger, the movement control moves with the finger, and at the same time the first virtual object moves in the game scene according to the drag direction of the movement control. It should be noted that the movement control may follow the touch point of the first drag operation to any position in the game screen, or may follow the touch point only within a specific region of the game screen (for example, the region delimited by the dashed circle around the movement wheel in Figs. 2 to 5), which is not limited in this embodiment of the invention.
Step S104: in response to the end of the first drag operation, determining, according to the end position of the first drag operation, whether to control the first virtual object to enter an automatic moving state. The end position of the first drag operation is the position of the player's finger touch point in the game screen at the moment the player stops the first drag operation, and the automatic moving state can be understood as a state in which the first virtual object moves by itself without the player manipulating the movement control. In practical applications, the first drag operation can be considered to have ended when the finger performing it leaves the touch display screen of the touch terminal. In one embodiment, it may be determined whether the end position is located in a designated area; if the end position is located in the designated area, it is determined to control the first virtual object to enter the automatic moving state. Optionally, the designated area may include an avatar area of a second virtual object and/or a map area, and when the end position is located in the avatar area or the map area, it is determined to control the first virtual object to enter the automatic moving state.
Step S106: if so, determining the target position of the first virtual object according to the end position, and controlling the first virtual object to move automatically to the target position. In one embodiment, if the end position of the first drag operation lies in the avatar area of a certain second virtual object, the position of that second virtual object in the game scene may be determined as the target position; if the end position of the first drag operation lies at a point M of the map area, the point N in the game scene corresponding to point M can be determined, and point N is the target position.
According to the above method for controlling movement of a game object, whether to control the first virtual object to enter the automatic moving state is determined from the end position of the movement control when the first drag ends, so that the first virtual object can move to the target position automatically without further input from the player.
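Purely by way of illustration, and not as the patented implementation itself, the following Python sketch shows one way steps S102 to S106 could fit together: manual movement while the movement control is dragged, followed by a check of the release position against the avatar areas and the map area to decide whether to enter the automatic moving state. All class, field and parameter names are assumptions introduced for this example.

```python
import math
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned screen rectangle used for the avatar areas and the map area."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, point):
        px, py = point
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class MovementControl:
    """Sketch of steps S102 to S106: manual movement while dragging, and the
    decision on release whether to enter the automatic moving state."""

    def __init__(self, wheel_center, move_speed, avatar_areas, map_area, map_to_scene):
        self.wheel_center = wheel_center    # screen position of the movement wheel
        self.move_speed = move_speed        # scene units per second
        self.avatar_areas = avatar_areas    # list of (Rect, second virtual object) pairs
        self.map_area = map_area            # Rect of the map area
        self.map_to_scene = map_to_scene    # callable mapping a map point to a scene position
        self.auto_moving = False
        self.target_position = None

    def on_drag(self, touch_point, first_object, dt):
        """Step S102: move the first virtual object along the drag direction."""
        dx = touch_point[0] - self.wheel_center[0]
        dy = touch_point[1] - self.wheel_center[1]
        length = math.hypot(dx, dy)
        if length < 1e-6:
            return
        first_object.x += dx / length * self.move_speed * dt
        first_object.y += dy / length * self.move_speed * dt

    def on_drag_end(self, end_position):
        """Steps S104 and S106: if the end position lies in a designated area,
        enter the automatic moving state and derive the target position from it."""
        for area, second_object in self.avatar_areas:
            if area.contains(end_position):
                self.auto_moving = True
                self.target_position = (second_object.x, second_object.y)
                return
        if self.map_area.contains(end_position):
            self.auto_moving = True
            self.target_position = self.map_to_scene(end_position)
```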
To facilitate understanding of the method for controlling movement of a game object provided in the foregoing embodiment, two example implementations are given below, referred to as manner one and manner two.
Manner one: the method for controlling movement of a game object is implemented based on the avatar area of a second virtual object included in the graphical user interface, where the second virtual object is a teammate and/or an enemy of the first virtual object. On this basis, an embodiment of the present invention provides an implementation of determining, according to the end position when the first drag operation ends, whether to control the first virtual object to enter the automatic moving state: it is determined whether the end position of the first drag operation is located in the avatar area, and if so, it is determined to control the first virtual object to enter the automatic moving state. For example, if the second virtual objects are teammates of the first virtual object and there are several of them, the avatar area of each second virtual object is displayed on the graphical user interface, and when the end position falls in any of these avatar areas, it may be determined to control the first virtual object to enter the automatic moving state.
In an optional embodiment, if the touch point moves into an avatar area during the first drag operation, that avatar area is controlled to make a designated response. The designated response may be an enlargement response or a highlight response, which helps the player decide whether to end the first drag operation in that avatar area. For example, suppose the graphical user interface displays the avatar areas a1, a2 and a3 of the second virtual objects A1, A2 and A3 respectively. If the player has not yet finished the first drag operation and the player's finger moves into the avatar area a1 of the second virtual object A1, the avatar area a1 is enlarged in response while the remaining avatar areas stay in the initial state; when the finger then moves from the avatar area a1 into the avatar area a2 of the second virtual object A2, the avatar area a2 is enlarged in response and the remaining avatar areas stay in the initial state.
In addition, an embodiment of the present invention further provides an implementation of determining the target position of the first virtual object according to the end position: the position in the game scene of the second virtual object corresponding to the avatar area in which the end position is located is determined as the target position of the first virtual object. Suppose the player's finger ends the first drag operation while it is located in the avatar area a2, i.e. the end position lies in the avatar area a2; the avatar area a2 then returns to the initial state (alternatively, to indicate that the first virtual object is currently moving automatically towards the second virtual object A2, the avatar area a2 may be set to a state different from the initial state), and the position of the second virtual object A2 in the game scene is determined as the target position.
To further reduce the operation burden of the player, if the first virtual object has moved to the target position and the second virtual object is a teammate of the first virtual object, the first virtual object is controlled to follow the second virtual object corresponding to the avatar area in which the end position is located. For example, after the first virtual object has automatically moved to the position of the second virtual object A2 in the game scene, if that second virtual object keeps moving in the game scene, the first virtual object follows its movement path; the player still does not need to manipulate the movement control during this following process and can perform other operations instead.
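Purely as an illustration of this following behaviour (the object fields and the follow distance are assumptions, not part of the embodiment), the first virtual object can simply keep steering toward the teammate's current position each frame:

```python
def follow(first_object, teammate, move_speed, dt, follow_distance=2.0):
    """Keep the first virtual object trailing the teammate it auto-moved to,
    without any further input on the movement control."""
    dx = teammate.x - first_object.x
    dy = teammate.y - first_object.y
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= follow_distance:
        return                                    # already close enough
    step = min(move_speed * dt, dist - follow_distance)
    first_object.x += dx / dist * step
    first_object.y += dy / dist * step
```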
Manner two: the method for controlling movement of a game object is implemented based on a map area of a first preset size in the graphical user interface. The map area displays the game scene in thumbnail form and can also show, through avatars, the position of each virtual character in the game scene. On this basis, another embodiment of the present invention provides another implementation of determining, according to the end position when the first drag operation ends, whether to control the first virtual object to enter the automatic moving state, as shown in the following steps 1 to 3:
Step 1: if the touch point moves into the map area of the first preset size during the first drag operation, determine whether the time for which the touch point stays in the map area of the first preset size satisfies a preset time condition. If so, perform step 2; if not, keep the map area at the first preset size. The holding time can be understood as the time elapsed since the touch point of the first drag operation moved into the map area of the first preset size; timing stops when the player ends the first drag operation in that map area, and the elapsed time is the holding time. For example, if the preset time condition is 800 ms and the touch point of the first drag operation stays in the map area of the first preset size for 800 ms, the preset time condition is determined to be satisfied.
Step 2: set the map area of the first preset size as a map area of a second preset size, where the second preset size is larger than the first preset size. In an optional implementation, a map area of the first preset size may be displayed in the upper left corner of the graphical user interface to give a rough view of the game scene; when the time for which the touch point stays in the map area of the first preset size satisfies the preset time condition, the map area is expanded to the larger second preset size, for example covering the left half of the graphical user interface, so as to display the game scene in more detail. This makes it easier for the player to determine where to release the movement control and enables more accurate automatic movement control. Obviously, the map area may also be placed at other positions in the game screen and is not limited to the layout shown in the drawings of this specification.
Step 3: determine whether the end position of the first drag operation is located in the map area of the second preset size. If so, determine to control the first virtual object to enter the automatic moving state; if not, control the first virtual object to move according to the drag direction of the first drag operation.
Considering that the map area of the second preset size blocks part of the player's view, the map area of the second preset size is restored to the map area of the first preset size when the first drag operation ends. Specifically, when the player ends the first drag operation in the map area of the second preset size, the size of the map area is restored from the second preset size to the first preset size at that moment, so that more of the game scene can be displayed and the player's view is not obstructed by an oversized map area.
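The following sketch illustrates steps 1 to 3 and the restore behaviour described above. It is a minimal example under assumed names and rectangles, with the 800 ms threshold taken from the text; it is not the embodiment's actual implementation.

```python
import time

HOLD_THRESHOLD_S = 0.8                            # the 800 ms example threshold from the text


def _inside(point, rect):
    px, py = point
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h


class MiniMapController:
    def __init__(self, small_rect, large_rect):
        self.small_rect = small_rect              # map area of the first preset size (x, y, w, h)
        self.large_rect = large_rect              # map area of the second preset size
        self.expanded = False
        self.entered_at = None

    def current_rect(self):
        return self.large_rect if self.expanded else self.small_rect

    def update(self, touch_point):
        """Steps 1 and 2: called every frame during the first drag operation."""
        if not self.expanded and _inside(touch_point, self.small_rect):
            if self.entered_at is None:
                self.entered_at = time.monotonic()
            if time.monotonic() - self.entered_at >= HOLD_THRESHOLD_S:
                self.expanded = True              # enlarge to the second preset size
        elif not self.expanded:
            self.entered_at = None                # touch point left the small map: reset timer

    def on_drag_end(self, end_position):
        """Step 3 plus the restore behaviour: report whether the release point lies in
        the (possibly enlarged) map area, then shrink back to the first preset size."""
        enter_auto_move = _inside(end_position, self.current_rect())
        self.expanded = False
        self.entered_at = None
        return enter_auto_move
```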
In addition, an embodiment of the present invention further provides an implementation of determining the target position of the first virtual object according to the end position. Optionally, the scene position in the game scene corresponding to the end position may be determined according to the map area of the second preset size, and that scene position is determined as the target position of the first virtual object. For example, in one embodiment, each point in the map area corresponds one-to-one to a point in the game scene; if the end position of the first drag operation is a point M in the map area, the point N in the game scene corresponding to point M can be determined, and point N is the target position.
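As a hedged example of this one-to-one correspondence (the rectangles and bounds below are illustrative assumptions, not values from the embodiment), the point M in the map area can be converted to the scene point N by normalising its coordinates within the map rectangle:

```python
def map_point_to_scene(end_position, map_rect, scene_bounds):
    """Convert a point M in the map area to the corresponding scene position N.
    map_rect and scene_bounds are (x, y, width, height) tuples."""
    mx, my, mw, mh = map_rect
    sx, sy, sw, sh = scene_bounds
    u = (end_position[0] - mx) / mw               # 0..1 across the map horizontally
    v = (end_position[1] - my) / mh               # 0..1 across the map vertically
    return (sx + u * sw, sy + v * sh)             # target position N in the game scene


# Example: a 200 x 200 px map area mapped onto a 100 x 100 unit game scene.
target = map_point_to_scene((150, 50), (0, 0, 200, 200), (0, 0, 100, 100))  # -> (75.0, 25.0)
```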
When the step of controlling the first virtual object to move automatically to the target position is performed, a movement path along which the first virtual object moves automatically may be calculated according to the current position and the target position of the first virtual object, and the first virtual object is then controlled to move automatically to the target position along the movement path. In one embodiment, in order to minimise the time the first virtual object needs to reach the target position, a path-finding algorithm may be used to calculate the shortest path from the current position to the target position, and the shortest path is determined as the movement path of the automatic movement. Of course, besides the shortest path, the movement path may also be any path that satisfies other preset conditions.
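The embodiment does not prescribe a particular path-finding algorithm; purely as an example, a breadth-first search over a walkability grid yields a shortest path between the current position and the target position. The grid representation and function names below are assumptions for the sketch.

```python
from collections import deque


def shortest_path(grid, start, goal):
    """Breadth-first search over a walkability grid (0 = free, 1 = blocked).
    Returns the list of cells from start to goal, or None if goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:                # walk the predecessor chain back to start
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None
```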
To let the player know the movement path of the first virtual object, the movement path may be displayed through the map area, and/or a thumbnail path corresponding to the movement path may be displayed through the map area. The thumbnail path may be a line segment (for example a dashed line of a specified colour) connecting the current position and the target position of the first virtual object. Further, a dashed line of a specified colour may connect the avatar of the first virtual object shown in the map area with the avatar of the second virtual object corresponding to the avatar area in which the end position is located, and the display of this dashed line may be updated as the first virtual object and/or the second virtual object moves.
In one embodiment, if the first virtual object needs to exit the automatic moving state, the first virtual object may be controlled to exit the automatic moving state in response to a second drag operation on the movement control. For example, when the player taps and/or drags the movement control again, the first virtual object exits the automatic moving state.
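As a minimal sketch of exiting the automatic moving state (the state names are assumptions introduced for the example), any new tap or drag on the movement control can simply clear the automatic-movement flag:

```python
class MovementState:
    """Tracks whether the first virtual object is in the automatic moving state."""

    def __init__(self):
        self.auto_moving = False
        self.target_position = None

    def enter_auto_move(self, target_position):
        self.auto_moving = True
        self.target_position = target_position

    def on_move_control_touched(self):
        """A second drag operation (or tap) on the movement control cancels auto-movement."""
        if self.auto_moving:
            self.auto_moving = False
            self.target_position = None
```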
To facilitate understanding of the method for controlling movement of a game object provided in the foregoing embodiment, an application example is given here using a MOBA (Multiplayer Online Battle Arena) game. Referring first to the schematic diagram of a game screen shown in Fig. 2, the game screen includes, for example, a game scene, a plurality of virtual objects, a movement wheel (i.e. the above-mentioned movement control), avatar areas and a map area, where the virtual objects may include the first virtual object, second virtual objects and NPC (Non-Player Character) virtual objects. The movement control is located at the lower left corner of the game screen, the map area at the upper left corner, and the avatar areas to the right of the map area.
On the basis of Fig. 2, an embodiment of the present invention provides a specific implementation of the method for controlling movement of a game object. Referring to the further schematic diagram of a game screen shown in Fig. 3, when the player drags the movement wheel onto a teammate's avatar area, the avatar area at the corresponding position is enlarged in response; when the player releases the movement wheel there, the first virtual object moves automatically to the position of the teammate corresponding to that avatar area, and at the same time the map area displays a thumbnail path of the automatic movement. In addition, after reaching the teammate's position, the first virtual object follows the teammate. In actual practice, if the player taps and/or drags the movement wheel again, the automatic movement of the first virtual object is cancelled.
Also on the basis of Fig. 2, an embodiment of the present invention provides another specific implementation of the method for controlling movement of a game object. Referring to the further schematic diagram of a game screen shown in Fig. 4, the player first drags the movement wheel onto the map area in Fig. 4 (or, equivalently, the touch point of the drag operation moves onto the map area). It is then determined whether the time for which the movement wheel or the touch point stays in the map area satisfies a preset time condition (such as 800 ms); if so, the map area is expanded (as in the further schematic diagram of a game screen shown in Fig. 5). The finger can be released at any position in the expanded map area; at that moment the map area shrinks back to its original size and position and displays a thumbnail path of the automatic movement, and the player's first virtual object moves automatically to the scene position corresponding to the point where the finger was released. In actual practice, if the player taps and/or drags the movement wheel again, the automatic movement of the first virtual object is cancelled.
In conclusion, the embodiments of the present invention enable quick, intelligent movement, reduce the operation burden of the player, free the player's fingers so that other game operations can be performed conveniently, and improve operation efficiency and game experience.
Corresponding to the method for controlling movement of a game object provided in the foregoing embodiment, an embodiment of the present invention provides a device for controlling movement of a game object. The device is applied to a touch terminal: a game screen is displayed on a graphical user interface of the touch terminal, the game screen at least partially contains a game scene of the game, the game scene includes at least one virtual object, and the graphical user interface further includes a virtual movement control. Referring to the schematic structural diagram of the device for controlling movement of a game object shown in Fig. 6, the device may include the following parts:
a first moving module 602, configured to control a first virtual object to move in the game scene in response to a first drag operation on the movement control;
a determining module 604, configured to determine, in response to the end of the first drag operation and according to the end position when the first drag operation ends, whether to control the first virtual object to enter an automatic moving state;
a second moving module 606, configured to, when the determination result of the determining module is yes, determine the target position of the first virtual object according to the end position and control the first virtual object to move automatically to the target position.
The device for controlling movement of a game object provided in the embodiment of the present invention determines, according to the end position when the first drag ends, whether to control the first virtual object to enter the automatic moving state, so that the first virtual object can move to the target position automatically without the player continuously operating the movement control.
In one embodiment, the graphical user interface further includes an avatar area of a second virtual object, where the second virtual object is a teammate and/or an enemy of the first virtual object; the determining module 604 is further configured to: determine whether the end position of the first drag operation is located in the avatar area; and if so, determine to control the first virtual object to enter the automatic moving state.
In one embodiment, the apparatus further comprises an avatar response module, configured to: control the avatar area to make a designated response if the touch point moves into the avatar area during the first drag operation.
In one embodiment, the second moving module 606 is further configured to: determine, as the target position of the first virtual object, the position in the game scene of the second virtual object corresponding to the avatar area in which the end position is located.
In one embodiment, the apparatus further comprises a following module, configured to: control the first virtual object to follow the second virtual object corresponding to the avatar area in which the end position is located, if the first virtual object has moved to the target position and the second virtual object is a teammate of the first virtual object.
In one embodiment, the graphical user interface further comprises a map area of a first preset size; the apparatus further comprises a region setting module configured to: if the touch point moves to a map area with a first preset size in the first dragging operation process, judging whether the holding time of the touch point in the map area with the first preset size meets a preset time condition; if so, setting the map area with the first preset size as the map area with the second preset size; wherein the second predetermined size is larger than the first predetermined size.
In one embodiment, the determining module 604 is further configured to: judging whether the ending position of the first dragging operation is located in a map area with a second preset size or not; if so, it is determined to control the first virtual object to enter an auto-move state.
In one embodiment, the apparatus further includes a region recovery module configured to: and when the first dragging operation is finished, controlling the map area with the second preset size to recover to the map area with the first preset size.
In one embodiment, the second moving module 606 is further configured to: and determining a scene position in the game scene corresponding to the end position according to the map area, and determining the scene position as a target position of the first virtual object.
In one embodiment, the second moving module 606 is further configured to: calculating a moving path of the first virtual object automatically moving according to the current position and the target position of the first virtual object; and controlling the first virtual object to automatically move to the target position according to the moving path.
In one embodiment, the apparatus further comprises a display module for: displaying a moving path through a map area; and/or displaying a thumbnail path corresponding to the moving path through the map area.
In one embodiment, the apparatus further includes a state exit module, configured to: control the first virtual object to exit the automatic moving state in response to a second drag operation on the movement control.
The device provided in the embodiment of the present invention is implemented on the same principle and achieves the same technical effects as the foregoing method embodiments; for brevity, where the device embodiment is silent, reference may be made to the corresponding content of the method embodiments.
An embodiment of the present invention provides an electronic device, which includes a processor and a storage device; the storage device stores a computer program which, when executed by the processor, performs the method of any one of the embodiments described above.
Fig. 7 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present invention, where the electronic device 100 includes: a processor 70, a memory 71, a bus 72 and a communication interface 73, wherein the processor 70, the communication interface 73 and the memory 71 are connected through the bus 72; the processor 70 is arranged to execute executable modules, such as computer programs, stored in the memory 71.
The memory 71 may include a high-speed Random Access Memory (RAM) and may further include a non-volatile memory, such as at least one disk memory. The communication connection between a network element of the system and at least one other network element is realized through at least one communication interface 73 (which may be wired or wireless), and the Internet, a wide area network, a local area network, a metropolitan area network, or the like may be used.
The bus 72 may be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 7, but this does not indicate only one bus or one type of bus.
The memory 71 is configured to store a program, and the processor 70 executes the program after receiving an execution instruction, and the method executed by the apparatus defined by the flow process disclosed in any of the foregoing embodiments of the present invention may be applied to the processor 70, or implemented by the processor 70.
The processor 70 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 70. The Processor 70 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the device can also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in a memory 71, and the processor 70 reads the information in the memory 71 and completes the steps of the method in combination with the hardware thereof.
The computer program product of the readable storage medium provided in the embodiment of the present invention includes a computer readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the foregoing method embodiment, which is not described herein again.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. A control method for game object movement, applied to a touch terminal, wherein a game picture is displayed on a graphical user interface of the touch terminal, the game picture at least partially contains a game scene of a game, the game scene comprises at least one virtual object, and the graphical user interface further comprises a virtual movement control, the method comprising the following steps:
controlling a first virtual object to move in the game scene in response to a first dragging operation for the movement control;
responding to the end of the first dragging operation, and judging whether to control the first virtual object to enter an automatic moving state according to the end position when the first dragging operation is ended;
if yes, determining the target position of the first virtual object according to the end position, and controlling the first virtual object to automatically move to the target position.
2. The method of claim 1, wherein the graphical user interface further comprises an avatar area of a second virtual object, the second virtual object being a teammate and/or an enemy of the first virtual object;
the step of judging whether to control the first virtual object to enter the automatic moving state according to the end position when the first dragging operation is ended comprises:
judging whether the end position of the first dragging operation is located in the avatar area;
and if so, determining to control the first virtual object to enter an automatic moving state.
3. The method of claim 2, further comprising:
if the touch point moves to the avatar area during the first dragging operation, controlling the avatar area to perform a designated response.
4. The method of claim 2, wherein the step of determining the target position of the first virtual object based on the end position comprises:
determining the position, in the game scene, of the second virtual object corresponding to the avatar area where the end position is located as the target position of the first virtual object.
5. The method of claim 4, further comprising:
if the first virtual object moves to the target position and the second virtual object is a teammate of the first virtual object, controlling the first virtual object to follow the second virtual object corresponding to the avatar area where the end position is located.
6. The method of claim 1, wherein the graphical user interface further comprises a map area of a first preset size;
the step of determining whether to control the first virtual object to enter the automatic moving state according to the end position of the first dragging operation when the first dragging operation ends includes:
judging whether the ending position of the first dragging operation is located in the map area or not;
and if so, determining to control the first virtual object to enter an automatic moving state.
7. The method according to claim 6, wherein before the step of determining whether to control the first virtual object to enter the automatic moving state according to an end position at the end of the first drag operation, the method further comprises:
if the touch point moves to the map area with the first preset size in the first dragging operation process, judging whether the holding time of the touch point in the map area with the first preset size meets a preset time condition;
if so, setting the map area with the first preset size as a map area with a second preset size; wherein the second preset size is larger than the first preset size.
8. The method of claim 7, further comprising:
and when the first dragging operation is finished, controlling the map area with the second preset size to recover to the map area with the first preset size.
9. The method of claim 6, wherein the step of determining the target position of the first virtual object based on the end position comprises:
and determining a scene position in the game scene corresponding to the end position according to the map area, and determining the scene position as a target position of the first virtual object.
10. The method of claim 1, wherein the step of controlling the automatic movement of the first virtual object to the target position comprises:
calculating a moving path of the first virtual object automatically moving according to the current position and the target position of the first virtual object;
and controlling the first virtual object to automatically move to the target position according to the moving path.
11. The method of claim 10, further comprising:
displaying the moving path through a map area; and/or displaying a thumbnail path corresponding to the moving path through a map area.
12. The method of claim 1, further comprising:
in response to a second dragging operation for the movement control, controlling the first virtual object to exit the automatic moving state.
13. A control device for game object movement, applied to a touch terminal, wherein a game picture is displayed on a graphical user interface of the touch terminal, the game picture at least partially contains a game scene of a game, the game scene comprises at least one virtual object, and the graphical user interface further comprises a virtual movement control, the device comprising:
a first moving module, configured to control a first virtual object to move in the game scene in response to a first dragging operation for the movement control;
a determining module, configured to respond to the end of the first dragging operation and judge, according to the end position when the first dragging operation is ended, whether to control the first virtual object to enter an automatic moving state;
and a second moving module, configured to, when the judgment result of the determining module is yes, determine the target position of the first virtual object according to the end position and control the first virtual object to automatically move to the target position.
14. An electronic device comprising a processor and a memory;
the memory has stored thereon a computer program which, when executed by the processor, performs the method of any of claims 1 to 12.
15. A computer storage medium storing computer software instructions for use in the method of any one of claims 1 to 12.
CN202011599449.6A 2020-12-29 2020-12-29 Control method and device for game object movement and electronic equipment Pending CN112619124A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011599449.6A CN112619124A (en) 2020-12-29 2020-12-29 Control method and device for game object movement and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011599449.6A CN112619124A (en) 2020-12-29 2020-12-29 Control method and device for game object movement and electronic equipment

Publications (1)

Publication Number Publication Date
CN112619124A true CN112619124A (en) 2021-04-09

Family

ID=75286647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011599449.6A Pending CN112619124A (en) 2020-12-29 2020-12-29 Control method and device for game object movement and electronic equipment

Country Status (1)

Country Link
CN (1) CN112619124A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113082708A (en) * 2021-04-14 2021-07-09 网易(杭州)网络有限公司 Task guiding method and device in game
CN113426125A (en) * 2021-07-02 2021-09-24 网易(杭州)网络有限公司 Method and device for controlling virtual unit in game, storage medium, and electronic device
CN113730911A (en) * 2021-09-02 2021-12-03 网易(杭州)网络有限公司 Game message processing method and device and electronic terminal
WO2022247196A1 (en) * 2021-05-24 2022-12-01 网易(杭州)网络有限公司 Game positioning method and apparatus, and mobile terminal
WO2024152504A1 (en) * 2023-01-18 2024-07-25 网易(杭州)网络有限公司 Game interaction method and apparatus, and computer device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140066200A1 (en) * 2012-08-31 2014-03-06 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Video game processing apparatus and video game processing program product
CN107803028A (en) * 2017-09-30 2018-03-16 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107899235A (en) * 2017-10-13 2018-04-13 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN109621411A (en) * 2017-09-30 2019-04-16 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN111032170A (en) * 2017-08-15 2020-04-17 多玩国株式会社 Object control system, program, and method in position game
CN111760268A (en) * 2020-07-06 2020-10-13 网易(杭州)网络有限公司 Path finding control method and device in game
CN112057865A (en) * 2020-09-14 2020-12-11 网易(杭州)网络有限公司 Game role control method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN112619124A (en) Control method and device for game object movement and electronic equipment
US10702774B2 (en) Information processing method, apparatus, electronic device and storage medium
US10716997B2 (en) Information processing method and apparatus, electronic device, and storage medium
CN112807686B (en) Game sightseeing method and device and electronic equipment
CN110302530B (en) Virtual unit control method, device, electronic equipment and storage medium
CN110215684B (en) Game object control method and device
CN111840987B (en) Information processing method and device in game and electronic equipment
CN110575671A (en) Method and device for controlling view angle in game and electronic equipment
CN112755516B (en) Interactive control method and device, electronic equipment and storage medium
CN111659107A (en) Game skill release method and device and electronic equipment
CN111957041A (en) Map viewing method in game, terminal, electronic equipment and storage medium
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN112791410A (en) Game control method and device, electronic equipment and storage medium
CN112535866A (en) Method and device for processing virtual object in game and electronic equipment
CN111773671A (en) Method and device for controlling movement of virtual object and terminal equipment
CN113546412B (en) Display control method and device in game and electronic equipment
JP2016154707A (en) Game program with automatic control function
CN113900570A (en) Game control method, device, equipment and storage medium
CN111603757B (en) Processing method, device and equipment for equipment in game
CN111617474B (en) Information processing method and device
JP5977878B1 (en) Program, game control method, and information processing apparatus
CN117046100A (en) Formation control method and device in game and electronic equipment
CN115671735A (en) Object selection method and device in game and electronic equipment
CN109045685B (en) Information processing method, information processing device, electronic equipment and storage medium
US20240216807A1 (en) Positioning method and device for game and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination