CN110652725A - Method and device for controlling aiming direction in game, electronic equipment and storage medium - Google Patents

Method and device for controlling aiming direction in game, electronic equipment and storage medium

Info

Publication number
CN110652725A
Authority
CN
China
Prior art keywords
touch point
sliding operation
sliding
distance
initial position
Prior art date
Legal status
Pending
Application number
CN201910935794.3A
Other languages
Chinese (zh)
Inventor
武文浩
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201910935794.3A priority Critical patent/CN110652725A/en
Publication of CN110652725A publication Critical patent/CN110652725A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/533: Controlling the output signals based on the game progress, involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, for prompting the player, e.g. by displaying a game menu
    • A63F13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/426: Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/5375: Controlling the output signals based on the game progress, involving additional visual information using indicators, for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
    • A63F2300/1075: Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F2300/303: Features of games using an electronically generated display having two or more dimensions, characterized by output arrangements for receiving control signals generated by the game device, for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/308: Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a method and a device for controlling the aiming direction in a game, an electronic device, and a storage medium. In the method, a touch terminal forms a touch point in response to a sliding operation acting on a graphical user interface, and the touch point moves with the sliding operation. When the touch point starts to move, its initial position is obtained; while the touch point moves, the sliding distance of the touch point relative to the initial position is calculated in real time based on that initial position. When the calculated sliding distance is smaller than a first distance, the aiming direction is not adjusted; when the sliding distance is greater than or equal to the first distance, the aiming direction is adjusted according to the sliding operation. This way of adjusting the aiming direction avoids, to a certain extent, misoperation caused by slight finger shake when the user presses on the graphical user interface or lifts the finger after the sliding operation ends.

Description

Method and device for controlling aiming direction in game, electronic equipment and storage medium
Technical Field
The present application relates to the field of game technology, and in particular, to a method and an apparatus for controlling an aiming direction in a game, an electronic device, and a storage medium.
Background
Shooting games are a common game type. Compared with other game types, shooting games are characterized by attack operations performed by the user in order to hit a target.
When playing a shooting game on a PC, the user usually selects the target to be attacked with a mouse; for example, the user can move within the virtual game screen by dragging the mouse, thereby selecting a target. When playing a shooting game on a touch terminal, the user operates the touch screen with a finger to select the target to be attacked; for example, the user presses a point on the touch screen with a finger and drags across the screen, so that the touch terminal controls the virtual game screen to move according to the dragging operation.
Disclosure of Invention
In view of the above, an object of the embodiments of the present application is to provide a method and an apparatus for controlling an aiming direction in a game, an electronic device, and a storage medium.
In some embodiments, the present application provides a method for controlling an aiming direction in a game, applied to a touch terminal, the method including:
in response to a sliding operation acting on a graphical user interface, acquiring an initial position of a touch point of the sliding operation and a sliding distance of the touch point relative to the initial position;
when the sliding distance is smaller than a first distance, keeping the aiming direction unchanged;
when the sliding distance is greater than or equal to the first distance, adjusting the aiming direction according to the sliding operation;
and updating the initial position according to the stop position of the touch point in response to the stop movement of the touch point in the sliding operation.
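The claimed steps amount to a dead-zone controller: slides shorter than the "first distance" are ignored, and the initial position is re-anchored whenever the touch point stops. The following is a minimal illustrative Python sketch, not the patent's implementation; the class name `AimController`, the 10-pixel threshold, and the unit-vector aim representation are all assumptions.

```python
import math

class AimController:
    """Hypothetical sketch of the claimed method: a dead-zone threshold
    (the "first distance") suppresses aim changes caused by tiny,
    unintended touch-point movements."""

    def __init__(self, first_distance=10.0):
        self.first_distance = first_distance  # dead-zone radius in pixels (assumed)
        self.initial_pos = None               # initial position of the touch point
        self.aim_dir = (1.0, 0.0)             # current aiming direction (unit vector)

    def on_touch_down(self, x, y):
        # The position where the touch point forms becomes the initial position.
        self.initial_pos = (x, y)

    def on_touch_move(self, x, y):
        # Sliding distance of the touch point relative to the initial position.
        dx = x - self.initial_pos[0]
        dy = y - self.initial_pos[1]
        dist = math.hypot(dx, dy)
        if dist < self.first_distance:
            return False                       # keep the aiming direction unchanged
        self.aim_dir = (dx / dist, dy / dist)  # adjust aim according to the slide
        return True

    def on_touch_stop(self, x, y):
        # When the touch point stops moving, its stop position becomes
        # the new initial position for subsequent sliding.
        self.initial_pos = (x, y)
```

A 3-pixel jitter leaves the aim untouched, while a deliberate 20-pixel slide adjusts it; re-anchoring on stop is what keeps later lift-off jitter inside the dead zone.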
In some embodiments, determining that the touch point of the sliding operation has stopped moving includes:
determining that the dwell time of the touch point at its stop position exceeds a first threshold.
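One plausible reading of this embodiment is to treat the touch point as "stopped" once it has stayed near the same position for longer than the first threshold. The sketch below is an assumption-laden illustration: the function name, the 1-pixel epsilon, and the 0.2 s threshold are illustrative values not specified by the patent.

```python
def touch_point_stopped(positions, times, stop_threshold_s=0.2, eps_px=1.0):
    """Return True when the touch point has dwelt within eps_px of its
    previous sample for longer than stop_threshold_s seconds.

    positions: list of (x, y) samples; times: matching timestamps in seconds.
    """
    if len(positions) < 2:
        return False
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    moved = abs(x1 - x0) > eps_px or abs(y1 - y0) > eps_px
    dwell = times[-1] - times[-2]
    return (not moved) and dwell > stop_threshold_s
```

In a real input pipeline the samples would come from touch-move events; only the thresholding logic is the point here.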
In some embodiments, the method further comprises:
in response to the end of the sliding operation, determining the aiming direction at the end of the sliding operation.
In some embodiments, the method further comprises:
in response to the end of the sliding operation, performing an attack operation according to the aiming direction at the end of the sliding operation.
In some embodiments, the method further comprises:
in response to a touch operation acting on a skill control, performing an attack operation according to the aiming direction at the time the sliding operation stops.
In some embodiments, the method further comprises:
in response to the sliding operation stopping, magnifying the virtual game picture corresponding to the current aiming direction.
In some embodiments, the step of obtaining the initial position of the touch point of the sliding operation in response to the sliding operation acting on the graphical user interface includes:
in response to the sliding operation acting on the graphical user interface, obtaining the initial position of the touch point of the sliding operation and generating a virtual key at the initial position, where the virtual key is configured to move with the touch point of the sliding operation.
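A minimal sketch of the "virtual key" described above: a widget created at the initial position of the touch point that then follows the touch point as it slides. The class and method names are illustrative, not from the patent.

```python
class VirtualKey:
    """An on-screen widget drawn at the slide's initial position."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def follow(self, touch_x, touch_y):
        # The virtual key is configured to move with the touch point.
        self.x, self.y = touch_x, touch_y

def on_slide_start(touch_x, touch_y):
    """Create the virtual key where the sliding operation begins."""
    return VirtualKey(touch_x, touch_y)
```

In a real game engine the key would also be rendered and destroyed on touch-up; only the create-at-initial-position-then-follow behavior is shown.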
In some embodiments, the present application further provides a device for controlling an aiming direction in a game, applied to a touch terminal, including: an acquisition module, configured to obtain, in response to a sliding operation acting on a graphical user interface, the initial position of the touch point of the sliding operation and the sliding distance of the touch point relative to the initial position;
a maintaining module, configured to keep the aiming direction unchanged when the sliding distance is smaller than a first distance;
an adjusting module, configured to adjust the aiming direction according to the sliding operation when the sliding distance is greater than or equal to the first distance;
and an updating module, configured to update the initial position according to the stop position of the touch point in response to the touch point of the sliding operation stopping movement.
In some embodiments, the present application further provides an electronic device, including: a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device operates, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the above method for controlling an aiming direction in a game.
In some embodiments, the present application further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method for controlling an aiming direction in a game.
The method for controlling the aiming direction in a game provided by the present application is applied to a touch terminal. In a specific implementation, the touch terminal controls the touch point to move in response to a sliding operation acting on the graphical user interface. When the touch point starts to move, its initial position is obtained. While the touch point moves, the sliding distance of the touch point relative to the initial position is calculated in real time based on that initial position. When the calculated sliding distance is smaller than a first distance, the aiming direction is kept unchanged, that is, not adjusted; when the sliding distance is greater than or equal to the first distance, the aiming direction is adjusted according to the sliding operation. In response to the touch point of the sliding operation stopping movement, the initial position is updated according to the stop position of the touch point. Because the initial position is updated after the touch point stops moving, the sliding distance is always measured from the most recent initial position to the current position, and whether the aiming direction is kept unchanged or adjusted is determined by comparing this sliding distance with the first distance. This avoids, to a certain extent, misoperation caused when the user's finger presses on the graphical user interface.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
FIG. 1 shows a schematic view of a graphical user interface of a shooting game without an aim center;
FIG. 2 shows a schematic view of a graphical user interface of a shooting game with an aim center;
FIG. 3 is a flow chart illustrating a method for controlling the aiming direction in a game according to an embodiment of the present application;
FIG. 4 shows a schematic view of a graphical user interface of a shooting game before the aiming direction is adjusted;
FIG. 5 is a schematic diagram of a graphical user interface of a shooting game after adjustment of the aiming direction;
FIG. 6 is a schematic diagram of a graphical user interface of another shooting game after adjustment of the aiming direction;
FIG. 7 is a schematic structural diagram of a control device for aiming at a direction in a game according to an embodiment of the present application;
fig. 8 shows a schematic structural diagram of a server provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
When a user uses a touch terminal to play a shooting game, aiming is usually accomplished by sliding a finger on a touch screen.
It should be noted that a virtual camera is generally arranged in a game scene, and the game scene content captured by the virtual camera is a game picture presented by the graphical user interface.
For example, in a first person 3D game, a virtual camera may be provided at the head of a virtual character, moving following the movement of the virtual character; and the orientation of the virtual camera is rotated following the rotation of the virtual character. In the first-person 3D game, the virtual camera cannot capture the virtual character, or only a part of the virtual character (such as an arm of the virtual character, a hand-held virtual instrument, etc.) can be captured, and the virtual character is not displayed or cannot be completely displayed on the graphical user interface.
As another example, in a third person 3D game, a virtual camera may be disposed above and behind a virtual character, moving following the movement of the virtual character; and the orientation of the virtual camera is rotated following the rotation of the virtual character. In the third person 3D game, a virtual camera can capture game scene content including virtual characters, which are displayed on a graphical user interface.
For another example, in some games without a virtual character, the movement and rotation of the virtual camera may also be directly controlled, so as to update the game picture presented by the graphical user interface.
In one embodiment, aiming is achieved by controlling the virtual camera to rotate: rotating the virtual camera changes the game scene picture presented on the graphical user interface, and the direction the presented picture faces is the aiming direction. FIG. 1 shows a schematic view of a graphical user interface of a shooting game without an aim center (the aim center is used to reflect the aiming direction); FIG. 2 shows a schematic view of a graphical user interface of a shooting game with an aim center. The user can adjust the aiming direction by sliding on the graphical user interface; when the aiming direction is adjusted by a sliding operation, the current game scene containing the virtual character and the virtual tree in FIGS. 1 and 2 moves relative to the graphical user interface. For example, when the user slides to the left on the graphical user interface, the current game scene containing the virtual character and the virtual tree in FIG. 1 moves to the right relative to the graphical user interface, and the sliding operation brings the target to be aimed at (e.g., the virtual character) to a preset position (e.g., the center) of the graphical user interface. In one embodiment, the aim center in FIG. 2 is fixed at the center of the graphical user interface, and the user slides leftwards on the graphical user interface to control the current game scene containing the virtual character and the virtual tree to move rightwards relative to the graphical user interface.
In another embodiment, aiming may be performed by moving the aim center on the graphical user interface: for example, the user slides to the left on the graphical user interface, the game scene is kept still relative to the graphical user interface, and the aim center is controlled to move to the left until it is aimed at the target (e.g., a virtual character).
In practice, whether an aim center is set in the graphical user interface may depend on the particular game type. When the number of elements in the game picture (the part of the game scene displayed on the graphical user interface) is small and the virtual game belongs to a simpler game type, the game may be set without an aim center, and aiming is achieved by sliding until the target to be aimed at is located at a preset position in the graphical user interface. When the number of elements in the virtual game screen is large and the virtual game belongs to a more complicated game type, the game may be set with an aim center, so that the aim center indicates the currently aimed target among the many elements.
Generally, any sliding operation performed by the user on the touch screen (the touch screen of the touch terminal) during the game changes the aiming direction. However, the inventor of the present application found through trials that erroneous operations can occur during sliding. A common way of adjusting the aiming direction by a sliding operation is as follows: the user first presses a point on the touch screen (after the press, the touch terminal generates a touch point reflecting the current position of the user's finger); as the user slides, the touch point moves accordingly, and the aiming direction is adjusted as the touch point moves (the virtual game picture displayed on the touch screen moves, for example leftwards or rightwards); finally, when the finger leaves the touch screen, the virtual character in the game shoots in the final aiming direction (if an aim center is set, the virtual character shoots at the position where the aim center last pointed). In this way, shooting occurs at the moment the user's finger leaves the touch screen. However, when the user lifts the finger after finishing aiming, the touch point may move slightly, so the lift-off itself is recognized by the touch terminal as a sliding operation (the touch point moves). This erroneously displaces the touch point, and thus the aiming direction, just as the finger leaves the screen, and the accuracy of shooting in the erroneously displaced aiming direction cannot be guaranteed.
In view of the above, an embodiment of the present application provides a method for controlling the aiming direction in a game, applied to a touch terminal. The specific control method is shown in FIG. 3 and includes the following steps:
s301, responding to the sliding operation acting on the graphical user interface, and acquiring the initial position of the touch point of the sliding operation and the sliding distance of the touch point relative to the initial position; when the sliding distance is smaller than the first distance, the aiming direction is kept unchanged; when the sliding distance is greater than or equal to the first distance, the aiming direction is adjusted according to the sliding operation;
s302, responding to the stop movement of the touch point of the sliding operation, and updating the initial position according to the stop position of the touch point.
When the user aims at a virtual character in the game on the touch terminal (the user's own virtual character may or may not be displayed in the game picture), the user performs a corresponding sliding operation on the touch screen of the touch terminal to adjust the aiming direction. Specifically, an aiming manipulation area (an operation area dedicated to adjusting the aiming direction, which may be a local area of the graphical user interface or the entire graphical user interface) may be set on the graphical user interface; the user presses any point in the aiming manipulation area with a finger and then slides, thereby performing the sliding operation on the graphical user interface. When the user presses on the graphical user interface, the touch terminal forms a touch point in response, and when the user performs a sliding operation (when the finger slides on the graphical user interface), the touch point moves adaptively with the sliding operation.
In steps S301 to S302 above, when the user performs a sliding operation on the graphical user interface to determine the aiming direction, the touch terminal calculates, in response to the sliding operation, the sliding distance of the touch point of the sliding operation. To do so, the touch terminal first obtains the initial position of the touch point. Specifically, when the touch terminal identifies the touch point for the first time, the current position of the touch point can be used as the initial position; alternatively, when a sliding operation performed by the user on a touch point of the graphical user interface is identified, the starting point of the sliding track of the sliding operation is determined and used as the initial position of the touch point. If the user directly performs the sliding operation after forming the touch point on the touch screen, without performing other operations, either the position where the touch point was first recognized or the starting point of the sliding track can serve as the initial position. If, however, the user presses a touch point with one finger and performs another operation before starting to slide (for example, clicking some area of the touch screen with another finger), that other operation may displace the touch point (for example, move it to a position between the two fingers); in this case, using the starting point of the sliding track as the initial position avoids counting that displacement.
After determining the initial position of the touch point, the sliding distance of the current position of the touch point relative to the initial position can be calculated. Once this sliding distance is determined (it may be determined in real time), it is compared with the first distance. When the sliding distance is smaller than the first distance, the aiming direction is kept unchanged; that is, when the calculated sliding distance of the touch point is sufficiently small, the sliding operation can be judged to be an erroneous operation by the user, and the aiming direction is not changed. Because the sliding track corresponding to an erroneous operation is short, the current aiming direction is kept unchanged whenever the sliding distance is smaller than the first distance. When the sliding distance is greater than or equal to the first distance, the aiming direction is adjusted according to the sliding operation (the sliding track of the touch point); that is, the sliding operation can be judged to be actively (intentionally) performed by the user, and the aiming direction is therefore adjusted accordingly.
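The role of step S302 can be illustrated with a small event-driven simulation: after the touch point stops, re-anchoring the initial position at the stop position means the few-pixel jitter produced when the finger lifts off measures only a short distance against the new initial position and falls inside the dead zone, so it cannot perturb the final aiming direction. All names, event shapes, and thresholds below are illustrative assumptions.

```python
import math

def simulate(events, first_distance=10.0):
    """events: list of ('down' | 'move' | 'stop', x, y) tuples.
    Returns how many times the aiming direction was actually adjusted."""
    initial = None
    adjustments = 0
    for kind, x, y in events:
        if kind == 'down':
            initial = (x, y)                       # S301: record initial position
        elif kind == 'move':
            if math.hypot(x - initial[0], y - initial[1]) >= first_distance:
                adjustments += 1                   # slide reaches the first distance
        elif kind == 'stop':
            initial = (x, y)                       # S302: re-anchor on stop
    return adjustments
```

With the `stop` event, a deliberate 30-pixel slide adjusts the aim once and a later 3-pixel lift-off jitter is ignored; without re-anchoring, the jitter would measure 33 pixels from the original position and wrongly count as a second adjustment.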
Taking a third-person 3D shooting game as an example, a graphical user interface of the shooting game is shown in fig. 4: a moving joystick 301, an aiming manipulation area 302, and an aiming center 305 are arranged on the graphical user interface, and the contents presented by the graphical user interface further include a virtual character 303 and a target to be aimed 304.
In response to a touch operation applied to the moving joystick 301, the virtual character 303 is controlled to move in the game scene, and the game picture currently presented by the graphical user interface is adjusted according to the movement of the virtual character 303.
In response to the sliding operation applied to the aiming manipulation area 302, the orientation of the virtual character 303 in the game scene is controlled, thereby controlling the aiming direction. Specifically, the initial position of the touch point of the sliding operation and the sliding distance of the touch point relative to the initial position are obtained. When the sliding distance of the touch point formed in the aiming manipulation area 302 is smaller than the first distance, the aiming direction is kept unchanged, that is, the relative position of the aiming center 305 and the target to be aimed 304 in fig. 4 is kept unchanged. When the sliding distance of the touch point formed in the aiming manipulation area 302 is greater than or equal to the first distance, the aiming direction is adjusted; at this time, the game scene containing the target to be aimed 304 moves according to the sliding operation, so that the target to be aimed 304 coincides with the aiming center 305 provided at the center of the graphical user interface, as shown in fig. 6.
In other embodiments, the aiming center 305 may instead be moved while the game scene is kept still relative to the graphical user interface, so that the aiming center 305 is moved by the sliding operation to coincide with the target to be aimed 304 in the current game screen, as shown in fig. 5.
Specifically, there are two ways to calculate the sliding distance. In the first way, the touch terminal acquires the position of the moving touch point in real time and calculates the sliding distance of the touch point relative to the initial position in real time. In the second way, the sliding distance of the touch point relative to the initial position is calculated only when the touch point stops moving.
Because their time nodes differ, these two ways of calculating the sliding distance of the touch point relative to the initial position may yield different results. In the first way, the touch terminal acquires the position of the moving touch point in real time and calculates the sliding distance in real time. During the process of moving the touch point, that is, from the moment the touch point starts moving to the moment it stops, the touch terminal calculates at least one sliding distance (how many are calculated depends on the sliding distance, the sliding time and the calculation frequency: the longer the distance, the longer the sliding time and the higher the frequency, the more sliding distances are calculated). After each sliding distance is calculated, it is compared with the first distance, and once a sliding distance is greater than or equal to the first distance, the aiming direction is adjusted according to the sliding operation (usually according to the relative positional relationship between the current position and the initial position of the touch point). That is, in the first way, before a given sliding operation stops, as long as a sliding distance greater than or equal to the first distance is determined, the aiming direction is continuously adjusted until the current sliding operation stops (the touch point no longer moves) or a sliding distance smaller than the first distance occurs.
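The first calculation way can be sketched as a per-sample check over the stream of touch positions. The sampled-position list and the threshold value are illustrative assumptions:

```python
import math

def realtime_adjustments(initial, sampled_positions, first_distance):
    """First calculation mode: the distance check runs on every sampled
    position while the finger is still moving. Returns the positions at
    which the aiming direction would be adjusted."""
    adjusted_at = []
    for pos in sampled_positions:
        if math.hypot(pos[0] - initial[0], pos[1] - initial[1]) >= first_distance:
            adjusted_at.append(pos)  # aim follows the touch point at this sample
    return adjusted_at
```

With a first distance of 10, early samples close to the initial position are ignored and only the later samples drive the aim.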
In the second way, when the sliding operation stops (that is, when the touch point stops moving), the touch terminal responds to the stop of the sliding operation and calculates the sliding distance between the current position of the touch point and the initial position. Specifically, when the user's sliding operation on the graphical user interface stops, the touch terminal takes the current position of the touch point as the stop position of the touch point. After the initial position and the stop position of the touch point are determined, the sliding distance of the touch point in the sliding operation is calculated from these two positions. If the sliding distance is greater than or equal to the first distance, the aiming direction is adjusted according to the sliding operation (or adjusted by a preset adjustment amplitude); the adjustment stops when the sliding operation ends (the user lifts the hand) or when the touch point starts moving again after the stop. In one embodiment of the present application, the sliding distance may be the straight-line distance between the initial position and the stop position.
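The second calculation way reduces to a single check at the moment the touch point stops. A minimal sketch, with illustrative names and a tuple return for the aim offset:

```python
def adjust_on_stop(initial, stop, first_distance):
    """Second calculation mode: one distance check, performed only when
    the touch point stops moving. Returns the (dx, dy) offset to apply
    to the aim, or None when the slide is dismissed as accidental."""
    dx, dy = stop[0] - initial[0], stop[1] - initial[1]
    if (dx * dx + dy * dy) ** 0.5 < first_distance:
        return None
    return (dx, dy)
```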
Comparing the two calculation ways: in the first way the touch terminal must calculate frequently, whether or not the sliding operation has stopped, and adjust the aiming direction accordingly, whereas in the second way a calculation is needed only when the sliding operation stops. The second way therefore saves the resources of the touch terminal. The first way, however, responds in real time and provides a better user experience.
In a specific implementation, the initial position and the stop position of the touch point can be identified by pixel coordinates or by interface coordinates. When pixel coordinates are used, a pixel coordinate system may be established based on the image currently displayed in the graphical user interface; the center point of the image, or any one of the vertices corresponding to the four corners of the image, may be used as the origin of the pixel coordinate system, and the position of the touch point is then determined according to the pixel coordinate system. When interface coordinates are used, an interface coordinate system is established based on the current graphical user interface; similarly, the center point of the graphical user interface, or any one of the vertices corresponding to its four corners, may be used as the origin of the interface coordinate system, and the position of the touch point is then determined according to the interface coordinate system.
Of course, when the position is a pixel coordinate, that is, when a pixel coordinate system is established based on the currently displayed image, the image changes frequently, so calculating the position of the touch point based on the pixel coordinate system requires a large amount of calculation by the touch terminal. When the position is an interface coordinate, that is, when an interface coordinate system is established based on the current graphical user interface, the graphical user interface does not change as long as the user does not replace the touch terminal (a mobile phone, a tablet computer, and the like), so calculating the position of the touch point based on the interface coordinate system requires only a small amount of calculation.
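The mapping between a raw touch position and an interface coordinate system can be sketched as follows. The function name, the choice of flipping the y-axis for the centered case, and the screen dimensions are all illustrative assumptions:

```python
def to_interface_coords(touch_px, screen_w, screen_h, origin="center"):
    """Map a raw touch position (pixels, origin at the top-left corner)
    into an interface coordinate system whose origin is either the
    screen center or that same corner."""
    x, y = touch_px
    if origin == "center":
        # y grows upward in the centered system (an assumed convention)
        return (x - screen_w / 2, screen_h / 2 - y)
    return (float(x), float(y))  # corner origin: pixel coords pass through
```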
In general, it is difficult for the user to adjust the aiming direction to the optimum with a single sliding operation; that is, the user may need several consecutive sliding operations to adjust the aiming direction. When the user performs multiple sliding operations, the previous sliding operation ends once the touch point stops moving (the user's finger stops moving but does not leave the aiming manipulation area of the graphical user interface), and the next sliding operation starts if the touch point moves again. After the current sliding operation stops, that is, after the touch point stops moving, the touch terminal may update the initial position according to the stop position of the touch point: the stop position of the current sliding operation becomes the initial position of the next sliding operation. In other words, in this scheme, a complete sliding operation (one with exactly one press-down on the aiming manipulation area of the graphical user interface and exactly one hand-lift from it) is divided into a plurality of consecutive sliding operations (sub-sliding operations) according to whether the sliding operation pauses (whether the touch point pauses), and the judgment mechanism of step S301 is applied to each of them.
In one embodiment of the present application, the sliding-distance judgment mechanism of step S301 is applied to every sliding operation. This setting mainly takes into account that erroneous operations may occur not only when the user lifts the hand but also when the user's finger presses down on the aiming manipulation area of the graphical user interface. That is, in the solution provided in the present application, no matter which sub-sliding operation within a complete sliding operation is concerned, as long as its sliding distance is too short, that is, smaller than the first distance, the aiming direction is not adjusted according to it. This avoids, to a certain extent, the erroneous operations caused by slight shifts of the finger's pressure center while the finger presses on the aiming manipulation area of the graphical user interface.
It should be noted that the multiple sliding operations mentioned above are the plurality of sub-sliding operations into which one complete sliding operation is divided; here, a complete sliding operation is one that corresponds to exactly one press-down on the aiming manipulation area of the graphical user interface and exactly one hand-lift from it.
Among the multiple sub-sliding operations, two adjacent sub-sliding operations are separated by the touch point stopping its movement. That is, each time the user pauses during the sliding operation (the finger stops moving but is not lifted from the aiming manipulation area of the graphical user interface), the current sub-sliding operation ends, and the movement continued after the pause counts as the next sub-sliding operation. In step S302, after the touch point of a sub-sliding operation stops moving, the sub-sliding operations are separated from one another by updating the initial position: the position where the touch point stops is the starting position (initial position) of the next sub-sliding operation. Updating the initial position according to the stop position of the touch point can thus be understood as the act of dividing one complete sliding operation into multiple consecutive sub-sliding operations.
It should be noted that, in the above case, the initial position is updated according to the stop position of the touch point whenever a sliding operation stops, regardless of whether the aiming direction was adjusted according to that sliding operation.
In the above scheme, the moving-distance judgment mechanism of the touch point and the corresponding aiming-direction adjustment mechanism are applied to each of the multiple sub-sliding operations. As a result, neither the slight displacement of the touch point when the user's finger presses down on the aiming manipulation area of the graphical user interface nor the slight displacement when the finger leaves it causes a change of the aiming direction, which improves the accuracy of the aiming-direction adjustment to a certain extent.
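The sub-slide bookkeeping described above — roll the initial position forward on every stop, whether or not the aim moved — can be sketched as a small tracker. The class and method names are illustrative, not the patent's terminology:

```python
class AimSlideTracker:
    """Splits one complete slide (press down ... lift up) into sub-sliding
    operations: every time the touch point stops, the stop position becomes
    the next sub-slide's initial position, whether or not the aim moved."""

    def __init__(self, first_distance):
        self.first_distance = first_distance
        self.initial = None

    def on_press(self, pos):
        self.initial = pos

    def on_stop(self, pos):
        dx, dy = pos[0] - self.initial[0], pos[1] - self.initial[1]
        deliberate = (dx * dx + dy * dy) ** 0.5 >= self.first_distance
        self.initial = pos  # updated regardless of the outcome above
        return deliberate
```

A stop 10 units away counts as deliberate; a subsequent stop only 1 unit further is filtered out, because the distance is now measured from the updated initial position.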
After the user completes the aiming-direction adjustment, the subsequent behavior falls roughly into the following 3 cases:
the first case is to determine the aiming direction at the end of the swipe operation in response to the end of the swipe operation. Specifically, the user performs a sliding operation on the aiming control area of the graphical user interface to adjust the aiming direction, and at the end of the sliding operation (the user raises his hand), only the current aiming direction is determined (which may be understood as storing or sending the aiming direction to a certain terminal/unit).
In the second case, in response to the end of the sliding operation, an attack operation is executed according to the aiming direction at the end of the sliding operation; specifically, the user performs a sliding operation on the aiming control area of the graphical user interface to adjust the aiming direction, and when the sliding operation is finished (the user lifts his hand), the current aiming direction adjustment is completed, and subsequent operations (such as shooting according to the current aiming direction) are automatically performed.
In the third case, in response to a touch operation applied to a skill control, an attack operation is executed according to the aiming direction (the current aiming direction) at the time the sliding operation stopped. This can be understood as follows: the user performs a sliding operation on the aiming manipulation area of the graphical user interface to adjust the aiming direction, then clicks a virtual key displayed on the graphical user interface or a physical key disposed on the touch terminal with a finger (usually a finger other than the one performing the sliding operation), which issues a control instruction for the skill control (for example, a shooting instruction) to the touch terminal; the touch terminal responds to that instruction by performing the subsequent operation (for example, shooting).
Comparing the above 3 cases: the first case is a mode in which no automatic shooting occurs after the user lifts the hand; the second case can be understood as automatic shooting after the user lifts the hand; the third case can be understood as the user completing the shot by operating another key (a virtual key displayed on the graphical user interface or a physical key arranged on the touch terminal).
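The three end-of-adjustment behaviors can be sketched as a small dispatch; the mode names and the `fire` callback are hypothetical, introduced only for illustration:

```python
def on_aim_finished(mode, aim_direction, fire):
    """Dispatch for the three cases above: 'store' only records the
    direction, 'auto' fires on hand-lift, and 'manual' waits for a
    separate skill-key tap before firing."""
    if mode == "store":
        return ("stored", aim_direction)
    if mode == "auto":
        fire(aim_direction)
        return ("fired", aim_direction)
    return ("awaiting_skill_key", aim_direction)
```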
In a specific implementation, the touch point may be a point or an area adapted to a finger of the user. When the touch point is one point, the method of steps S301 to S302 can be directly executed; when the touch point is an area adapted to the finger of the user, and the initial position, the stop position and the sliding distance of the touch point are calculated, the initial position and the stop position of the central point of the area adapted to the finger of the user can be calculated, the initial position of the central point is used as the initial position of the touch point, and the stop position of the central point is used as the stop position of the touch point; of course, it can also be calculated using any point of the area that is adapted to the user's finger.
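When the touch point is an area rather than a single point, the representative position can be its center, as the paragraph above describes. A minimal sketch, assuming the contact region is available as a list of sampled points:

```python
def touch_area_center(contact_points):
    """Use the centroid of the sampled contact points as the
    representative touch position for a fingertip-sized contact area.
    How the points are sampled from the touch hardware is assumed."""
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```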
In a specific implementation, the size and the position of the aiming manipulation region may be different for different games, for example, for a game scene with general operability (for example, when a game is played, a user needs to use a small number of virtual buttons to operate on a touch screen), the size of the aiming manipulation region may be set to be equal to the area of the touch screen, that is, the user may press any point of the touch screen with a finger to form a touch point, and the formed touch point may control a virtual character in the game to aim, attack, and the like; for a game scene with strong operability (that is, when playing a game, a user needs to operate on the touch screen by using a large number of virtual buttons), the aiming control area can be set to be smaller than the area of the touch screen, so that the system can set other types of virtual buttons on the touch screen except the aiming control area. That is, the touch terminal may determine the size of the aiming manipulation area according to the number of virtual keys displayed on the current graphic user interface and to be displayed on the graphic user interface.
Here, the aiming manipulation area may be set in the lower-right corner area of the touch screen, the lower-left corner area of the touch screen, or the middle area of the touch screen (for example, when there are many other virtual keys on the touch screen, arranged on both of its sides), and its position in the touch screen may be determined according to a setting made by the user. Because users differ in which fingers they habitually use on the touch screen (for example, some users are used to operating with the fingers of the left hand and some with the fingers of the right hand), the position of the aiming manipulation area in the touch screen is preferably determined according to the user's operating habits. For example, if the touch terminal determines from the user's historical operation data on the touch screen that the user performs aiming operations with a finger of the left hand, the aiming manipulation area can be set in the lower-left corner area of the touch screen; if the user is determined to perform aiming operations with a finger of the right hand, the aiming manipulation area can be set in the lower-right corner area of the touch screen. It should be noted that, if the user has downloaded the game and is using it for the first time, the touch terminal may obtain the data generated when the user operated the touch screen in other games, and from it determine whether the user performs aiming operations on the touch screen with a left-hand or a right-hand finger.
That is, the touch terminal may determine the position of the aiming manipulation area in the graphical user interface according to the historical data of the user operating the touch screen, and may also determine the position of the aiming manipulation area in the graphical user interface according to the data generated by the user operating the touch screen while playing other games.
In an embodiment, the aiming manipulation area may be a touch area with a visual indication, which is set on the graphical user interface, for example, the aiming manipulation area may be a touch area rendered with a preset transparency or a preset color, which is convenient for a user to quickly locate the area, reduces the learning cost of a player, and improves the game experience.
In another embodiment, the aiming manipulation area may also be a touch area without a visual indication, which is disposed on the graphical user interface, so as to avoid the occlusion of the game screen.
In one embodiment of the present application, in order to give the user clearer feedback, a virtual key may be added to the implementation of the scheme. That is, in step S301, responding to the sliding operation applied to the graphical user interface and obtaining the initial position of the touch point of the sliding operation may be implemented as follows:
and responding to the sliding operation acted on the graphical user interface, acquiring the initial position of the touch point of the sliding operation, and generating a virtual key at the initial position, wherein the virtual key is configured to move according to the movement of the touch point of the sliding operation.
That is, after the user's finger contacts the graphical user interface (after the touch point is determined), a virtual key is generated; it moves according to the movement of the touch point, typically following the sliding of the user's finger. The virtual key mainly serves as a prompt, informing the user that the current slide has been accepted by the touch terminal. It should be noted that the movement of the virtual key and the movement of the aiming direction are not necessarily related: the virtual key moves whenever the touch point moves, whereas the aiming direction moves only when the sliding distance of the touch point is large enough.
In one embodiment of the present application, in order to further improve the accuracy of the attack or shot, each time the sliding operation stops, the touch terminal may respond to the stop by controlling the virtual game picture corresponding to the current aiming direction to be enlarged, so that the user's current aiming direction is displayed magnified. If the user considers that the current aiming direction is not the desired one, the sliding operation can be performed again, based on the difference between the current and the desired aiming direction, until the two coincide. When the virtual game picture corresponding to the current aiming direction is enlarged, a new picture can be popped up within the graphical user interface, in which that virtual game picture is magnified at least 2 times. The popped-up picture may be displayed in a preset area (for example, the upper-left or upper-right corner of the graphical user interface), or its display area may be determined according to the position of the aiming manipulation area currently pressed by the finger and the position corresponding to the current aiming direction. In other words, by controlling the virtual game picture corresponding to the current aiming direction to be enlarged in response to the stop of the sliding operation, the touch terminal can improve the accuracy of the aiming direction to a certain extent. When the user performs the sliding operation again, the aiming direction may be adjusted directly on the popped-up picture.
In a specific implementation, the target to be aimed may itself move; therefore, if the target moves after the touch point has stopped moving, the user can press the aiming manipulation area again and perform another sliding operation to re-aim at the target.
According to the method for controlling the aiming direction in the game, provided by one embodiment of the application, the aiming direction can be kept unchanged when the moving distance of the touch point is smaller than the first distance; therefore, when shooting operation is carried out, attack failure or shooting miss caused by displacement of the touch point due to misoperation of hand shaking is avoided to a certain extent, and accuracy of attack or shooting is improved.
The method provided by the present application is illustrated below by a specific example:
in this example, the touch terminal is a mobile terminal, such as a mobile phone or a tablet computer, on whose touch screen the user can perform a series of operations with a finger. The graphical user interface displayed on the touch terminal is rectangular; its shape can be set differently for different game scenes, for example as a square, a regular polygon, or a circle.
After the user starts a sliding operation on the aiming manipulation area of the graphical user interface with a finger, the touch terminal checks in real time whether the sliding distance of the sliding operation exceeds the first distance and whether the sliding operation has stopped (the user's finger has not left the aiming manipulation area of the graphical user interface but has stopped moving). If the sliding distance exceeds the first distance before the sliding operation stops, the aiming direction is adjusted according to the sliding operation; if it does not, the current aiming direction is kept unchanged. If the sliding operation is determined to have stopped (the finger is stationary relative to the aiming manipulation area of the graphical user interface but the hand is not lifted), the sliding operation is finished and the calculation for the next sliding operation starts: the touch terminal again checks in real time whether the sliding distance of the next sliding operation exceeds the first distance and whether that operation has stopped, and if the sliding distance exceeds the first distance before the next sliding operation stops, the aiming direction is adjusted according to the next sliding operation.
Based on the same inventive concept, the embodiment of the present application further provides a device for controlling an aiming direction in a game corresponding to the method for controlling an aiming direction in a game, and since the principle of solving the problem of the device in the embodiment of the present application is similar to the method for controlling the aiming direction in the game in the embodiment of the present application, the implementation of the device can refer to the implementation of the method, and repeated details are omitted.
Referring to fig. 7, a device for controlling a pointing direction in a game according to another embodiment of the present application includes:
an obtaining module 701, configured to respond to a sliding operation performed on a graphical user interface, and obtain an initial position of a touch point of the sliding operation and a sliding distance of the touch point with respect to the initial position;
a holding module 702, configured to hold the aiming direction unchanged when the sliding distance is smaller than the first distance;
an adjusting module 703, configured to adjust the aiming direction according to the sliding operation when the sliding distance is greater than the first distance;
and an updating module 704, configured to update the initial position according to the stop position of the touch point in response to the stop movement of the touch point in the sliding operation.
In one embodiment, an update module includes:
and the first updating unit is used for responding that the staying time of the touch point of the sliding operation at the stop position exceeds a first threshold value and updating the initial position according to the stop position of the touch point.
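The dwell-time condition used by the first updating unit can be sketched as a simple stop detector over timestamped touch samples. The sample format, the `eps` position tolerance, and the time unit are all assumptions made for illustration:

```python
def find_stop_position(samples, dwell_threshold, eps=1.0):
    """samples: list of (t, x, y) touch samples in time order. Returns the
    first position where the touch point dwells within `eps` units for
    longer than `dwell_threshold` seconds, or None if it never stops."""
    anchor_t, ax, ay = samples[0]
    for t, x, y in samples[1:]:
        if abs(x - ax) <= eps and abs(y - ay) <= eps:
            if t - anchor_t > dwell_threshold:
                return (ax, ay)  # dwelt long enough: treat as a stop
        else:
            anchor_t, ax, ay = t, x, y  # moved: restart the dwell timer
    return None
```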
In one embodiment, the device for controlling the aiming direction in the game further comprises:
the determining module 705 is configured to determine the aiming direction when the sliding operation ends in response to the sliding operation ending.
In one embodiment, the device for controlling the aiming direction in the game further comprises:
and a first executing module 706, configured to, in response to the end of the sliding operation, execute an attack operation according to the aiming direction at the end of the sliding operation.
In one embodiment, the device for controlling the aiming direction in the game further comprises:
and a second executing module 707, configured to respond to the touch operation applied to the skill control, and execute an attack operation according to the aiming direction when the sliding operation is stopped.
In one embodiment, the device for controlling the aiming direction in the game further comprises:
and the enlarging module 708 is used for controlling the virtual game picture corresponding to the current aiming direction to be enlarged in response to the stop of the sliding operation.
In an embodiment, the obtaining module 701 includes:
the acquisition unit is used for responding to the sliding operation acted on the graphical user interface, acquiring the initial position of the touch point of the sliding operation and generating a virtual key at the initial position, wherein the virtual key is configured to move according to the movement of the touch point of the sliding operation.
Fig. 8 illustrates a structure of an electronic device 800 according to an embodiment of the present invention, where the electronic device 800 includes: at least one processor 801, at least one network interface 804 or other user interface 803, a memory 805, and at least one communication bus 802. The communication bus 802 is used to enable connective communication between these components. The electronic device 800 optionally contains a user interface 803 including a display (e.g., a touchscreen, LCD, CRT, holographic display, or projector), a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touchscreen).
Memory 805 may include both read-only memory and random-access memory, and provides instructions and data to processor 801. A portion of the memory 805 may also include non-volatile random access memory (NVRAM).
In some embodiments, memory 805 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof:
an operating system 8051, which contains various system programs for implementing various basic services and for handling hardware-based tasks;
the application module 8052 contains various applications, such as a desktop (launcher), a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services.
In an embodiment of the invention, the processor 801, by invoking programs or instructions stored in the memory 805, is configured to:
responding to a sliding operation acted on a graphical user interface, and acquiring an initial position of a touch point of the sliding operation and a sliding distance of the touch point relative to the initial position;
when the sliding distance is smaller than the first distance, keeping the aiming direction unchanged;
when the sliding distance is greater than the first distance, the aiming direction is adjusted according to the sliding operation;
and updating the initial position according to the stop position of the touch point in response to the stop movement of the touch point in the sliding operation.
Optionally, in the method executed by the processor 801, the touch point of the sliding operation stopping movement includes:
the dwell time of the touch point of the sliding operation at the stop position exceeding a first threshold.
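The dwell-time test can be sketched as follows; `DwellDetector`, the 0.2-second threshold, and the injectable clock are illustrative assumptions, not details from the disclosure.

```python
import time

FIRST_THRESHOLD = 0.2  # seconds; illustrative value for the "first threshold"


class DwellDetector:
    """Reports a stop once the touch point has stayed at one position past a time threshold."""

    def __init__(self, threshold=FIRST_THRESHOLD, now=time.monotonic):
        self.threshold = threshold
        self.now = now  # injectable clock, so the logic can be tested deterministically
        self.last_position = None
        self.last_move_time = None

    def update(self, x, y):
        """Feed the current touch position; return True once the point counts as stopped."""
        t = self.now()
        if (x, y) != self.last_position:
            # The touch point moved: restart the dwell timer at the new position.
            self.last_position = (x, y)
            self.last_move_time = t
            return False
        # Position unchanged: the point has stopped once its dwell time exceeds the threshold.
        return t - self.last_move_time > self.threshold
```

When `update` first returns True, the controller would treat the current position as the stop position and re-anchor the initial position there.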
Optionally, the method executed by the processor 801 further includes:
in response to the end of the sliding operation, determining the aiming direction at the end of the sliding operation.
Optionally, the method executed by the processor 801 further includes:
in response to the end of the sliding operation, performing an attack operation according to the aiming direction at the end of the sliding operation.
Optionally, the method executed by the processor 801 further includes:
in response to a touch operation applied to a skill control, performing an attack operation according to the aiming direction at the time the sliding operation stops.
Optionally, the method executed by the processor 801 further includes:
in response to the sliding operation stopping, controlling the virtual game scene corresponding to the current aiming direction to be magnified.
Optionally, in the method executed by the processor 801, the step of acquiring an initial position of a touch point of the sliding operation in response to the sliding operation applied to the graphical user interface includes:
in response to the sliding operation applied to the graphical user interface, acquiring the initial position of the touch point of the sliding operation, and generating a virtual key at the initial position, wherein the virtual key is configured to move following the touch point of the sliding operation.
The computer program product of the method and apparatus for controlling an aiming direction in a game provided in the embodiments of the present application includes a computer-readable storage medium storing program code, and the instructions included in the program code may be used to execute the method in the foregoing method embodiments.
Specifically, the storage medium may be a general-purpose storage medium, such as a removable disk or a hard disk. When the computer program on the storage medium is executed, the method for controlling an aiming direction in a game can be performed, so that erroneous operations caused when the user's finger presses on the graphical user interface can be avoided to a certain extent.
If the functions are implemented in the form of software functional units and sold or used as a standalone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application, or the portion thereof that contributes to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are only specific embodiments of the present application, used to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the technical field may still, within the technical scope disclosed in the present application, modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of their technical features. Such modifications, changes, or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application, and shall all be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for controlling an aiming direction in a game, applied to a touch terminal, characterized in that the method comprises:
in response to a sliding operation applied to a graphical user interface, acquiring an initial position of a touch point of the sliding operation and a sliding distance of the touch point relative to the initial position;
when the sliding distance is smaller than a first distance, keeping the aiming direction unchanged;
when the sliding distance is greater than the first distance, adjusting the aiming direction according to the sliding operation;
and in response to the touch point of the sliding operation stopping movement, updating the initial position according to the stop position of the touch point.
2. The control method according to claim 1, wherein the touch point of the sliding operation stopping movement comprises:
the dwell time of the touch point of the sliding operation at the stop position exceeding a first threshold.
3. The control method according to claim 1, characterized by further comprising:
in response to the end of the sliding operation, determining the aiming direction at the end of the sliding operation.
4. The control method according to claim 1, characterized by further comprising:
in response to the end of the sliding operation, performing an attack operation according to the aiming direction at the end of the sliding operation.
5. The control method according to claim 1, characterized in that the method further comprises:
in response to a touch operation applied to a skill control, performing an attack operation according to the aiming direction at the time the sliding operation stops.
6. The control method according to claim 1, characterized in that the method further comprises:
in response to the sliding operation stopping, controlling the virtual game scene corresponding to the current aiming direction to be magnified.
7. The control method according to claim 1, wherein the step of acquiring the initial position of the touch point of the sliding operation in response to the sliding operation applied to the graphical user interface comprises:
in response to the sliding operation applied to the graphical user interface, acquiring the initial position of the touch point of the sliding operation, and generating a virtual key at the initial position, wherein the virtual key is configured to move following the touch point of the sliding operation.
8. A device for controlling an aiming direction in a game, applied to a touch terminal, characterized in that the device comprises:
an acquisition module, configured to, in response to a sliding operation applied to a graphical user interface, acquire an initial position of a touch point of the sliding operation and a sliding distance of the touch point relative to the initial position;
a maintaining module, configured to keep the aiming direction unchanged when the sliding distance is smaller than a first distance;
an adjusting module, configured to adjust the aiming direction according to the sliding operation when the sliding distance is greater than the first distance;
and an updating module, configured to, in response to the touch point of the sliding operation stopping movement, update the initial position according to the stop position of the touch point.
9. An electronic device, comprising: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the method for controlling an aiming direction in a game according to any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method for controlling an aiming direction in a game according to any one of claims 1 to 7.
CN201910935794.3A 2019-09-29 2019-09-29 Method and device for controlling aiming direction in game, electronic equipment and storage medium Pending CN110652725A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910935794.3A CN110652725A (en) 2019-09-29 2019-09-29 Method and device for controlling aiming direction in game, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN110652725A true CN110652725A (en) 2020-01-07

Family

ID=69039912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910935794.3A Pending CN110652725A (en) 2019-09-29 2019-09-29 Method and device for controlling aiming direction in game, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110652725A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112121416A (en) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Control method, device, terminal and storage medium of virtual prop
CN112451969A (en) * 2020-12-04 2021-03-09 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN112783368A (en) * 2021-01-14 2021-05-11 惠州Tcl移动通信有限公司 Method for optimizing touch screen point reporting stability, storage medium and terminal equipment
EP3939681A4 (en) * 2020-04-15 2022-06-29 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, device, and storage medium
US12005360B2 (en) 2020-06-05 2024-06-11 Tencent Technology (Shenzhen) Company Ltd Virtual object control method and apparatus, computer device, and storage medium
US12017141B2 (en) 2020-04-15 2024-06-25 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, device, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107193479A (en) * 2017-05-26 2017-09-22 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN109324739A (en) * 2018-09-28 2019-02-12 腾讯科技(深圳)有限公司 Control method, device, terminal and the storage medium of virtual objects


Similar Documents

Publication Publication Date Title
CN110652725A (en) Method and device for controlling aiming direction in game, electronic equipment and storage medium
CN109550247B (en) Method and device for adjusting virtual scene in game, electronic equipment and storage medium
US10857462B2 (en) Virtual character controlling method and apparatus, electronic device, and storage medium
US10702774B2 (en) Information processing method, apparatus, electronic device and storage medium
CN108355354B (en) Information processing method, device, terminal and storage medium
CN110639203B (en) Control response method and device in game
CN110665222B (en) Aiming direction control method and device in game, electronic equipment and storage medium
CN107213636B (en) Lens moving method, device, storage medium and processor
CN110575671A (en) Method and device for controlling view angle in game and electronic equipment
KR20220130257A (en) Adaptive display method and apparatus for virtual scene, electronic device, storage medium and computer program product
WO2023109328A1 (en) Game control method and apparatus
CN108635850B (en) Information processing method, device and storage medium
CN110665226A (en) Method, device and storage medium for controlling virtual object in game
CN110665216A (en) Method and device for controlling aiming direction in game, electronic equipment and storage medium
WO2024001191A1 (en) Operation method and apparatus in game, nonvolatile storage medium, and electronic apparatus
CN111617474B (en) Information processing method and device
CN116099195A (en) Game display control method and device, electronic equipment and storage medium
JP2023001925A (en) Program, information processing device, method and system
CN115920395A (en) Interactive control method and device in game and electronic equipment
CN110215687B (en) Game object selection method and device
CN114832371A (en) Method, device, storage medium and electronic device for controlling movement of virtual character
CN113680048A (en) Method and device for adjusting rocker control in game
CN113440835A (en) Control method and device of virtual unit, processor and electronic device
CN113457157A (en) Method and device for switching virtual props in game and touch terminal
WO2024045776A1 (en) Game skill cast method and apparatus, electronic device, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination