CN117732060A - Aiming method, device, equipment and storage medium in game - Google Patents

Publication number: CN117732060A
Application number: CN202311657469.8A
Authority: CN
Original language: Chinese (zh)
Legal status: Pending
Inventors: 周逸恒, 刘勇成, 胡志鹏, 袁思思, 程龙
Assignee (original and current): Netease Hangzhou Network Co Ltd
Classifications:
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
Abstract

The application provides an aiming method, device, equipment and storage medium in a game. The aiming method comprises the following steps: in a target aiming mode, in response to a touch operation on a target virtual control in a graphical user interface, displaying an aiming indicator, which indicates a skill aiming direction of a first game character, on a virtual ground in a game scene; and, in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with the touch operation while keeping the angular velocity of the end position on the virtual ground the same as the angular velocity of the touch operation on the graphical user interface. In this way, in the target aiming mode, the angular velocity at which the player rotates the control on the terminal screen stays consistent with the angular velocity at which the aiming indication direction rotates on the virtual ground, so the deviation between the player's operation feel and the visual display is eliminated and the player's control accuracy over the aiming indication direction is improved.

Description

Aiming method, device, equipment and storage medium in game
Technical Field
The present application relates to the field of game technologies, and in particular, to a targeting method, device, equipment and storage medium in a game.
Background
At present, in conventional third-person games, when a skill with an aiming indication is used, the player's control of the aiming indication direction does not lie in the same spatial plane as the virtual ground on which the aiming indication direction is actually displayed in the game scene. As a result, the angular velocity (equivalently, the rotation angle) of the player's control on the terminal screen cannot match the angular velocity at which the aiming indication direction rotates on the virtual ground, so a deviation exists between the player's operation feel and the visual display, and the player's control accuracy over the aiming indication direction is low.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide an aiming method, apparatus, device and storage medium in a game that offer the player a new target aiming mode. By keeping the angular velocity at which the player rotates the control on the terminal screen consistent with the angular velocity at which the aiming indication direction rotates on the virtual ground in the target aiming mode, the deviation between the player's operation feel and the visual display is eliminated, and the player's control accuracy over the aiming indication direction is improved.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
In a first aspect, an embodiment of the present application provides a targeting method in a game, where a graphical user interface is provided by a terminal device, where at least a part of a game scene and a first game character are displayed on the graphical user interface; the aiming method comprises the following steps:
in a target aiming mode, in response to a touch operation for a target virtual control in the graphical user interface, displaying an aiming indicator for indicating a skill aiming direction of the first game character on a virtual ground in the game scene; wherein the origin position of the aiming indicator is determined from the character position of the first game character on the virtual ground;
controlling the end point position of the aiming indicator to move along with the movement of the touch operation in response to the movement of the touch operation on the graphical user interface, and keeping the movement angular speed of the end point position on the virtual ground the same as the movement angular speed of the touch operation on the graphical user interface; and determining the final position of the aiming indicator on the virtual ground according to the current position coordinate of the touch operation on the graphical user interface or the target deflection angle of the touch operation relative to the center of the target virtual control.
In a second aspect, an embodiment of the present application provides an aiming device in a game, where a graphical user interface is provided by a terminal device, where at least a part of a game scene and a first game character are displayed on the graphical user interface; the aiming device includes:
a first response module for responding to touch operation of a target virtual control in the graphical user interface in a target aiming mode, and displaying an aiming indicator for indicating a skill aiming direction of the first game character on a virtual ground in the game scene; wherein the origin position of the aiming indicator is determined from the character position of the first game character on the virtual ground;
a second response module, configured to respond to movement of the touch operation on the graphical user interface, control an end position of the aiming indicator to move along with movement of the touch operation, and keep a movement angular velocity of the end position on the virtual ground identical to a movement angular velocity of the touch operation on the graphical user interface; and determining the final position of the aiming indicator on the virtual ground according to the current position coordinate of the touch operation on the graphical user interface or the target deflection angle of the touch operation relative to the center of the target virtual control.
In a third aspect, embodiments of the present application provide a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of targeting in a game described above when the computer program is executed.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, which when executed by a processor performs the steps of the targeting method in a game described above.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
in a target aiming mode, in response to a touch operation on a target virtual control in a graphical user interface, displaying an aiming indicator for indicating a skill aiming direction of a first game character on a virtual ground in a game scene; in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with the touch operation, and keeping the angular velocity of the end position on the virtual ground the same as the angular velocity of the touch operation on the graphical user interface. In this way, in the target aiming mode, the angular velocity at which the player rotates the control on the terminal screen stays consistent with the angular velocity at which the aiming indication direction rotates on the virtual ground, so the deviation between the player's operation feel and the visual display is eliminated and the player's control accuracy over the aiming indication direction is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of its scope; other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a method of aiming in a game according to an embodiment of the present application;
FIG. 2a is a schematic view illustrating an angle division of a plane in which a graphical user interface according to an embodiment of the present application is located;
fig. 2b shows an angular division schematic diagram of a plane where a virtual ground provided in an embodiment of the present application is located;
FIG. 2c is a schematic diagram illustrating interaction of a control aiming indicator according to an existing control mode in a non-target aiming mode according to an embodiment of the present application;
FIG. 3 is a flow chart of a first method for determining an end position of an aiming indicator on a virtual ground according to an embodiment of the present application;
FIG. 4 is a flow chart of a second method for determining an end position of an aiming indicator on a virtual ground according to an embodiment of the present application;
FIG. 5a is a schematic diagram illustrating a first touch operation performed on a target virtual control according to an embodiment of the present disclosure;
FIG. 5b is a schematic diagram illustrating a second touch operation performed on a target virtual control according to an embodiment of the present disclosure;
FIG. 6 is a schematic view of a targeting device in a game according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device 700 according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments are described below with reference to the accompanying drawings. It should be understood that the accompanying drawings in the present application are only for illustration and description and are not intended to limit the protection scope of the present application; in addition, the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of a flowchart may be implemented out of order, and that steps without a logical context may be performed in reverse order or concurrently. Moreover, those skilled in the art may add one or more other operations to, or remove one or more operations from, a flowchart.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but not to exclude the addition of other features.
At present, in conventional third-person games, when a skill with an aiming indication is used, the player's control of the aiming indication direction does not lie in the same spatial plane as the virtual ground on which the aiming indication direction is actually displayed in the game scene. As a result, the angular velocity (equivalently, the rotation angle) of the player's control on the terminal screen cannot match the angular velocity at which the aiming indication direction rotates on the virtual ground, so a deviation exists between the player's operation feel and the visual display, and the player's control accuracy over the aiming indication direction is low.
Based on this, the embodiment of the application provides a method, a device, equipment and a storage medium for aiming in a game, which provide a new target aiming mode for a player, and by keeping the rotation angular speed of a control on a terminal screen by the player consistent with the rotation angular speed of an aiming indication direction on a virtual ground in the target aiming mode, the deviation between the operation hand feeling and visual display of the player is eliminated, and the control accuracy of the player to the aiming indication direction is improved.
In one embodiment of the present application, a method of targeting in a game may be run on a terminal device or a server. The terminal device may be a local terminal device. When the in-game targeting method is run on a server, the targeting method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises a server and a client device (i.e., a terminal device).
In an alternative embodiment, various cloud applications may be run under the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the in-game aiming method are completed on the cloud game server, while the client device is used for sending and receiving data and presenting game pictures. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer or a palmtop computer, while the terminal device that actually performs the information processing is the cloud game server. When playing, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game pictures.
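The round trip described above (client sends an operation instruction, the cloud game server runs the game and encodes and compresses the picture data, the client decodes and presents it) can be sketched as follows. This is a minimal illustration only: all function and field names are hypothetical, and a real cloud game would stream encoded video frames rather than compressed JSON.

```python
import json
import zlib

def server_step(state: dict, instruction: dict) -> bytes:
    """Run the game for one tick according to the operation instruction,
    then return the game picture data, encoded and compressed."""
    if instruction.get("op") == "move_aim":          # hypothetical instruction type
        state["aim_angle"] = instruction["angle"]
    frame = {"tick": state["tick"], "aim_angle": state["aim_angle"]}
    state["tick"] += 1
    return zlib.compress(json.dumps(frame).encode())  # encode + compress

def client_decode(payload: bytes) -> dict:
    """Decode the returned data on the client device for presentation."""
    return json.loads(zlib.decompress(payload).decode())

state = {"tick": 0, "aim_angle": 0.0}
frame = client_decode(server_step(state, {"op": "move_aim", "angle": 45.0}))
```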
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores a game program and is used to present a game screen. The local terminal device is used for interacting with the player through the graphical user interface, namely, conventionally downloading and installing the game program through the electronic device and running. The manner in which the local terminal device provides the graphical user interface to the player may include a variety of ways, for example, it may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including game visuals, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In order to facilitate understanding of the embodiments of the present application, a method, an apparatus, a device, and a storage medium for targeting in a game provided in the embodiments of the present application are described in detail below.
Referring to fig. 1, fig. 1 is a flow chart illustrating a targeting method in a game according to an embodiment of the present application, wherein a graphical user interface is provided through a terminal device, and at least part of a game scene and a first game character are displayed on the graphical user interface; the aiming method comprises the steps of S101-S102; specific:
S101, in a target aiming mode, responding to touch operation of a target virtual control in the graphical user interface, and displaying an aiming indicator for indicating the skill aiming direction of the first game character on a virtual ground in the game scene.
S102, responding to the movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with the movement of the touch operation, and keeping the movement angular speed of the end position on the virtual ground the same as the movement angular speed of the touch operation on the graphical user interface.
In the aiming method in the game provided by the embodiment of the application, in a target aiming mode, responding to touch operation aiming at a target virtual control in a graphical user interface, and displaying an aiming indicator for indicating the skill aiming direction of a first game role on the virtual ground in a game scene; in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with movement of the touch operation, and maintaining the movement angular velocity of the end position on the virtual ground to be the same as the movement angular velocity of the touch operation on the graphical user interface. In this way, the angular speed of the player for controlling the rotation of the control on the terminal screen is kept consistent with the angular speed of the sighting indication direction for rotating on the virtual ground in the target sighting mode, deviation between the operation hand feeling and visual display of the player can be eliminated, and the control accuracy of the player for the sighting indication direction is improved.
Regarding the above in-game aiming method provided by the embodiments of the present application, it should be noted that, in essence, the embodiments provide a new aiming mode (i.e. the target aiming mode) that the player may choose to enable, on top of the game's existing control manner for the aiming indicator. The player may choose not to enable the target aiming mode, or may disable it after having enabled it; in that case the game returns to the non-target aiming mode, in which the player still controls the skill aiming direction indicated by the aiming indicator displayed in the game scene according to the existing control manner. The target aiming mode therefore does not conflict with the existing control manner in the game.
Here, as an optional embodiment corresponding to steps S101-S102 above, in the non-target aiming mode, after the terminal device displays the aiming indicator on the virtual ground in the game scene in response to the player's touch operation on the target virtual control in the graphical user interface, the existing control manner for the aiming indicator retained in the game may be executed according to the following steps a1-a3:
a1. In the non-target aiming mode, in response to movement of the touch operation on the graphical user interface, acquire the current position coordinate of the touch operation on the graphical user interface.
a2. Calculate the first position coordinate to which the current position coordinate maps on the virtual ground, according to the perspective relationship between the graphical user interface and the virtual ground.
It should be noted that the target virtual control is displayed on the graphical user interface provided by the terminal device (which coincides with the plane of the terminal screen), whereas the aiming indicator is displayed on the virtual ground in the game scene. Since the virtual ground is a spatial plane inside the game scene rendered on the screen, the graphical user interface on which the player operates the target virtual control and the virtual ground on which the aiming indicator is actually displayed belong to different spatial planes. A perspective relationship therefore exists between the graphical user interface and the virtual ground, and the angular velocity (equivalently, the rotation angle) of the player's touch operation on the graphical user interface cannot be fully equal to the angular velocity at which the aiming indicator rotates on the virtual ground, resulting in a deviation between the operation feel (corresponding to the angular velocity of the touch operation rotating on the graphical user interface) and the visual display (corresponding to the angular velocity of the aiming indicator rotating on the virtual ground).
Here, fig. 2a shows a schematic angle division of the plane in which the graphical user interface provided in the embodiment of the present application lies, and fig. 2b shows a schematic angle division of the plane in which the virtual ground lies. Comparing fig. 2a with fig. 2b, the plane of the graphical user interface in fig. 2a corresponds to the plane before perspective, and the plane of the virtual ground in fig. 2b corresponds to the plane after perspective. The specific perspective relationship between the two planes is as follows:
Taking the rightward horizontal direction as 0° (the reference direction), as shown in fig. 2a and fig. 2b: in the plane before perspective (the plane of the graphical user interface, fig. 2a), the 60° sector on the right centered at 0° is compressed to 29.5° in the plane after perspective (the plane of the virtual ground, fig. 2b), its arc length shrinking from 16.7% of the circumference to 11.6%; whereas the 60° sector above centered at 90° and the 60° sector below centered at 270° in fig. 2a are each expanded to 103.4° in fig. 2b, their arc lengths stretching from 16.7% of the circumference to 19.5%-23.2%.
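The compression/expansion effect can be reproduced with a toy model. The sketch below does not use the patent's actual perspective matrix (which is not given); instead it mimics the foreshortening of a tilted virtual ground by squashing the screen's vertical axis by a hypothetical factor of 0.5, and shows that a 60° sector centered on the horizontal shrinks while a 60° sector centered on the vertical expands, matching the qualitative behavior described for figs. 2a and 2b:

```python
import math

SQUASH = 0.5  # hypothetical foreshortening factor, not from the patent

def ground_angle(screen_deg: float) -> float:
    """Angle (degrees) of a screen direction after the vertical squash."""
    r = math.radians(screen_deg)
    return math.degrees(math.atan2(SQUASH * math.sin(r), math.cos(r)))

# 60° sector centered on the right horizontal (0°): from -30° to +30°
right_sector = ground_angle(30) - ground_angle(-30)   # shrinks below 60°
# 60° sector centered on the upward vertical (90°): from 60° to 120°
top_sector = ground_angle(120) - ground_angle(60)     # expands above 60°
```

With this particular squash factor the right sector compresses to roughly 32° and the top sector expands to roughly 98°; the exact figures in the patent (29.5°, 103.4°) correspond to its own perspective matrix.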
Specifically, any perspective relationship may be represented by a perspective matrix. Therefore, when performing step a2, a target perspective matrix representing the mapping from the graphical user interface before perspective to the virtual ground after perspective may be obtained from the perspective relationship between the graphical user interface (the plane before perspective) and the virtual ground (the plane after perspective), and the first position coordinate on the virtual ground to which the current position coordinate of the touch operation on the graphical user interface maps is then calculated by means of the target perspective matrix.
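A minimal sketch (assumed, not code from the patent) of this mapping: the touch point is written in homogeneous coordinates, multiplied by the 3×3 target perspective matrix, and divided by the resulting homogeneous component to obtain the point on the virtual ground.

```python
def apply_perspective(H, x: float, y: float) -> tuple[float, float]:
    """Map a screen point (x, y) through the 3x3 perspective matrix H:
    [x', y', w] = H · [x, y, 1], then divide by w."""
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]  # H[2][2] (a33) is fixed to 1
    return xp / w, yp / w

# With the identity matrix the point is unchanged:
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```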
Here, regarding the above-described target perspective matrix: both the graphical user interface and the virtual ground are located in three-dimensional space, and therefore the target perspective matrix may be expressed as the following 3×3 square matrix:

    ⎡ a11  a12  a13 ⎤
    ⎢ a21  a22  a23 ⎥
    ⎣ a31  a32  a33 ⎦

wherein the element a33 in the bottom-right corner of the target perspective matrix is always equal to 1 (which can be understood as the perspective mechanism always taking effect).
It should be noted that the target perspective matrix may be calculated directly with a general code tool (for example, the function `getPerspectiveTransform` in the OpenCV open-source library, or the related calculation functions in MATLAB), or by a mathematical method. For example, the 4 corners of the plane before perspective shown in fig. 2a correspond to the 4 corners of the plane after perspective shown in fig. 2b, and each corner has a two-dimensional coordinate in each of the two planes. From the two-dimensional coordinates of each corner, correspondences between 8 numbers and 8 numbers are obtained (correspondences between 4 horizontal coordinates and between 4 vertical coordinates). Since the target perspective matrix contains only 8 unknown elements (all elements except a33, whose value is known to be 1), substituting these correspondences into the target perspective matrix and solving the resulting system of 8 linear equations in 8 unknowns yields the remaining 8 elements. Based on this, the embodiments of the present application do not limit the specific calculation manner of the target perspective matrix.
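The mathematical method just described can be sketched as follows: each of the 4 corner correspondences contributes two linear equations, giving 8 linear equations in the 8 unknown matrix elements, solved here with plain Gaussian elimination. The corner coordinates below are made-up illustration values; in practice OpenCV's `getPerspectiveTransform` computes the same matrix.

```python
def solve_perspective_matrix(src, dst):
    """src, dst: four (x, y) corner pairs (before / after perspective).
    Returns the 3x3 target perspective matrix with a33 fixed to 1."""
    # Each pair (x,y)->(u,v) gives:
    #   a11*x + a12*y + a13 - u*x*a31 - u*y*a32 = u
    #   a21*x + a22*y + a23 - v*x*a31 - v*y*a32 = v
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]       # augmented 8x9 matrix
    for col in range(n):                              # elimination with pivoting
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):                    # back substitution
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], [h[6], h[7], 1.0]]

src = [(0, 0), (100, 0), (100, 100), (0, 100)]   # corners before perspective
dst = [(10, 10), (200, 30), (220, 240), (5, 210)]  # corners after (made-up)
H = solve_perspective_matrix(src, dst)
```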
a3. Control the end position of the aiming indicator to move to the first position coordinate along with the touch operation.
Here, since the aiming indicator is used to indicate the skill aiming direction of the first game character, the start position of the aiming indicator is always determined from the character position of the first game character on the virtual ground and does not move with the touch operation; that is, the player moves only the end position of the aiming indicator through the touch operation, thereby changing the skill aiming direction indicated by the aiming indicator.
Specifically, taking as an example an aiming indicator that is an arrow mark starting from the character position of the first game character 100, fig. 2c shows an interaction schematic of controlling the aiming indicator according to the existing control manner in the non-target aiming mode. As shown in fig. 2c, the terminal device, in response to the player's touch operation on the target virtual control 210, displays the aiming indicator 220 on the virtual ground 201 in the game scene (the default initial orientation of the aiming indicator 220 may be the direction shown in fig. 2c, without limitation). The touch operation can then continue to move within the effective control area 211 associated with the target virtual control 210 on the graphical user interface 200; the terminal device acquires the current position coordinate of the touch operation on the graphical user interface 200 as it moves, calculates the first position coordinate to which that current position coordinate maps on the virtual ground 201 according to the perspective relationship between the graphical user interface 200 and the virtual ground 201, and controls the end position of the aiming indicator 220 to move to that first position coordinate. Because of the perspective relationship, the angle through which the touch operation rotates relative to the center of the target virtual control 210 (for example 60°) is greater than the angle through which the aiming indicator 220 correspondingly rotates on the virtual ground 201 (for example about 29.5°).
Based on this, it can be seen that in the non-target aiming mode shown in fig. 2c, when the aiming indicator is moved according to the existing control manner, the angular velocity (equivalently, the rotation angle) at which the player rotates the touch operation on the graphical user interface is not equal to the angular velocity at which the aiming indicator rotates on the virtual ground, so a deviation exists between the operation feel (corresponding to the angular velocity of the touch operation rotating on the graphical user interface) and the visual display (corresponding to the angular velocity of the aiming indicator rotating on the virtual ground).
Each step of the above in-game aiming method provided by the embodiments of the present application is described below by way of example:
S101, in a target aiming mode, responding to touch operation of a target virtual control in the graphical user interface, and displaying an aiming indicator for indicating the skill aiming direction of the first game character on a virtual ground in the game scene.
Here, in order to avoid conflicting with the game's retained existing control manner for the aiming indicator, the embodiments of the present application provide the player with a target aiming mode that can be freely enabled and disabled in the game (for example, through an option in the game settings). In the target aiming mode, the angular velocity at which the aiming indicator moves on the virtual ground is kept the same as the angular velocity of the touch operation on the graphical user interface, thereby eliminating the deviation between the player's operation feel and the visual display.
Specifically, the target virtual control is a virtual control for controlling the aiming indicator, and the touch operation is an operation for triggering the display of the aiming indicator in the game scene (specifically, on the virtual ground in the game scene). In both the target aiming mode and the non-target aiming mode (i.e. when the aiming indicator is controlled according to the existing control manner), the terminal device responds to the touch operation on the target virtual control by displaying, on the virtual ground, an aiming indicator indicating the skill aiming direction of the first game character, taking as its start point the character position of the player-controlled first game character in the game scene (specifically, its position on the virtual ground); that is, the start position of the aiming indicator is determined from the character position of the first game character on the virtual ground. The initial trigger-and-display manner of the aiming indicator (i.e. step S101 above) is therefore the same in the target aiming mode and the non-target aiming mode.
It should be noted that the control shape of the target virtual control may be a circle, a square, a star, or any other shape, and the aiming indicator is only used for indicating the skill aiming direction of the first game character; the embodiment of the present application does not limit the control shape of the target virtual control or the specific display style of the aiming indicator in any way.
S102: in response to movement of the touch operation on the graphical user interface, control the end position of the aiming indicator to move along with the movement of the touch operation, and keep the movement angular velocity of the end position on the virtual ground the same as the movement angular velocity of the touch operation on the graphical user interface.
Here, the touch operation on the target virtual control controls the specific skill aiming direction indicated by the aiming indicator on the virtual ground; it is not used to change the character position of the first game character on the virtual ground. Therefore, while the aiming indicator follows the touch operation, its starting point position remains unchanged (it is still determined according to the character position of the first game character on the virtual ground), and only the end position of the aiming indicator is controlled to move along with the movement of the touch operation, thereby changing the skill release direction indicated by the aiming indicator (i.e., the specific direction in which the aiming indicator points on the virtual ground).
In this embodiment of the present application, unlike the existing control manner described above (see steps a1-a3), which obtains the current position coordinates of the moved touch operation on the graphical user interface and determines the end position of the moved aiming indicator on the virtual ground according to those coordinates and the perspective relation (i.e., the perspective relation between the graphical user interface and the virtual ground), the present application provides at least the following two alternative embodiments for executing step S102, depending on whether the underlying code logic of the existing control manner is kept (i.e., whether the current position coordinates of the moved touch operation on the graphical user interface are still obtained, and the end position of the aiming indicator on the virtual ground is determined according to the correspondence between the coordinates):
In a first alternative embodiment, the end position of the moved aiming indicator on the virtual ground is determined on the basis of the underlying code logic of the existing control manner, that is, still according to the current position coordinates of the touch operation on the graphical user interface. Fig. 3 is a schematic flowchart of a first method for determining the end position of the aiming indicator on the virtual ground according to the embodiment of the present application; as shown in fig. 3, executing step S102 includes steps S301-S303, specifically:
S301: in response to movement of the touch operation on the graphical user interface, acquire the current position coordinates of the touch operation on the graphical user interface.
Here, since the touch operation moves on the graphical user interface, the acquired current position coordinates are position coordinates on the perspective front plane (i.e., the graphical user interface).
S302: correct the current position coordinates of the touch operation on the graphical user interface according to the perspective relation between the graphical user interface and the virtual ground, to obtain the corrected position coordinates of the touch operation on the graphical user interface.
Here, when step S302 is performed, the current position coordinates of the touch operation on the gui may be corrected according to the following method shown in steps b1-b3, specifically:
and b1, calculating a first position coordinate mapped by the current position coordinate on the virtual ground based on the perspective relation.
The specific implementation of step b1 is the same as that of step a2 in the existing control manner described above, and the repeated description is omitted here.
Specifically, taking the current position coordinates of the touch operation on the perspective front plane (i.e., the graphical user interface) as (x_w, y_w) for example, the first position coordinates (x_s, y_s) mapped onto the perspective back plane (i.e., the virtual ground) can be calculated by means of a target perspective matrix M characterizing the above perspective relation:

(t1, t2, t3)^T = M · (x_w, y_w, 1)^T

Here, since (x_w, y_w) is a coordinate on the perspective front plane (equivalently, on the plane viewed looking forward along the z-axis), the spatial dimension represented by the z-axis never takes effect, so its homogeneous component is mathematically taken as 1.

At this time, since (x_w, y_w) and every element of the target perspective matrix M are known data, the specific values of t1, t2 and t3 in the formula can be solved, and the first position coordinates (x_s, y_s) are then obtained by the perspective division:

(x_s, y_s) = (t1/t3, t2/t3)
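As a concrete illustration, the forward mapping of step b1 can be sketched in Python using homogeneous coordinates. The matrix values below are purely illustrative stand-ins for a target perspective matrix, not values from the application:

```python
import numpy as np

def screen_to_ground(x_w, y_w, M):
    """Step b1 sketch: map a point on the perspective front plane (the
    graphical user interface) to the perspective back plane (the virtual
    ground) through a 3x3 target perspective matrix M.

    The third homogeneous component is fixed to 1, since the z dimension
    never takes effect on the front plane."""
    t1, t2, t3 = M @ np.array([x_w, y_w, 1.0])
    return t1 / t3, t2 / t3  # perspective division

# Illustrative (made-up) target perspective matrix: a mild tilt of the ground.
M = np.array([[1.0, 0.0,   0.0],
              [0.0, 1.2,   0.0],
              [0.0, 0.002, 1.0]])

x_s, y_s = screen_to_ground(100.0, 50.0, M)
```

With the identity matrix, the mapping leaves a point unchanged; any genuine perspective matrix shifts it by the perspective division.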
And b2, calculating a second position coordinate of the current position coordinate mapped on the virtual ground under the condition of no perspective relation according to the target deflection angle and the first position coordinate.
Here, after the current position coordinate of the touch operation on the graphical user interface is obtained, the target deflection angle of the touch operation relative to the center of the target virtual control at this time may be calculated according to the original position coordinate of the touch operation before moving and the current position coordinate.
It should be noted that, in step b2, the underlying code logic is not changed, that is, the terminal device directly obtains the current position coordinates of the touch operation on the graphical user interface, instead of directly obtaining the target deflection angle of the touch operation relative to the center of the target virtual control.
Specifically, taking the target deflection angle as θ and the first position coordinates as (x_s, y_s) for example, the second position coordinates mapped from the current position coordinates (x_w, y_w) onto the virtual ground without the perspective relation are (x_s + ρcosθ, y_s + ρsinθ), where ρ denotes the radius in a polar coordinate system. Since the value of ρ does not affect the calculation in the subsequent steps of the present application, ρ may default to 1 for ease of calculation; this is not limited in the embodiment of the present application.
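The target deflection angle and the second position of step b2 can be sketched as follows; the function name and the assumption that the angle is measured with atan2 from the control centre are illustrative, not taken from the application:

```python
import math

def second_position(center, current, first_pos, rho=1.0):
    """Step b2 sketch: derive the target deflection angle theta of the moved
    touch point relative to the control centre, then offset the first
    position (x_s, y_s) by (rho*cos(theta), rho*sin(theta)).

    rho is the polar radius; its value does not matter downstream, so it
    defaults to 1 as in the text."""
    theta = math.atan2(current[1] - center[1], current[0] - center[0])
    x_s, y_s = first_pos
    return (x_s + rho * math.cos(theta), y_s + rho * math.sin(theta)), theta
```

For a touch dragged diagonally up-right from the centre, theta is 45° and the second position sits one unit away from (x_s, y_s) at that angle.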
And b3, calculating the coordinate mapped by the second position coordinate on the graphical user interface based on the perspective relation as the corrected position coordinate.
Here, the aforementioned target perspective matrix represents the perspective relationship mapped from the graphical user interface to the virtual ground, and the inverse matrix of the target perspective matrix may represent the perspective relationship mapped from the virtual ground to the graphical user interface.
Specifically, writing the inverse of the target perspective matrix M as M⁻¹, the following is calculated:

(t4, t5, t6)^T = M⁻¹ · (x_s + ρcosθ, y_s + ρsinθ, 1)^T

Here, all elements of M⁻¹ and the second position coordinates (x_s + ρcosθ, y_s + ρsinθ) are known data, so the specific values of t4, t5 and t6 can be solved, and the corrected position coordinates (x_w′, y_w′) are then obtained by the perspective division:

(x_w′, y_w′) = (t4/t6, t5/t6)
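Step b3, mapping a ground position back to the graphical user interface through the inverse of the target perspective matrix, can be sketched as below; as before, the matrix is an illustrative stand-in, and the round trip simply demonstrates that the inverse mapping undoes the forward one:

```python
import numpy as np

def ground_to_screen(x, y, M):
    """Step b3 sketch: map a point on the virtual ground back to the
    graphical user interface through the inverse of the target perspective
    matrix M, again with homogeneous component 1 and perspective division."""
    t4, t5, t6 = np.linalg.inv(M) @ np.array([x, y, 1.0])
    return t4 / t6, t5 / t6

# Round trip with an illustrative matrix: screen -> ground -> screen
# should recover the original screen coordinates.
M = np.array([[1.0, 0.0,   0.0],
              [0.0, 1.2,   0.0],
              [0.0, 0.002, 1.0]])
t = M @ np.array([100.0, 50.0, 1.0])
x_g, y_g = t[0] / t[2], t[1] / t[2]          # forward mapping (step b1)
x_w2, y_w2 = ground_to_screen(x_g, y_g, M)   # inverse mapping (step b3)
```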
S303: control the end position of the aiming indicator to move to the target position coordinates mapped from the corrected position coordinates onto the virtual ground based on the perspective relation.
Here, the deflection angle of the target position coordinates relative to the character position of the first game character is the same as the target deflection angle. In other words, while keeping the underlying code logic unchanged, the mathematical calculations of the above steps (invisible to the player) control the movement angular velocity of the aiming indicator on the virtual ground to be the same as the movement angular velocity of the player's touch operation on the graphical user interface. Thus, in the target aiming mode, the deviation between the player's operation feel and the visual display is eliminated, and the player's control accuracy over the aiming indication direction is improved.
In a second alternative embodiment, the underlying code logic of the existing control manner may be modified: instead of obtaining the current position coordinates of the touch operation on the graphical user interface, the target deflection angle of the touch operation relative to the center of the target virtual control is obtained directly (which amounts to determining the end position of the moved aiming indicator on the virtual ground according to the target deflection angle). Fig. 4 shows a flowchart of a second method for determining the end position of the aiming indicator on the virtual ground according to the embodiment of the present application; as shown in fig. 4, executing step S102 includes steps S401-S402, specifically:
S401: in response to movement of the touch operation on the graphical user interface, acquire the target deflection angle of the touch operation relative to the center of the target virtual control.
Here, unlike steps S301-S303, with the underlying code logic changed, the terminal device can directly obtain the target deflection angle of the touch operation relative to the center of the target virtual control, without obtaining the current position coordinates of the touch operation on the graphical user interface and performing the complex mathematical calculations of steps S301-S303.
And S402, controlling the end point position of the aiming indicator to deflect relative to the character position of the first game character according to the target deflection angle on the virtual ground.
Specifically, after the target deflection angle is obtained, the original code content in the underlying code logic that needs to consider the perspective relation can be modified. By modifying the underlying code logic, the movement angular velocity of the aiming indicator on the virtual ground is directly controlled to be the same as the movement angular velocity of the player's touch operation on the graphical user interface (that is, the end position of the aiming indicator is controlled to deflect by the target deflection angle relative to the character position of the first game character).
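A minimal sketch of the second embodiment (steps S401-S402): once the target deflection angle is read directly from the control, the indicator endpoint is simply placed around the character position at that same angle, so the on-ground angular velocity equals the touch angular velocity by construction. The fixed indicator length is an assumed parameter, not taken from the application:

```python
import math

def deflected_endpoint(char_pos, indicator_len, theta):
    """Step S402 sketch: place the aiming indicator's end position on the
    virtual ground at the target deflection angle theta around the character
    position, keeping the starting point (the character position) fixed."""
    x, y = char_pos
    return (x + indicator_len * math.cos(theta),
            y + indicator_len * math.sin(theta))
```

Because theta is applied unchanged, a 45° rotation of the touch yields exactly a 45° rotation of the indicator on the ground.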
It should be noted that, although the second alternative embodiment of steps S401-S402 omits more of the mathematical calculation steps than the first alternative embodiment of steps S301-S303, it also requires more modification of the underlying code logic. Therefore, either alternative embodiment may be selected according to the actual game setting requirements, and the embodiment of the present application does not limit this in any way.
In this embodiment of the present application, when executing step S102 and the current position coordinates of the touch operation on the graphical user interface need to be obtained, the current position coordinates may be acquired in either of the following two alternative manners, depending on the setting requirements of the target virtual control, specifically:
In a first alternative, when the target virtual control is set as a movable control, the terminal device may acquire the current position coordinates of the touch operation on the graphical user interface in the manner of the following step c1:
and c1, responding to the movement of the touch operation on the graphical user interface, and acquiring the current position coordinate of the target virtual control on the graphical user interface as the current position coordinate of the touch operation on the graphical user interface when the target virtual control moves along with the movement of the touch operation.
For example, fig. 5a shows an interaction schematic diagram of a first touch operation performed on the target virtual control according to the embodiment of the present application. As shown in fig. 5a, the terminal device responds to movement of the touch operation on the graphical user interface 200; when the player drags the target virtual control 210 through the touch operation so that it rotates 45° within the effective control area 211, the current position of the target virtual control 210 is acquired as the current position coordinates of the touch operation on the graphical user interface, so as to control the aiming indicator 220 to follow the movement of the touch operation, and the angle through which the end position of the aiming indicator 220 rotates is likewise determined to be 45°. This eliminates the deviation between the player's operation feel and the visual display, and improves the player's control accuracy over the aiming indication direction.
In a second alternative, when the target virtual control is set as an immovable control (i.e., the position of the target virtual control on the graphical user interface is fixed), the terminal device may acquire the current position coordinates of the touch operation on the graphical user interface in the manner of the following step c2:
and c2, responding to the movement of the touch operation on the graphical user interface, and directly acquiring the current position coordinate of the touch operation on the graphical user interface when the position of the target virtual control on the graphical user interface is fixed.
For example, fig. 5b shows an interaction schematic diagram of a second touch operation performed on the target virtual control provided by the embodiment of the present application. As shown in fig. 5b, and unlike fig. 5a, the terminal device responds to movement of the touch operation on the graphical user interface 200; since the target virtual control 210 is an immovable control, the current position coordinates of the touch operation on the graphical user interface are acquired directly, so as to control the aiming indicator 220 to follow the movement of the touch operation, and the angle through which the end position of the aiming indicator 220 rotates is likewise determined to be 45°. This also eliminates the deviation between the player's operation feel and the visual display, and improves the player's control accuracy over the aiming indication direction.
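The two coordinate-acquisition alternatives of steps c1 and c2 amount to a simple selection, sketched below; the flag and names are illustrative, not from the application:

```python
def current_touch_coordinates(control_movable, control_pos, raw_touch_pos):
    """Steps c1/c2 sketch: for a movable control that follows the touch
    (fig. 5a), the control's own current position stands in for the touch
    position; for a fixed control (fig. 5b), the raw touch position on the
    graphical user interface is used directly."""
    return control_pos if control_movable else raw_touch_pos
```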
According to the aiming method in the game, in a target aiming mode, responding to touch operation aiming at a target virtual control in a graphical user interface, and displaying an aiming indicator for indicating the skill aiming direction of a first game role on a virtual ground in a game scene; in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with movement of the touch operation, and maintaining the movement angular velocity of the end position on the virtual ground to be the same as the movement angular velocity of the touch operation on the graphical user interface. In this way, the angular speed of the player for controlling the rotation of the control on the terminal screen is kept consistent with the angular speed of the sighting indication direction for rotating on the virtual ground in the target sighting mode, deviation between the operation hand feeling and visual display of the player can be eliminated, and the control accuracy of the player for the sighting indication direction is improved.
Based on the same inventive concept, the present application also provides an aiming device corresponding to the aiming method in the game, and since the principle of solving the problem of the aiming device in the embodiment of the present application is similar to that of the aiming method in the game in the embodiment of the present application, the implementation of the aiming device can refer to the implementation of the aiming method, and the repetition is omitted.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an aiming device in a game according to an embodiment of the present application, where a graphical user interface is provided through a terminal device, and at least part of a game scene and a first game character are displayed on the graphical user interface; the aiming device includes:
a first response module 601, configured to display, in a target aiming mode, an aiming indicator for indicating a skill aiming direction of the first game character on a virtual ground in the game scene in response to a touch operation for a target virtual control in the graphical user interface; wherein the origin position of the aiming indicator is determined from the character position of the first game character on the virtual ground;
a second response module 602, configured to control, in response to movement of the touch operation on the graphical user interface, an end position of the aiming indicator to move along with movement of the touch operation, and keep an angular velocity of movement of the end position on the virtual ground equal to an angular velocity of movement of the touch operation on the graphical user interface; wherein the end position of the aiming indicator on the virtual ground is determined according to the current position coordinates of the touch operation on the graphical user interface, or according to the target deflection angle of the touch operation relative to the center of the target virtual control.
In an alternative embodiment, the second response module 602 is configured to control the movement of the end position of the aiming indicator along with the movement of the touch operation by the following method, and keep the angular speed of the movement of the end position on the virtual ground equal to the angular speed of the movement of the touch operation on the graphical user interface:
responding to the movement of the touch operation on the graphical user interface, and acquiring the current position coordinate of the touch operation on the graphical user interface;
correcting the current position coordinate of the touch operation on the graphical user interface according to the perspective relation between the graphical user interface and the virtual ground to obtain the corrected position coordinate of the touch operation on the graphical user interface;
controlling the end position of the aiming indicator to move to a target position coordinate mapped by the corrected position coordinate on the virtual ground based on the perspective relation; wherein the target position coordinates have the same deflection angle with respect to the character position of the first game character as the target deflection angle.
In an alternative embodiment, when the current position coordinates of the touch operation on the gui are corrected according to the perspective relationship between the gui and the virtual ground, the second response module 602 is configured to:
Calculating a first position coordinate mapped by the current position coordinate on the virtual ground based on the perspective relation;
calculating a second position coordinate of the current position coordinate mapped on the virtual ground under the condition of no perspective relation according to the target deflection angle and the first position coordinate;
and calculating the coordinate mapped by the second position coordinate on the graphical user interface as the corrected position coordinate based on the perspective relation.
In an alternative embodiment, the second response module 602 is configured to control the movement of the end position of the aiming indicator along with the movement of the touch operation by the following method, and keep the angular speed of the movement of the end position on the virtual ground equal to the angular speed of the movement of the touch operation on the graphical user interface:
responding to the movement of the touch operation on the graphical user interface, and acquiring a target deflection angle of the touch operation relative to the target virtual control center;
and controlling the end position of the aiming indicator to deflect relative to the character position of the first game character according to the target deflection angle on the virtual ground.
In an alternative embodiment, the second response module 602 is configured to obtain the current position coordinates of the touch operation on the graphical user interface by:
and responding to the movement of the touch operation on the graphical user interface, and when the target virtual control moves along with the movement of the touch operation, acquiring the current position coordinate of the target virtual control on the graphical user interface as the current position coordinate of the touch operation on the graphical user interface.
In an alternative embodiment, the second response module 602 is configured to obtain the current position coordinates of the touch operation on the graphical user interface by:
and responding to the movement of the touch operation on the graphical user interface, and directly acquiring the current position coordinate of the touch operation on the graphical user interface when the position of the target virtual control on the graphical user interface is fixed.
In an alternative embodiment, the aiming device further includes a third response module, configured to:
in a non-target aiming mode, responding to the movement of the touch operation on the graphical user interface, and acquiring the current position coordinate of the touch operation on the graphical user interface;
Calculating a first position coordinate mapped by the current position coordinate on the virtual ground according to the perspective relation between the graphical user interface and the virtual ground;
an end position of the aiming indicator is controlled to move to the first position coordinates following the touch operation.
Based on the aiming device in the game provided by the embodiment of the application, in a target aiming mode, responding to touch operation aiming at a target virtual control in a graphical user interface, and displaying an aiming indicator for indicating the skill aiming direction of a first game role on a virtual ground in a game scene; in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with movement of the touch operation, and maintaining the movement angular velocity of the end position on the virtual ground to be the same as the movement angular velocity of the touch operation on the graphical user interface. In this way, the angular speed of the player for controlling the rotation of the control on the terminal screen is kept consistent with the angular speed of the sighting indication direction for rotating on the virtual ground in the target sighting mode, deviation between the operation hand feeling and visual display of the player can be eliminated, and the control accuracy of the player for the sighting indication direction is improved.
Based on the same inventive concept, the present application further provides an electronic device corresponding to the above-mentioned aiming method in the game, and since the principle of solving the problem of the electronic device in the embodiment of the present application is similar to that of the above-mentioned aiming method in the embodiment of the present application, the implementation of the electronic device may refer to the implementation of the above-mentioned aiming method, and the repetition is omitted.
Fig. 7 is a schematic structural diagram of an electronic device 700 according to an embodiment of the present application, including a processor 701, a memory 702, and a bus 703. The memory 702 stores machine-readable instructions executable by the processor 701; when the electronic device runs the aiming method in a game as in the embodiments, the processor 701 and the memory 702 communicate via the bus 703, and the processor 701 executes the machine-readable instructions. A graphical user interface is provided through the terminal device, on which at least part of the game scene and the first game character are displayed. The processor 701 executes the machine-readable instructions to implement the following steps, specifically:
in a target aiming mode, in response to a touch operation for a target virtual control in the graphical user interface, displaying an aiming indicator for indicating a skill aiming direction of the first game character on a virtual ground in the game scene; wherein the origin position of the aiming indicator is determined from the character position of the first game character on the virtual ground;
Controlling the end point position of the aiming indicator to move along with the movement of the touch operation in response to the movement of the touch operation on the graphical user interface, and keeping the movement angular speed of the end point position on the virtual ground the same as the movement angular speed of the touch operation on the graphical user interface; wherein the end position of the aiming indicator on the virtual ground is determined according to the current position coordinates of the touch operation on the graphical user interface, or according to the target deflection angle of the touch operation relative to the center of the target virtual control.
In an alternative embodiment, processor 701 is configured to control the movement of the destination location of the sighting indicator following the movement of the touch operation by maintaining the same angular velocity of movement of the destination location on the virtual ground as the angular velocity of movement of the touch operation on the graphical user interface by:
responding to the movement of the touch operation on the graphical user interface, and acquiring the current position coordinate of the touch operation on the graphical user interface;
correcting the current position coordinate of the touch operation on the graphical user interface according to the perspective relation between the graphical user interface and the virtual ground to obtain the corrected position coordinate of the touch operation on the graphical user interface;
Controlling the end position of the aiming indicator to move to a target position coordinate mapped by the corrected position coordinate on the virtual ground based on the perspective relation; wherein the target position coordinates have the same deflection angle with respect to the character position of the first game character as the target deflection angle.
In an alternative embodiment, when correcting the current position coordinates of the touch operation on the graphical user interface according to the perspective relationship between the graphical user interface and the virtual ground, the processor 701 is configured to:
calculating a first position coordinate mapped by the current position coordinate on the virtual ground based on the perspective relation;
calculating a second position coordinate of the current position coordinate mapped on the virtual ground under the condition of no perspective relation according to the target deflection angle and the first position coordinate;
and calculating the coordinate mapped by the second position coordinate on the graphical user interface as the corrected position coordinate based on the perspective relation.
In an alternative embodiment, processor 701 is configured to control the movement of the destination location of the sighting indicator following the movement of the touch operation by maintaining the same angular velocity of movement of the destination location on the virtual ground as the angular velocity of movement of the touch operation on the graphical user interface by:
Responding to the movement of the touch operation on the graphical user interface, and acquiring a target deflection angle of the touch operation relative to the target virtual control center;
and controlling the end position of the aiming indicator to deflect relative to the character position of the first game character according to the target deflection angle on the virtual ground.
In an alternative embodiment, the processor 701 is configured to obtain the current position coordinates of the touch operation on the graphical user interface by:
and responding to the movement of the touch operation on the graphical user interface, and when the target virtual control moves along with the movement of the touch operation, acquiring the current position coordinate of the target virtual control on the graphical user interface as the current position coordinate of the touch operation on the graphical user interface.
In an alternative embodiment, the processor 701 is configured to obtain the current position coordinates of the touch operation on the graphical user interface by:
and responding to the movement of the touch operation on the graphical user interface, and directly acquiring the current position coordinate of the touch operation on the graphical user interface when the position of the target virtual control on the graphical user interface is fixed.
In an alternative embodiment, processor 701 is further configured to:
in a non-target aiming mode, responding to the movement of the touch operation on the graphical user interface, and acquiring the current position coordinate of the touch operation on the graphical user interface;
calculating a first position coordinate mapped by the current position coordinate on the virtual ground according to the perspective relation between the graphical user interface and the virtual ground;
an end position of the aiming indicator is controlled to move to the first position coordinates following the touch operation.
According to the electronic equipment provided by the embodiment of the application, in a target aiming mode, responding to touch operation aiming at a target virtual control in a graphical user interface, and displaying an aiming indicator for indicating the skill aiming direction of a first game role on the virtual ground in a game scene; in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move along with movement of the touch operation, and maintaining the movement angular velocity of the end position on the virtual ground to be the same as the movement angular velocity of the touch operation on the graphical user interface. In this way, the angular speed of the player for controlling the rotation of the control on the terminal screen is kept consistent with the angular speed of the sighting indication direction for rotating on the virtual ground in the target sighting mode, deviation between the operation hand feeling and visual display of the player can be eliminated, and the control accuracy of the player for the sighting indication direction is improved.
Based on the same inventive concept, the embodiments of the present application also provide a computer readable storage medium, wherein a graphical user interface is provided through a terminal device, where at least a part of a game scene and a first game character are displayed on the graphical user interface; the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of:
in a target aiming mode, in response to a touch operation on a target virtual control in the graphical user interface, displaying an aiming indicator indicating a skill aiming direction of the first game character on a virtual ground in the game scene, wherein the origin position of the aiming indicator is determined from the character position of the first game character on the virtual ground;
in response to movement of the touch operation on the graphical user interface, controlling the end position of the aiming indicator to move following the touch operation while keeping the angular velocity of the end position on the virtual ground the same as the angular velocity of the touch operation on the graphical user interface, the end position of the aiming indicator on the virtual ground being determined from the current position coordinate of the touch operation on the graphical user interface or from the target deflection angle of the touch operation relative to the center of the target virtual control.
In an alternative embodiment, the processor controls the end position of the aiming indicator to move following the touch operation, while keeping the angular velocity of the end position on the virtual ground the same as the angular velocity of the touch operation on the graphical user interface, by:
in response to movement of the touch operation on the graphical user interface, acquiring the current position coordinate of the touch operation on the graphical user interface;
correcting the current position coordinate according to the perspective relationship between the graphical user interface and the virtual ground to obtain a corrected position coordinate of the touch operation on the graphical user interface;
and controlling the end position of the aiming indicator to move to the target position coordinate to which the corrected position coordinate is mapped on the virtual ground based on the perspective relationship, wherein the deflection angle of the target position coordinate relative to the character position of the first game character is the same as the target deflection angle.
In an alternative embodiment, when correcting the current position coordinate of the touch operation on the graphical user interface according to the perspective relationship between the graphical user interface and the virtual ground, the processor is configured to:
calculate a first position coordinate to which the current position coordinate is mapped on the virtual ground based on the perspective relationship;
calculate, from the target deflection angle and the first position coordinate, a second position coordinate to which the current position coordinate would be mapped on the virtual ground in the absence of the perspective relationship;
and calculate, based on the perspective relationship, the coordinate to which the second position coordinate is mapped back onto the graphical user interface as the corrected position coordinate.
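The three correction steps above can be sketched numerically. The sketch below is illustrative only, not the claimed implementation: the perspective relationship is reduced to a single vertical compression factor `tilt_factor` (an assumed simplification of a tilted-camera projection), and all names are hypothetical.

```python
import math

def correct_touch_position(touch_xy, control_center, tilt_factor):
    """Correct a raw touch coordinate so that the angle of its ground
    mapping (relative to the character) equals the on-screen deflection
    angle (relative to the control center)."""
    dx = touch_xy[0] - control_center[0]
    dy = touch_xy[1] - control_center[1]
    # target deflection angle of the touch relative to the control center
    theta = math.atan2(dy, dx)
    # step 1: first position coordinate on the virtual ground
    # (perspective removed by undoing the vertical compression)
    gx, gy = dx, dy / tilt_factor
    radius = math.hypot(gx, gy)
    # step 2: second position coordinate, i.e. the ground point at the
    # same radius but at the screen deflection angle (no perspective)
    sx = radius * math.cos(theta)
    sy = radius * math.sin(theta)
    # step 3: map the second coordinate back through the perspective
    # model to obtain the corrected screen coordinate
    return (control_center[0] + sx, control_center[1] + sy * tilt_factor)
```

By construction, the ground mapping of the corrected coordinate keeps both the distance of the uncorrected ground point and the on-screen deflection angle, so screen and ground angular velocities coincide as the touch rotates.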
In an alternative embodiment, the processor controls the end position of the aiming indicator to move following the touch operation, while keeping the angular velocity of the end position on the virtual ground the same as the angular velocity of the touch operation on the graphical user interface, by:
in response to movement of the touch operation on the graphical user interface, acquiring the target deflection angle of the touch operation relative to the center of the target virtual control;
and controlling the end position of the aiming indicator to deflect, on the virtual ground, relative to the character position of the first game character by the target deflection angle.
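This variant can be sketched as follows. The sketch is illustrative only; `indicator_length`, a fixed length of the indicator on the ground, is an assumption not stated in this embodiment, and all names are hypothetical.

```python
import math

def endpoint_from_deflection(character_pos, indicator_length,
                             touch_xy, control_center):
    """Place the indicator end point by rotating around the character by
    the same deflection angle the touch makes around the control center."""
    # target deflection angle of the touch relative to the control center
    theta = math.atan2(touch_xy[1] - control_center[1],
                       touch_xy[0] - control_center[0])
    # deflect the end position around the character by that same angle,
    # so screen and ground angular velocities coincide by construction
    return (character_pos[0] + indicator_length * math.cos(theta),
            character_pos[1] + indicator_length * math.sin(theta))
```

Because the ground angle is set directly to the screen angle, no perspective correction of coordinates is needed in this variant.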
In an alternative embodiment, the processor acquires the current position coordinate of the touch operation on the graphical user interface by:
in response to movement of the touch operation on the graphical user interface, when the target virtual control moves following the touch operation, acquiring the current position coordinate of the target virtual control on the graphical user interface as the current position coordinate of the touch operation on the graphical user interface.
In an alternative embodiment, the processor acquires the current position coordinate of the touch operation on the graphical user interface by:
in response to movement of the touch operation on the graphical user interface, when the position of the target virtual control on the graphical user interface is fixed, directly acquiring the current position coordinate of the touch operation on the graphical user interface.
In an alternative embodiment, the processor is further configured to:
in a non-target aiming mode, in response to movement of the touch operation on the graphical user interface, acquire the current position coordinate of the touch operation on the graphical user interface;
calculate a first position coordinate to which the current position coordinate is mapped on the virtual ground according to the perspective relationship between the graphical user interface and the virtual ground;
and control the end position of the aiming indicator to move to the first position coordinate following the touch operation.
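For contrast, the non-target aiming mode maps the touch point straight through the perspective relationship with no angle correction. A sketch under the same assumed single-factor perspective model (`tilt_factor` and all other names are hypothetical):

```python
import math

def ground_point_non_target(touch_xy, control_center,
                            character_pos, tilt_factor):
    """Non-target mode: map the touch offset directly to the virtual
    ground through the assumed perspective model, without correction."""
    # first position coordinate: undo the assumed vertical compression
    dx = touch_xy[0] - control_center[0]
    dy = (touch_xy[1] - control_center[1]) / tilt_factor
    # the end position simply follows the touch to this mapped point
    return (character_pos[0] + dx, character_pos[1] + dy)
```

Note that with `tilt_factor` below 1 the ground angle of the mapped point differs from the on-screen deflection angle of the touch, which is exactly the deviation between operation feel and visual display that the target aiming mode removes.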
With the computer-readable storage medium provided by the embodiments of the present application, in a target aiming mode, in response to a touch operation on a target virtual control in the graphical user interface, an aiming indicator indicating the skill aiming direction of a first game character is displayed on the virtual ground in the game scene; in response to movement of the touch operation on the graphical user interface, the end position of the aiming indicator is controlled to move following the touch operation, while the angular velocity of the end position on the virtual ground is kept the same as the angular velocity of the touch operation on the graphical user interface. In this way, the angular velocity at which the player rotates the control on the terminal screen stays consistent with the angular velocity at which the aiming indication direction rotates on the virtual ground in the target aiming mode, which eliminates the deviation between the player's operation feel and the visual display and improves the accuracy of the player's control over the aiming indication direction.
In the embodiments of the present application, the computer program on the computer-readable storage medium, when executed by the processor, may also execute other machine-readable instructions to perform the aiming method in the game described in the other embodiments. For the specific implementation of the method steps and their principles, refer to the description of the method embodiments, which is not repeated here.
In the embodiments provided herein, it should be understood that the disclosed systems and methods may be implemented in other ways. The system embodiments described above are merely illustrative; for example, the division into units is merely a logical functional division, and other divisions are possible in an actual implementation: multiple units or components may be combined or integrated into another system, and some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through communication interfaces, systems, or units, and may be electrical, mechanical, or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments provided in the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that like reference numerals and letters denote like items in the figures; once an item is defined in one figure, it need not be further defined or explained in subsequent figures. Furthermore, the terms "first", "second", "third", and the like are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present application, intended to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, a person skilled in the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not depart from the spirit and scope of the corresponding technical solutions and are intended to be encompassed within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An aiming method in a game, characterized in that a graphical user interface is provided through a terminal device, and at least part of a game scene and a first game character are displayed on the graphical user interface; the aiming method comprises:
in a target aiming mode, in response to a touch operation on a target virtual control in the graphical user interface, displaying an aiming indicator indicating a skill aiming direction of the first game character on a virtual ground in the game scene, wherein an origin position of the aiming indicator is determined from a character position of the first game character on the virtual ground;
in response to movement of the touch operation on the graphical user interface, controlling an end position of the aiming indicator to move following the touch operation while keeping an angular velocity of the end position on the virtual ground the same as an angular velocity of the touch operation on the graphical user interface, wherein the end position of the aiming indicator on the virtual ground is determined from a current position coordinate of the touch operation on the graphical user interface or from a target deflection angle of the touch operation relative to a center of the target virtual control.
2. The aiming method according to claim 1, wherein the end position of the aiming indicator is controlled to move following the touch operation, with the angular velocity of the end position on the virtual ground kept the same as the angular velocity of the touch operation on the graphical user interface, by:
in response to movement of the touch operation on the graphical user interface, acquiring the current position coordinate of the touch operation on the graphical user interface;
correcting the current position coordinate according to a perspective relationship between the graphical user interface and the virtual ground to obtain a corrected position coordinate of the touch operation on the graphical user interface;
and controlling the end position of the aiming indicator to move to a target position coordinate to which the corrected position coordinate is mapped on the virtual ground based on the perspective relationship, wherein a deflection angle of the target position coordinate relative to the character position of the first game character is the same as the target deflection angle.
3. The aiming method according to claim 2, wherein correcting the current position coordinate of the touch operation on the graphical user interface according to the perspective relationship between the graphical user interface and the virtual ground comprises:
calculating a first position coordinate to which the current position coordinate is mapped on the virtual ground based on the perspective relationship;
calculating, from the target deflection angle and the first position coordinate, a second position coordinate to which the current position coordinate would be mapped on the virtual ground in the absence of the perspective relationship;
and calculating, based on the perspective relationship, the coordinate to which the second position coordinate is mapped on the graphical user interface as the corrected position coordinate.
4. The aiming method according to claim 1, wherein the end position of the aiming indicator is controlled to move following the touch operation, with the angular velocity of the end position on the virtual ground kept the same as the angular velocity of the touch operation on the graphical user interface, by:
in response to movement of the touch operation on the graphical user interface, acquiring the target deflection angle of the touch operation relative to the center of the target virtual control;
and controlling the end position of the aiming indicator to deflect, on the virtual ground, relative to the character position of the first game character by the target deflection angle.
5. The aiming method according to claim 1, wherein the current position coordinate of the touch operation on the graphical user interface is acquired by:
in response to movement of the touch operation on the graphical user interface, when the target virtual control moves following the touch operation, acquiring the current position coordinate of the target virtual control on the graphical user interface as the current position coordinate of the touch operation on the graphical user interface.
6. The aiming method according to claim 1, wherein the current position coordinate of the touch operation on the graphical user interface is acquired by:
in response to movement of the touch operation on the graphical user interface, when the position of the target virtual control on the graphical user interface is fixed, directly acquiring the current position coordinate of the touch operation on the graphical user interface.
7. The aiming method according to claim 1, further comprising:
in a non-target aiming mode, in response to movement of the touch operation on the graphical user interface, acquiring the current position coordinate of the touch operation on the graphical user interface;
calculating a first position coordinate to which the current position coordinate is mapped on the virtual ground according to the perspective relationship between the graphical user interface and the virtual ground;
and controlling the end position of the aiming indicator to move to the first position coordinate following the touch operation.
8. An aiming device in a game, characterized in that a graphical user interface is provided through a terminal device, and at least part of a game scene and a first game character are displayed on the graphical user interface; the aiming device comprises:
a first response module configured to, in a target aiming mode, in response to a touch operation on a target virtual control in the graphical user interface, display an aiming indicator indicating a skill aiming direction of the first game character on a virtual ground in the game scene, wherein an origin position of the aiming indicator is determined from a character position of the first game character on the virtual ground; and
a second response module configured to, in response to movement of the touch operation on the graphical user interface, control an end position of the aiming indicator to move following the touch operation while keeping an angular velocity of the end position on the virtual ground the same as an angular velocity of the touch operation on the graphical user interface, wherein the end position of the aiming indicator on the virtual ground is determined from a current position coordinate of the touch operation on the graphical user interface or from a target deflection angle of the touch operation relative to a center of the target virtual control.
9. An electronic device, comprising a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the aiming method in a game according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon which, when executed by a processor, performs the steps of the aiming method in a game according to any one of claims 1 to 7.
CN202311657469.8A 2023-12-05 2023-12-05 Aiming method, device, equipment and storage medium in game Pending CN117732060A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311657469.8A CN117732060A (en) 2023-12-05 2023-12-05 Aiming method, device, equipment and storage medium in game

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311657469.8A CN117732060A (en) 2023-12-05 2023-12-05 Aiming method, device, equipment and storage medium in game

Publications (1)

Publication Number Publication Date
CN117732060A true CN117732060A (en) 2024-03-22

Family

ID=90251895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311657469.8A Pending CN117732060A (en) 2023-12-05 2023-12-05 Aiming method, device, equipment and storage medium in game

Country Status (1)

Country Link
CN (1) CN117732060A (en)

Similar Documents

Publication Publication Date Title
JP7256283B2 (en) Information processing method, processing device, electronic device and storage medium
US10500484B2 (en) Information processing method and apparatus, storage medium, and electronic device
US11250641B2 (en) System and methods for mating virtual objects to real-world environments
JP5813948B2 (en) Program and terminal device
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN111228802B (en) Information prompting method and device, storage medium and electronic device
CN110833694B (en) Display control method and device in game
CN108154548A (en) Image rendering method and device
CN109844820B (en) Modifying hand occlusion of holograms based on contextual information
CN107213636B (en) Lens moving method, device, storage medium and processor
JP6291155B2 (en) Program and server
JP2023549753A (en) Mark processing method and device, computer equipment, and computer program
WO2022247204A1 (en) Game display control method, non-volatile storage medium and electronic device
JP2019136358A (en) Game system and program
CN113117332A (en) Method and device for adjusting visual angle of lens, electronic equipment and storage medium
CN111494945A (en) Virtual object processing method and device, storage medium and electronic equipment
CN117732060A (en) Aiming method, device, equipment and storage medium in game
CN114612553B (en) Control method and device for virtual object, computer equipment and storage medium
CN116501209A (en) Editing view angle adjusting method and device, electronic equipment and readable storage medium
CN112473138B (en) Game display control method and device, readable storage medium and electronic equipment
CN113041616B (en) Method, device, electronic equipment and storage medium for controlling skip word display in game
CN114949842A (en) Virtual object switching method and device, storage medium and electronic equipment
JP7287172B2 (en) Display control device, display control method, and program
US20240226744A1 (en) Virtual object control method and apparatus, computer device and storage medium
CN115970287A (en) Virtual object control method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination