CN117899476A - Self-aiming technology detection method and device, storage medium and electronic device

Info

Publication number
CN117899476A
CN117899476A (application number CN202410041887.2A)
Authority
CN
China
Prior art keywords
feature
targeting
aiming
data
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410041887.2A
Other languages
Chinese (zh)
Inventor
夏爽 (Xia Shuang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202410041887.2A priority Critical patent/CN117899476A/en
Publication of CN117899476A publication Critical patent/CN117899476A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a self-aiming technology detection method and device, a storage medium, and an electronic device, relating to the field of computer technology. The method comprises the following steps: acquiring a first position and a second position in a graphical user interface, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of a virtual weapon in the graphical user interface; determining a first aiming feature based on the first position and the second position, wherein the first aiming feature reflects the aiming accuracy of a shooting operation; and detecting the shooting operation based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is employed. The application solves the technical problems in the related art that judging whether a player adopts self-aiming technology in a game through automatic detection yields judgment results of low accuracy at a low detection speed.

Description

Self-aiming technology detection method and device, storage medium and electronic device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and apparatus for detecting a self-aiming technology, a storage medium, and an electronic apparatus.
Background
Automatic aiming (aimbot) technology, self-aiming technology for short, is software used in video games, particularly shooting games, that automatically or semi-automatically assists a player in aiming at enemies, greatly reducing the difficulty of aiming so that the player can hit enemies easily.
At present, because self-aiming technology can realize the self-aiming function without any modification to game memory, anti-cheat inspection relies on automatic detection technology to judge whether a player adopts self-aiming technology in a game, in order to ensure game fairness. However, the judgment results obtained by existing automatic detection technology have low accuracy, and the detection speed is low.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
At least some embodiments of the present application provide a self-aiming technique detection method and apparatus, a storage medium, and an electronic device, so as to at least solve the technical problems in the related art that judging whether a player adopts the self-aiming technique in a game through automatic detection yields judgment results of low accuracy at a low detection speed.
According to one embodiment of the present application, there is provided a self-aiming technique detection method, including: acquiring a first position and a second position in a graphical user interface, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of a virtual weapon in the graphical user interface; determining a first aiming feature based on the first position and the second position, wherein the first aiming feature reflects the aiming accuracy of a shooting operation; and detecting the shooting operation based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is employed.
According to one embodiment of the present application, there is also provided an abnormal operation detection apparatus, including: an acquisition module configured to acquire a first position and a second position in a graphical user interface, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of a virtual weapon in the graphical user interface; a determination module configured to determine a first aiming feature based on the first position and the second position, wherein the first aiming feature reflects the aiming accuracy of a shooting operation; and a detection module configured to detect the shooting operation based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is employed.
According to one embodiment of the present application, there is also provided a computer readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the self-aiming technique detection method of the above embodiment when run on a computer or processor.
According to one embodiment of the present application, there is also provided an electronic device including a memory having a computer program stored therein and a processor configured to run the computer program to perform the self-aiming technique detection method of the above embodiment.
In at least some embodiments of the present application, a first position and a second position in a graphical user interface are acquired, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of a virtual weapon in the graphical user interface; a first aiming feature is determined based on the first position and the second position, wherein the first aiming feature reflects the aiming accuracy of a shooting operation; and the shooting operation is detected based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is employed. This achieves the purpose of automatically detecting whether a shooting operation in a game adopts self-aiming technology (i.e., a self-aiming cheat), effectively improves the detection speed and the accuracy of the detection result, and thereby solves the technical problems in the related art that judging whether a player adopts self-aiming technology in a game through automatic detection yields judgment results of low accuracy at a low detection speed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a block diagram of a hardware architecture of a mobile terminal for a self-aiming technique detection method according to an embodiment of the present application;
FIG. 2 is a flow chart of a self-aiming technique detection method according to one embodiment of the present application;
FIG. 3 is a schematic diagram of a coordinate mapping according to one embodiment of the present application;
FIG. 4 is a schematic diagram of radian calculation according to one embodiment of the application;
FIG. 5 is a schematic view of aiming data according to one embodiment of the present application;
FIG. 6 is a schematic diagram of perspective relationships according to one embodiment of the application;
FIG. 7 is a schematic view of a sight according to one embodiment of the present application;
FIG. 8 is a schematic diagram of game data collection according to one embodiment of the application;
FIG. 9 is a block diagram of a self-aiming technique detection device according to an alternative embodiment of the present application;
Fig. 10 is a schematic diagram of an electronic device according to an embodiment of the application.
Detailed Description
For ease of understanding, some concepts related to the embodiments of the application are described below by way of example for reference:
Pre-embedded points: specific locations preset in a program for inserting additional code or functions. Pre-embedded points are typically placed at critical parts of the program or on important execution paths in order to capture key data or monitor the behavior of the program.
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In one possible implementation in the field of computer technology, in order to detect abnormal operations occurring in a game, such as cheat plug-ins, it is generally determined through automatic detection technology whether a player adopts auto-aiming technology in the game.
Through practice and careful study, the inventor found that this approach still suffers from low accuracy of judgment results and low detection speed. On this basis, the game scene to which the embodiments of the present application apply may involve image processing in games, and a self-aiming technology detection method is provided: a first position and a second position in a graphical user interface are acquired, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of a virtual weapon in the graphical user interface; a first aiming feature is determined based on the first position and the second position, wherein the first aiming feature reflects the aiming accuracy of a shooting operation; and the shooting operation is detected based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is employed. This achieves the purpose of automatically detecting whether a shooting operation in a game adopts self-aiming technology (i.e., a self-aiming cheat), effectively improves the detection speed and the accuracy of the detection result, and thereby solves the technical problems in the related art that judging whether a player adopts self-aiming technology in a game through automatic detection yields judgment results of low accuracy at a low detection speed.
The self-aiming technique detection method in one embodiment of the application can run on a local terminal device or a server. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes the server and a client device.
In an alternative embodiment, various cloud applications may be run under the cloud interaction system, for example, cloud games. Taking cloud games as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the entity that runs the game program is separated from the entity that presents the game picture: the storage and running of the self-aiming technology detection method are completed on a cloud game server, while the function of the client device is to receive and send data and to present the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the cloud game server in the cloud performs the information processing. When playing the game, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses data such as game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game pictures.
In an alternative embodiment, taking a game as an example, the local terminal device stores a game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The manner in which the local terminal device provides the graphical user interface to the player may vary; for example, it may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection.
The local terminal device may be, for example, a mobile terminal, a computer terminal or similar computing device, and may include a display screen for presenting a graphical user interface including game visuals, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
Taking a mobile terminal as an example, the mobile terminal may be a smartphone, a tablet computer, a palmtop computer, a mobile internet device, a PAD, a game console, or another terminal device. Fig. 1 is a block diagram of the hardware structure of a mobile terminal for a self-aiming technique detection method according to an embodiment of the present application. As shown in fig. 1, a mobile terminal may include one or more processors 102 (only one is shown in fig. 1) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106, an input/output device 108, and a display device 110.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
In accordance with one embodiment of the present application, an embodiment of a self-aiming technique detection method is provided. It should be noted that the steps illustrated in the flowchart of the figures may be performed in a computer system executing a set of computer-executable instructions, and that although a logical sequence is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in a different order than that shown here.
In a possible implementation manner, the embodiment of the application provides a self-aiming technology detection method in which a software application is executed on a processor of a user terminal and a graphical user interface is rendered on a display of the user terminal. The content displayed by the graphical user interface at least partially comprises a virtual game scene, and the virtual game scene comprises at least one virtual weapon; the user terminal may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system. Fig. 2 is a flowchart of a self-aiming technique detection method according to one embodiment of the present application. As shown in fig. 2, the method includes the following steps:
Step S20, a first position and a second position in the graphical user interface are acquired, wherein the first position is the position to which a target object in the virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of the virtual weapon in the graphical user interface.
Detecting whether a shooting operation in the game uses self-aiming technology is essentially detecting whether the difficulty and accuracy of the shooting operation match the normal level achievable by a human player. Therefore, when the player manipulates the virtual character to perform a shooting operation, it is necessary to acquire the first position of the target object being shot at and the second position of the virtual weapon's crosshair, so that the shooting operation can be checked for self-aiming based on the first position and the second position.
The target object may be understood as a shooting object of a virtual character manipulated by a player, i.e., an enemy in a virtual game scene. The first position is two-dimensional coordinate information of the target object in the graphical user interface, and it can be understood that the first position is a position where three-dimensional coordinate information of the target object in the virtual game scene is mapped to the graphical user interface.
It should be noted that, since the target object has a certain volume in the virtual game scene and is mapped to occupy a certain area in the graphical user interface, the first position of the target object may be based on the position of the preset portion of the target object, for example, may be based on the head position of the target object or the chest position of the target object, which is not limited herein.
A virtual weapon may be understood as a shooting weapon used by the player-controlled virtual character when performing a shooting operation, i.e., a weapon capable of aimed firing, such as a weapon that fires arrows, pellets, bullets, or the like to injure an enemy; this is not limited here.
The second position may be understood as the position of the virtual weapon's crosshair mark in the graphical user interface, i.e., the position in the game screen used to assist the player in aiming at an enemy. It will be appreciated that the position of the crosshair mark in the graphical user interface corresponds to the position of the virtual weapon's crosshair in the virtual game scene.
According to the embodiment of the application, by acquiring the first position of the enemy being shot at and the second position of the virtual weapon's crosshair mark in the graphical user interface, it can be judged whether a shooting operation in the shooting game is an abnormal operation, i.e., whether a self-aiming technology is adopted.
Step S22, determining a first aiming feature based on the first position and the second position, wherein the first aiming feature is used to reflect an aiming accuracy of the shooting operation.
The first aiming feature may be understood as a feature reflecting the aiming accuracy of a shooting operation, which can be determined from the first position of the target object and the second position of the crosshair mark. For example, the aiming accuracy of the shooting operation may be determined from the distance or the degree of overlap between the first position of the target object and the second position of the crosshair mark on the screen.
The distance between the virtual weapon's crosshair mark and the target object on the screen can be determined from the first position and the second position; the farther the crosshair mark is from the target object, the harder it is to aim, i.e., the lower the shooting accuracy. Therefore, if the hit rate of the shooting operation is still high when the crosshair mark is far from the target object, it can be determined that the aiming accuracy of the shooting operation is high. Likewise, if the hit rate of the shooting operation is still high when the overlap between the crosshair mark and the target object is low, it can be determined that the aiming accuracy of the shooting operation is high.
In the embodiment of the application, the first aiming feature is determined from the first position and the second position, so that the aiming accuracy of the shooting operation can be determined based on the first aiming feature; based on that aiming accuracy, it can then be accurately determined whether the shooting operation matches the operation level of a human player, i.e., whether the shooting operation is produced by a cheat and is therefore abnormal.
In step S24, the shooting operation is detected based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is adopted.
The shooting operation is detected according to the first aiming feature, which is determined from the first position of the enemy on the screen and the second position of the virtual weapon's crosshair mark on the screen, so that it can be accurately determined whether the shooting operation is an abnormal operation, i.e., whether a self-aiming technique is adopted, improving the accuracy and efficiency of anomaly detection.
For example, a target value reflecting the accuracy of the shooting operation may be determined based on the first aiming feature; if the target value is greater than a preset threshold, the shooting operation is determined to be an abnormal operation, i.e., a self-aiming technique is employed. Alternatively, a probability that the shooting operation is abnormal may be determined based on the first aiming feature; if the probability exceeds a preset probability value, the shooting operation is determined to be abnormal, i.e., a self-aiming technique is employed. Neither approach is limited here.
Through the above steps, a first position and a second position in the graphical user interface are acquired, wherein the first position is the position to which a target object in the virtual game scene is mapped in the graphical user interface, and the second position is the position of the crosshair of the virtual weapon in the graphical user interface; a first aiming feature is determined based on the first position and the second position, wherein the first aiming feature reflects the aiming accuracy of the shooting operation; and the shooting operation is detected based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature indicates that a self-aiming technique is employed. This achieves the purpose of automatically detecting whether a shooting operation in a game adopts self-aiming technology (i.e., a self-aiming cheat), effectively improves the detection speed and the accuracy of the detection result, and thereby solves the technical problems in the related art that judging whether a player adopts self-aiming technology in a game through automatic detection yields judgment results of low accuracy at a low detection speed.
In a possible implementation manner, in step S20, acquiring the first position in the graphical user interface may include performing the following steps:
step S200, obtaining a third position of a preset part of a target object in a virtual game scene;
step S202, mapping the third position to the graphical user interface to obtain the first position.
It will be appreciated that since the target object has a certain volume in the virtual game scene and occupies a certain area when mapped into the graphical user interface, the first position of the target object may be based on the position of a preset part of the target object. Illustratively, the preset part may be the head or chest of the target object, which is not limited here.
Because the target object is located in the three-dimensional virtual game scene, acquiring the first position of the preset part of the target object in the two-dimensional graphical user interface requires first acquiring the three-dimensional third position of the preset part in the virtual game scene, then performing coordinate conversion and mapping the three-dimensional coordinates onto the graphical user interface to obtain the two-dimensional first position.
It will be appreciated that since the virtual weapon's crosshair mark is located in the graphical user interface, its two-dimensional coordinates, i.e., the second position, can be acquired directly.
Fig. 3 is a schematic diagram of coordinate mapping according to an embodiment of the present application. As shown in fig. 3, taking the virtual weapon's crosshair as both the coordinate origin of the three-dimensional coordinate system in the virtual game scene and the coordinate origin of the two-dimensional coordinate system in the graphical user interface, XYZ in fig. 3 are the three axes of the three-dimensional coordinate system in the virtual game scene, and XY are the two axes of the two-dimensional coordinate system in the graphical user interface. The third position (x1, y1, z1) of the preset part of the target object in the virtual game scene is mapped into the graphical user interface, yielding the first position (x2, y2) on the graphical user interface.
Illustratively, other positions may be selected for the location of the origin of coordinates, and embodiments of the present application are not limited.
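For illustration, the mapping from a three-dimensional third position to a two-dimensional first position might be sketched as below, assuming a simple pinhole projection with the camera at the crosshair looking along the Z axis; the function name, field of view, and screen width are illustrative assumptions, not values taken from this application.

```python
import math

def world_to_screen(x1, y1, z1, fov_deg=90.0, screen_w=1920.0):
    """Map a camera-space third position (x1, y1, z1) to a first
    position (x2, y2) on the graphical user interface. A minimal
    pinhole-projection sketch; the camera sits at the crosshair and
    looks along +Z, and the FOV/screen width are assumed values."""
    if z1 <= 0:
        return None  # behind the camera, not visible on screen
    focal = (screen_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    x2 = focal * x1 / z1  # horizontal offset from the crosshair
    y2 = focal * y1 / z1  # vertical offset from the crosshair
    return (x2, y2)
```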
In one possible implementation, in step S22, determining the first aiming feature based on the first position and the second position may include performing the following steps:
step S220, determining an aiming distance based on the first position and the second position, wherein the aiming distance is the distance between the target object as mapped into the graphical user interface and the crosshair;
step S222, determining the first aiming feature based on the aiming distance.
The aiming accuracy of the shooting operation can be reflected by the two-dimensional coordinates of the target object on the player's screen in each frame of the game picture: by calculating the distance between the target object and the crosshair mark on the screen, i.e., the aiming distance, the first aiming feature reflecting the aiming accuracy of the shooting operation can be determined based on the aiming distance.
Illustratively, as shown in fig. 3, taking the virtual weapon's crosshair as the coordinate origin of both the three-dimensional coordinate system in the virtual game scene and the two-dimensional coordinate system in the graphical user interface, the first position of the target object in the two-dimensional coordinate system is (x2, y2) and the second position of the crosshair mark is (0, 0). Then, based on (x2, y2) and (0, 0), the distance between the target object and the crosshair mark on the graphical user interface can be determined as √(x2² + y2²), i.e., the aiming distance, so that the first aiming feature can be determined from the aiming distance.
It will be appreciated that the smaller the aiming distance, the higher the aiming accuracy of the shooting operation, and vice versa; and the higher the aiming accuracy of the shooting operation, the more the shooting operation tends toward a self-aiming operation, i.e., the higher the likelihood that a self-aiming technique is used.
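Under the coordinate convention above (crosshair at the origin), the aiming distance might be computed as in the following sketch; the helper name is an assumption for illustration.

```python
import math

def aiming_distance(first_pos, second_pos=(0.0, 0.0)):
    """Distance on the graphical user interface between the mapped
    target position and the crosshair mark; with the crosshair at the
    origin this reduces to sqrt(x2**2 + y2**2)."""
    (xt, yt), (xc, yc) = first_pos, second_pos
    return math.hypot(xt - xc, yt - yc)
```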
In one possible implementation, in step S222, determining the first aiming feature based on the aiming distance may include performing the following steps:
Step S2220, determining a radian feature of the shooting operation based on the first position and the second position, wherein the radian feature is the radian value of the included angle between a first target direction and a preset direction in the graphical user interface, and the first target direction is the direction of the first position relative to the second position in the graphical user interface;
step S2222, determining the first aiming feature based on the aiming distance and the radian feature.
In the embodiment of the application, it is considered that judging the aiming accuracy of the shooting operation only by the two-dimensional distance between the target object and the crosshair mark smooths away the fluctuation in each direction. Therefore, the arctangent of the two-dimensional coordinates also needs to be calculated to determine the radian feature of the shooting operation, and the first aiming feature is determined jointly from the aiming distance and the radian feature, further improving the accuracy with which the first aiming feature is determined.
In the embodiment of the application, the first target direction is a direction in the graphical user interface. It will be appreciated that, taking the second position of the crosshair mark as the origin of the two-dimensional coordinate system in the graphical user interface, the first target direction is the direction of the first position relative to the second position, i.e., the direction from the crosshair mark toward the target object. Correspondingly, taking the first position of the target object as the origin of the two-dimensional coordinate system in the graphical user interface, the first target direction is the direction of the second position relative to the first position, i.e., the direction from the target object toward the crosshair mark.
The preset direction is the horizontal-axis direction of the two-dimensional coordinate system in the graphical user interface, i.e., the X-axis direction. The radian feature of the shooting operation can be understood as the radian value of the included angle between the first target direction and the preset direction, i.e., the included angle between the first target direction and the X axis.
Fig. 4 is a schematic diagram of radian calculation according to an embodiment of the present application. As shown in fig. 4, the included angle G (in radians) between the line from the crosshair mark to the first position (x2, y2) of the target object (i.e., the direction from the crosshair mark to the target object) and the positive half of the X axis of the two-dimensional coordinate system is calculated, yielding a radian feature with a range of [-π, π], so that the fluctuation of the aiming direction of the shooting operation can be reflected based on the radian feature.
It should be noted that, when determining the radian feature of the shooting operation, the radian feature corresponding to each frame of the game picture within a preset time period needs to be determined, so that the fluctuation of the crosshair mark relative to the target object within the preset time period can be determined.
It will be appreciated that if the per-frame radian values within the preset time period fluctuate within a small interval, the aiming fluctuation of the shooting operation is small, and the shooting operation tends more toward a self-aiming operation. Thus, when determining the first aiming feature, the aiming distance and the radian feature of the shooting operation need to be considered together, and the aiming accuracy is comprehensively determined from the aiming distance and the aiming fluctuation of the shooting operation. The higher the determined aiming accuracy, the more the shooting operation tends toward a self-aiming operation.
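A minimal sketch of the radian feature and its per-frame fluctuation interval follows, using the standard atan2 convention for the [-π, π] range; treating the interval width as max minus min (ignoring wraparound at ±π) is a simplifying assumption, not a formula from this application.

```python
import math

def arc_feature(first_pos, second_pos=(0.0, 0.0)):
    """Radian value in [-pi, pi] of the angle between the
    crosshair-to-target direction and the positive X axis."""
    return math.atan2(first_pos[1] - second_pos[1],
                      first_pos[0] - second_pos[0])

def arc_interval_width(arcs):
    """Width of the interval within which the per-frame radian
    features fluctuate; a narrow interval suggests unnaturally steady
    aiming. Wraparound at +/-pi is ignored in this sketch."""
    return max(arcs) - min(arcs)
```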
In one possible implementation, in step S2222, determining the first aiming feature based on the aiming distance and the radian feature may include performing the following steps:
step S22220, determining a third aiming sub-feature based on the aiming distance, wherein the third aiming sub-feature reflects the relationship between the aiming distance and the aiming accuracy of the shooting operation;
step S22222, determining the radian feature interval within which the radian feature lies during a preset time period;
step S22224, determining a fourth aiming sub-feature based on the radian feature interval, wherein the fourth aiming sub-feature reflects the relationship between the radian feature interval and the aiming accuracy of the shooting operation;
step S22226, determining the first aiming feature based on the third aiming sub-feature and the fourth aiming sub-feature.
In embodiments of the present application, when determining the first aiming feature based on the aiming distance and the radian feature, a third aiming sub-feature may be determined based on the aiming distance to reflect the relationship between the aiming distance and the aiming accuracy of the shooting operation. It will be appreciated that the smaller the aiming distance, the higher the aiming accuracy of the shooting operation, and vice versa; and the higher the determined aiming accuracy, the more the shooting operation tends toward a self-aiming operation.
The radian feature corresponding to each frame of the game picture within the preset time period can also be determined, and the fluctuation range of the radian feature, i.e., the radian feature interval, can then be determined from those radian features; this interval reflects the fluctuation of the crosshair mark relative to the target object. A fourth aiming sub-feature is then determined based on the radian feature interval to reflect the relationship between the radian feature interval and the aiming accuracy of the shooting operation. It will be appreciated that if the per-frame radian values fluctuate within a small interval during the preset time period, i.e., the radian feature interval is narrow, the aiming fluctuation of the shooting operation is small, and the shooting operation therefore tends more toward a self-aiming operation.
After the third aiming sub-feature and the fourth aiming sub-feature are determined, the first aiming feature can be determined jointly from them, i.e., the aiming accuracy is comprehensively determined.
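The application does not fix how the two sub-features are aggregated, so the sketch below uses the mean aiming distance and the radian-interval width over a time window as stand-ins for the third and fourth aiming sub-features.

```python
def first_aiming_feature(distances, arcs):
    """Combine the third sub-feature (aiming-distance level) and the
    fourth sub-feature (radian fluctuation range) over a time window.
    The mean and the interval width are illustrative stand-ins; the
    application leaves the exact aggregation unspecified."""
    third_sub = sum(distances) / len(distances)  # distance vs. accuracy
    fourth_sub = max(arcs) - min(arcs)           # aiming fluctuation range
    return third_sub, fourth_sub
```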
In a possible implementation, in step S24, detecting the shooting operation based on the first aiming feature may include performing the following steps:
Step S240, shooting data are acquired, wherein the shooting data are used for describing shooting properties of the virtual weapon;
Step S242, determining a second aiming feature based on the shooting data, wherein the second aiming feature is used for reflecting the difficulty level of the shooting operation;
In step S244, the shooting operation is detected based on the first aiming feature, the second aiming feature and the preset weight value.
The shooting data describe the shooting attributes of the virtual weapon and may be understood as attribute information characterizing the virtual weapon during the shooting operation, including but not limited to aiming information, recoil information, and sighting telescope information of the virtual weapon.
The second aiming feature may be understood as a feature reflecting the difficulty of the shooting operation, which can be determined from the shooting attribute information of the virtual weapon in the shooting data. For example, the difficulty of the shooting operation can be determined from the sighting telescope data in the shooting data: under normal conditions, the higher the scope magnification, the harder it is to aim. Therefore, if the hit rate of the shooting operation is still high when the virtual weapon is equipped with a high-magnification sighting telescope, the shooting operation can be determined from the shooting data to be difficult.
The second aiming feature is determined from the shooting data, so that the difficulty of the shooting operation can be determined based on the second aiming feature; based on that difficulty, it can then be accurately determined whether the shooting operation matches the operation level of a human player, i.e., whether a self-aiming technique is adopted and the shooting operation is abnormal.
The preset weight values include a weight value corresponding to the first aiming feature and a weight value corresponding to the second aiming feature. In the embodiment of the present application, the preset weight values may be set in advance according to experience or determined from big data, which is not limited here.
Detecting the shooting operation based on the first aiming feature, the second aiming feature, and the preset weight values allows the aiming accuracy and the aiming difficulty of the shooting operation to be considered together, so that whether the shooting operation is abnormal can be determined accurately, improving the accuracy and efficiency of anomaly detection.
For example, a target value reflecting the aiming accuracy and aiming difficulty of the shooting operation may be determined based on the first aiming feature, the second aiming feature, and the preset weight values; if the target value is greater than a preset threshold, the shooting operation is determined to be abnormal. Alternatively, a probability that the shooting operation is abnormal may be determined based on the first aiming feature, the second aiming feature, and the preset weight values; if the probability exceeds a preset probability value, the shooting operation is determined to be abnormal. Neither approach is limited here.
In one possible implementation, the shooting data includes aiming data, where the aiming data is the angle value between a second target direction and a view direction in the virtual game scene, the second target direction is the direction of the target object in the virtual game scene relative to the crosshair of the virtual weapon, and the view direction is the view-center direction of the current virtual camera. In step S242, determining the second aiming feature based on the shooting data may include performing the following steps:
step S2420, determining an angular velocity and an angular acceleration of the target object's movement based on the aiming data, wherein the angular velocity reflects the moving speed of the target object and the angular acceleration reflects the acceleration of the target object's movement;
step S2422, determining a first aiming sub-feature of the shooting operation based on the angular velocity and the angular acceleration, wherein the first aiming sub-feature reflects the relationship between the angular velocity and angular acceleration and the difficulty of the shooting operation.
In the embodiment of the application, the second target direction is a direction in the virtual game scene. It will be appreciated that, taking the virtual weapon's crosshair as the origin of the three-dimensional coordinate system, the second target direction may be understood as the direction of the target object relative to the crosshair of the virtual weapon in the virtual game scene; taking the preset part of the target object as the origin of the three-dimensional coordinate system, the second target direction may be understood as the direction of the virtual weapon's crosshair relative to the target object.
It will be appreciated that in a three-dimensional game scene, a virtual camera is used to capture images of the game scene and present them to the player. The view direction refers to the range and direction the virtual camera can see, and is typically determined by the position, orientation, and view angle of the camera. In the embodiment of the application, the view direction is the view-center direction of the current virtual camera; taking the virtual weapon's crosshair as the origin of the three-dimensional coordinate system, the view direction is the direction from the crosshair perpendicular to the graphical user interface.
The shooting data includes aiming data, which may be understood as the angle value between the second target direction and the view direction. Fig. 5 is a schematic view of aiming data according to one embodiment of the present application. As shown in fig. 5, the direction indicated by arrow A extending from the crosshair position is the view direction, i.e., the direction from the virtual weapon's crosshair perpendicular to the graphical user interface; the direction indicated by arrow B extending from the crosshair position is the second target direction, i.e., the direction of the target object relative to the virtual weapon's crosshair. The angle between these two directions in the three-dimensional coordinate system of the virtual game scene is the aiming data. Illustratively, the angle value may be expressed in radians, in degrees, or as a cosine value, which is not limited here.
Since the target object may move in real time during the game, determining its moving speed from the distance it moves between adjacent frames of the game picture would yield wrong results because of the perspective relationship in a three-dimensional game. Illustratively, in a three-dimensional game, objects at different depths that move the same distance in the scene appear to move different distances on the screen, so on-screen movement alone does not reliably reflect scene movement.
Fig. 6 is a schematic diagram of perspective relationships according to an embodiment of the present application. As shown in fig. 6, enemies B and C are at different distances from the crosshair position (which may be regarded as the screen position). In the virtual game scene, if enemy B and enemy C both move the same distance x, the movement of enemy B on the screen is clearly larger than that of enemy C; for enemy C's on-screen movement to match enemy B's, enemy C would have to move a distance x'. It can thus be seen that determining the moving speed of the target object from the distance it moves between adjacent frames of the game picture does not correctly reflect its movement.
Therefore, in the embodiment of the application, the angular velocity of the target object's movement can be determined from the change, between adjacent frames of the game picture, of the included angle between the direction from the crosshair position to the target object and the view direction, i.e., the change of the angle corresponding to the aiming data across adjacent frames; this angular velocity reflects the moving speed of the target object. It will be appreciated that a greater angular velocity indicates a greater moving speed of the target object within the virtual game scene.
It should be noted that the moving speed determined above is an absolute value with no directional component. Moreover, when the moving speed is uniform, the shooting operation is easier to aim, i.e., the aiming difficulty decreases. Therefore, in the embodiment of the application, the angular acceleration also needs to be determined: the rate of change of the angular velocity reflects the acceleration of the target object's movement. The first aiming sub-feature of the shooting operation is thus determined jointly from the angular velocity and the angular acceleration and is used to reflect the difficulty of the shooting operation.
It will be appreciated that the greater the speed and acceleration of movement of the target object, the greater the difficulty of the corresponding firing operation, and conversely, the lesser the speed and acceleration of movement of the target object, the lesser the difficulty of the corresponding firing operation.
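A sketch of deriving both quantities from per-frame aiming data follows; uniform frame timing and the use of magnitudes only are assumptions consistent with the description above.

```python
def angular_kinematics(aim_angles, frame_dt):
    """Per-frame angular velocity and angular acceleration of the
    target's movement, derived from the aiming data (the angle between
    the target direction and the view direction) sampled once per
    frame; frame_dt is the frame interval in seconds."""
    velocities = [abs(b - a) / frame_dt
                  for a, b in zip(aim_angles, aim_angles[1:])]
    accelerations = [abs(w2 - w1) / frame_dt
                     for w1, w2 in zip(velocities, velocities[1:])]
    return velocities, accelerations
```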
In a possible embodiment, the shooting data further include sighting telescope (scope) data representing the magnification of the scope used for the shooting operation, and determining the first aiming sub-feature of the shooting operation based on the angular velocity and the angular acceleration in step S2422 may include the following step: determining the first aiming sub-feature based on the scope data, the angular velocity, and the angular acceleration.
The scope may be understood as a magnifier mounted on the virtual weapon for magnifying the field of view: the view observed through the scope is equivalent to the view observed after the virtual character moves some distance in the view direction, i.e., the starting point of the field of view moves forward, so that the position and movement of the target object can be observed more clearly.
Scopes come in different magnifications, such as 2x, 4x, and 8x; the higher the magnification, the greater the enlargement of the field of view.
Fig. 7 is a schematic view of a scope according to an embodiment of the present application. As shown in fig. 7, the field of view starts from D when the virtual weapon is not equipped with a scope, from E when it is equipped with a 2x scope, and from F when it is equipped with an 8x scope.
It can be seen that the scope magnification used during shooting greatly influences the difficulty of the shooting operation: after the field of view is magnified, aiming at a moving target object requires the crosshair to move a greater distance than when the view is not magnified. It can thus be understood that the higher the scope magnification, the greater the difficulty of the shooting operation.
Therefore, when determining the difficulty of the shooting operation, whether the virtual weapon shoots with a scope also needs to be considered, so that the difficulty of the shooting operation is comprehensively determined based on the aiming data and the scope data, and the first aiming sub-feature is determined.
For example, when determining the difficulty of the shooting operation based on the aiming data and the scope data, the angle value between the second target direction and the view direction can be recalculated by shifting the starting point of the field of view toward the target by (n-1)/n of the target distance when a scope of magnification n is open.
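One reading of that adjustment is sketched below: the starting point of the field of view is moved forward along the view direction by (n-1)/n of the target distance before the angle is recomputed. The exact geometry is not spelled out in the text, so this is an assumption for illustration.

```python
import math

def _norm(v):
    return math.sqrt(sum(c * c for c in v))

def scoped_aiming_angle(target_pos, view_dir, n):
    """Aiming angle recomputed with an n-power scope open. The view
    origin is shifted forward along the view direction by (n - 1) / n
    of the target distance; vectors are 3-tuples with the crosshair at
    the origin. A sketch of one reading of the description."""
    dist = _norm(target_pos)
    unit_view = tuple(c / _norm(view_dir) for c in view_dir)
    step = (n - 1) / n * dist
    origin = tuple(step * c for c in unit_view)  # shifted view start
    shifted = tuple(t - o for t, o in zip(target_pos, origin))
    cos_a = sum(a * b for a, b in zip(shifted, unit_view)) / _norm(shifted)
    return math.acos(max(-1.0, min(1.0, cos_a)))
```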
In one possible implementation, the shooting data further include recoil data, and determining the second aiming feature based on the shooting data in step S242 may further include performing the following steps:
step S2424, determining the shake speed of the crosshair based on the recoil data;
step S2426, determining a second aiming sub-feature of the shooting operation based on the shake speed, wherein the second aiming sub-feature reflects the relationship between the shake speed and the difficulty of the shooting operation;
step S2428, determining the second aiming feature based on the first aiming sub-feature and the second aiming sub-feature.
Recoil data may be understood as muzzle-shake data after the virtual weapon performs a shooting operation, i.e., the shake speed of the virtual weapon's crosshair. It will be appreciated that the greater the recoil, the greater the shake speed and the greater the aiming difficulty, and vice versa.
Because the recoil of the virtual weapon also affects the difficulty of the shooting operation, it needs to be taken into account when determining that difficulty: the second aiming sub-feature of the shooting operation is determined based on the shake speed of the crosshair, so that the difficulty of the shooting operation is comprehensively determined based on the aiming data, the scope data, and the recoil data. That is, the second aiming feature is comprehensively determined based on the first aiming sub-feature and the second aiming sub-feature.
It will be appreciated that when the aiming accuracy of shooting operations is the same, the higher the aiming difficulty, the more the shooting operation tends toward a self-aiming operation, i.e., an abnormal operation.
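How the sub-features are fused into the second aiming feature is left open by the text; the sketch below simply averages each signal, which is an assumption for illustration.

```python
def second_aiming_feature(velocities, accelerations, shake_speeds):
    """Difficulty-of-aiming feature combining the first aiming
    sub-feature (target angular velocity/acceleration, scope-adjusted
    upstream) with the second (crosshair shake speed from recoil).
    Plain means are illustrative; the application does not specify
    the aggregation."""
    avg_vel = sum(velocities) / len(velocities)
    avg_acc = sum(accelerations) / max(len(accelerations), 1)
    first_sub = avg_vel + avg_acc  # target-movement difficulty signal
    second_sub = sum(shake_speeds) / len(shake_speeds)  # recoil signal
    return first_sub, second_sub
```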
In a possible implementation manner, in step S244, detecting the shooting operation based on the first aiming feature, the second aiming feature, and the preset weight values may include performing the following steps:
step S2440, determining the probability that the shooting operation satisfies the target feature based on the first aiming feature, the second aiming feature, and the preset weight values;
step S2442, in response to the probability being greater than the preset probability, determining that the shooting operation satisfies the target feature.
Based on the first aiming feature, the second aiming feature, and the preset weight values, the probability that the shooting operation satisfies the target feature can be determined, and whether the shooting operation satisfies the target feature, i.e., whether a self-aiming technique is adopted, is then determined from this probability. If the probability is greater than the preset probability, the shooting operation is determined to adopt the self-aiming technique; otherwise, it does not.
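A weighted-fusion sketch of this decision follows; the weights, the threshold, and the assumption that both features are normalized to [0, 1] are illustrative choices, not values from this application.

```python
def detect_self_aiming(accuracy_feat, difficulty_feat,
                       w_accuracy=0.6, w_difficulty=0.4,
                       preset_probability=0.9):
    """Fuse the first (accuracy) and second (difficulty) aiming
    features into a probability that the shooting operation satisfies
    the target feature, then compare against the preset probability."""
    prob = w_accuracy * accuracy_feat + w_difficulty * difficulty_feat
    return prob > preset_probability, prob
```

For example, detect_self_aiming(0.95, 0.9) returns (True, 0.93): the weighted score 0.6 × 0.95 + 0.4 × 0.9 = 0.93 exceeds the preset probability of 0.9.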
In one possible embodiment, the method may further comprise the following execution steps:
Step S260, shooting operation data are acquired, wherein the shooting operation data are used for controlling the virtual weapon to be in a shooting state;
Step S262, determining a third aiming feature based on the shooting operation data, wherein the third aiming feature reflects the aiming situation of any shooting operation during its corresponding operation duration;
step S264, determining an abnormal operation based on the third aiming feature.
Taking a typical trigger-fired weapon (i.e., a weapon fired by pulling a trigger) as an example, the shooting state of the virtual weapon includes a firing state, which may be understood as the state in which the trigger is pulled, and a non-firing state, which may be understood as the state in which the trigger is not pulled. It will be appreciated that different virtual weapons may have different firing modes, which are not limited here.
The shooting operation data may be understood as the press state of the shooting key controlled by the player, where the shooting key may be a physical key on a computer keyboard, a virtual control on a mobile phone, or a physical button on a gamepad, without limitation. It will be appreciated that the player controls the virtual character to perform a shooting operation with the virtual weapon by pressing the shooting key, i.e., the shooting operation data can be used to control the virtual weapon to be in the shooting state.
Because a player may press several keys at the same time during the game while only one key is bound to the cheat, the aiming situation corresponding to each key's press interval needs to be determined separately. That is, the in-game aiming situation during the interval from when the player's finger presses a key until the finger leaves the key must be determined, so that whether the operation triggered by that key is abnormal can be judged from the aiming situation, and the cheat key can thereby be identified.
A third aiming feature reflecting the aiming situation of any shooting operation during its corresponding operation duration is determined based on the shooting operation data, so that whether the shooting operation corresponding to the shooting operation data is abnormal can be determined based on the third aiming feature, and whether the key producing the shooting operation data is a cheat key can be further determined.
It will be appreciated that the aiming situation may be determined based on the difficulty and aiming accuracy of shooting operations described above. By evaluating the shooting operation data produced by each key, the key with the highest suspicion value is determined to be the cheat key, and the operations triggered by that key are abnormal operations.
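Grouping aiming samples by the key whose press interval covers them might look like the sketch below; the record formats are hypothetical assumptions.

```python
def aiming_samples_by_key(press_events, samples):
    """Group per-frame aiming samples by the firing key whose press
    interval covers them, so each key can be scored separately and the
    most suspicious key identified. press_events is a list of
    (key_id, press_time, release_time) tuples; samples is a list of
    dicts with a timestamp under "t". Both formats are assumptions."""
    per_key = {}
    for key_id, press_t, release_t in press_events:
        bucket = per_key.setdefault(key_id, [])
        bucket.extend(s for s in samples
                      if press_t <= s["t"] <= release_t)
    return per_key
```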
In a possible embodiment, in step S20, acquiring the first position and the second position, and in step S240, acquiring the shooting data may include performing the steps of:
Step S280, acquiring the first position, the second position and the shooting data based on at least one embedded point set in the game client, wherein different embedded points are used for acquiring different types of data;
Step S282, checking the first position, the second position and the shooting data to obtain the checked first position, second position and shooting data.
When acquiring the game data of the player, pre-embedded points can be set at the game client. A plurality of embedded points can be arranged, with different types of game data corresponding to different embedded points. Code for acquiring game data is inserted at the pre-embedded points preset in the game client, so that the game data generated while the player plays, including but not limited to the shooting data and the position data, can be acquired in real time.
It can be understood that part of the game data can be obtained by adding embedded points in the client, while another part is data necessarily generated when the game program interacts with the client, which can be obtained directly without separately setting additional embedded points.
After the game data is obtained from the client, it may be uploaded to the server; the server integrates and verifies the game data and eliminates abnormal values and null values therein, so as to determine the final data, namely the first position and the second position obtained in step S20 and the shooting data obtained in step S240.
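As one illustration of such server-side verification, the sketch below drops null values and then discards 3-sigma outliers; the embodiment fixes no particular criterion, so the rule here is an assumption:

```python
from statistics import mean, stdev

def verify_values(raw_values, z_max=3.0):
    """Discard null values, then discard values farther than z_max standard
    deviations from the mean; the 3-sigma rule is a common heuristic and an
    assumption of this sketch."""
    values = [v for v in raw_values if v is not None]
    if len(values) < 2:
        return values
    mu, sigma = mean(values), stdev(values)
    if sigma == 0.0:
        return values
    return [v for v in values if abs(v - mu) <= z_max * sigma]
```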
FIG. 8 is a schematic diagram of game data collection according to one embodiment of the present application. As shown in FIG. 8, game data of the player, including but not limited to shooting data and position data, may be acquired in real time from the moment the player starts a game until the game ends. The server receives the game data sent by the client in real time and, after the game ends, integrates and verifies the received game data, so as to determine the final data, namely the shooting data and the first position.
The self-aiming technique detection method provided by the embodiments of the present application may also detect whether a shooting operation adopts a self-aiming technique by training a machine learning model; the model may be a supervised, unsupervised or semi-supervised model, which is not limited by the embodiments of the present application.
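As a hedged illustration of the supervised variant only, the toy sketch below fits scikit-learn's LogisticRegression on invented feature rows; the feature layout, labels and all values are assumptions, since the embodiment does not fix a model:

```python
from sklearn.linear_model import LogisticRegression

# Toy training set: rows are [first_aiming_feature, second_aiming_feature];
# labels are 1 for confirmed self-aiming and 0 for normal play (invented).
X_train = [[0.95, 0.90], [0.90, 0.80], [0.20, 0.30], [0.15, 0.25]]
y_train = [1, 1, 0, 0]

model = LogisticRegression().fit(X_train, y_train)
probability = model.predict_proba([[0.85, 0.75]])[0][1]
print(f"estimated probability of self-aiming: {probability:.2f}")
```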
From the description of the above embodiments, it will be clear to those skilled in the art that the method according to the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, though in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) and comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods according to the embodiments of the present application.
This embodiment also provides a self-aiming technique detection device, which is used to implement the above embodiments and preferred implementations; what has already been described will not be repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 9 is a block diagram of a self-aiming technique detection device according to one embodiment of the present application. As shown in Fig. 9, taking the self-aiming technique detection device 900 as an example, a software application is executed on a processor of a user terminal and a graphical user interface is rendered on a display of the user terminal, the content displayed by the graphical user interface at least partially including a virtual game scene that contains at least one virtual weapon. The self-aiming technique detection device 900 includes: an obtaining module 901, configured to obtain a first position and a second position in the graphical user interface, where the first position is the position in the graphical user interface to which a target object in the virtual game scene is mapped, and the second position is the position of the quasi-center (crosshair) of the virtual weapon in the graphical user interface; a determination module 902, configured to determine a first aiming feature based on the first position and the second position, where the first aiming feature is used to reflect the aiming accuracy of a shooting operation; and a detection module 903, configured to detect the shooting operation based on the first aiming feature to determine whether the shooting operation satisfies a target feature, where the target feature indicates that a self-aiming technique is adopted.
Optionally, the determining module 902 is further configured to: determine an aiming distance based on the first position and the second position, wherein the aiming distance is the distance between the target object mapped into the graphical user interface and the quasi-center; and determine the first aiming feature based on the aiming distance.
Optionally, the determining module 902 is further configured to: determine a radian feature of the shooting operation based on the first position and the second position, wherein the radian feature is the radian value of the included angle between a first target direction and a preset direction in the graphical user interface, the first target direction being the direction of the first position relative to the second position in the graphical user interface; and determine the first aiming feature based on the aiming distance and the radian feature.
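For illustration, the sketch below computes both geometric quantities from the two positions; the function and parameter names are assumptions of this sketch:

```python
import math

def geometric_features(first_pos, second_pos, preset_dir=(1.0, 0.0)):
    """first_pos: target object mapped into the GUI; second_pos: quasi-center.
    Returns the aiming distance and the radian feature, i.e. the angle in
    radians between the first target direction (first position relative to
    the second position) and the preset direction."""
    dx, dy = first_pos[0] - second_pos[0], first_pos[1] - second_pos[1]
    aiming_distance = math.hypot(dx, dy)
    angle = math.atan2(dy, dx) - math.atan2(preset_dir[1], preset_dir[0])
    radian_feature = angle % (2.0 * math.pi)  # normalize to [0, 2*pi)
    return aiming_distance, radian_feature
```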
Optionally, the detection module 903 is further configured to: acquire shooting data, wherein the shooting data is used to describe the shooting properties of the virtual weapon; determine a second aiming feature based on the shooting data, wherein the second aiming feature is used to reflect the difficulty level of the shooting operation; and detect the shooting operation based on the first aiming feature, the second aiming feature and a preset weight value.
Optionally, the shooting data includes aiming data, where the aiming data is the angle value between a second target direction in the virtual game scene and a view direction, the second target direction being the direction of the target object in the virtual game scene relative to the quasi-center of the virtual weapon in the virtual game scene, and the view direction being the view-center direction of the current virtual camera. The detection module 903 is further configured to: determine an angular velocity and an angular acceleration of the target object's movement based on the aiming data, wherein the angular velocity is used to reflect the movement velocity of the target object and the angular acceleration is used to reflect the movement acceleration of the target object; and determine a first aiming sub-feature of the shooting operation based on the angular velocity and the angular acceleration, wherein the first aiming sub-feature is used to reflect the relationship between the angular velocity and the angular acceleration and the difficulty level of the shooting operation.
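A minimal finite-difference sketch of this derivation, assuming the aiming angles are sampled at a fixed interval (the sampling scheme is not fixed by the embodiment):

```python
def angular_kinematics(aiming_angles, dt):
    """aiming_angles: angle samples (radians) between the second target
    direction and the view direction, taken every dt seconds. First
    differences approximate the angular velocity of the target's motion,
    second differences its angular acceleration."""
    velocity = [(b - a) / dt for a, b in zip(aiming_angles, aiming_angles[1:])]
    acceleration = [(b - a) / dt for a, b in zip(velocity, velocity[1:])]
    return velocity, acceleration
```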
Optionally, the shooting data further comprises scope data, the scope data being used to represent the multiple of the scope used for the shooting operation, and the detection module 903 is further configured to: determine the first aiming sub-feature based on the scope data, the angular velocity and the angular acceleration.
Optionally, the shooting data further comprises recoil data, and the detection module 903 is further configured to: determine the shaking speed of the quasi-center based on the recoil data; determine a second aiming sub-feature of the shooting operation based on the shaking speed, wherein the second aiming sub-feature is used to reflect the relationship between the shaking speed and the difficulty level of the shooting operation; and determine the second aiming feature based on the first aiming sub-feature and the second aiming sub-feature.
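One illustrative way to combine these quantities into the second aiming feature; the scaling by the scope multiple and the equal sub-weights are assumptions of this sketch:

```python
def second_aiming_feature(angular_velocity, angular_acceleration,
                          scope_multiple, shake_speed,
                          sub_weights=(0.5, 0.5)):
    """First sub-feature: difficulty from target motion, scaled by the scope
    multiple (higher magnification makes tracking harder). Second sub-feature:
    difficulty from the quasi-center shake caused by recoil."""
    first_sub = (abs(angular_velocity) + abs(angular_acceleration)) * scope_multiple
    second_sub = shake_speed
    return sub_weights[0] * first_sub + sub_weights[1] * second_sub
```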
Optionally, the obtaining module 901 is further configured to: acquire a third position of a preset part of the target object in the virtual game scene; and map the third position to the graphical user interface to obtain the first position.
Optionally, the detection module 903 is further configured to: determine a third aiming sub-feature based on the aiming distance, wherein the third aiming sub-feature is used to reflect the relationship between the aiming distance and the aiming accuracy of the shooting operation; determine the radian feature interval in which the radian feature falls within a preset time period; determine a fourth aiming sub-feature based on the radian feature interval, wherein the fourth aiming sub-feature is used to reflect the relationship between the radian feature interval and the aiming accuracy of the shooting operation; and determine the first aiming feature based on the third aiming sub-feature and the fourth aiming sub-feature.
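A hedged sketch of this accuracy-side combination; every formula below, including the interval binning, is an illustrative assumption:

```python
import math

def first_aiming_feature(aiming_distance, radian_history, bins=8,
                         sub_weights=(0.5, 0.5)):
    """Third sub-feature: accuracy weight decays as the aiming distance grows.
    Fourth sub-feature: divide [0, 2*pi) into equal radian intervals and count
    how many intervals the radian features of the preset time period fall
    into; occupying few intervals suggests unnaturally stable tracking."""
    third_sub = 1.0 / (1.0 + aiming_distance)
    width = 2.0 * math.pi / bins
    occupied = {int(r // width) % bins for r in radian_history}
    fourth_sub = 1.0 - len(occupied) / bins
    return sub_weights[0] * third_sub + sub_weights[1] * fourth_sub
```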
Optionally, the device further comprises a key module, configured to: acquire shooting operation data, wherein the shooting operation data is used to control the virtual weapon to be in a shooting state; determine a third aiming feature based on the shooting operation data, wherein the third aiming feature is used to reflect the aiming condition of any shooting operation within its corresponding operation duration; and determine an abnormal operation based on the third aiming feature.
Optionally, the device further comprises a verification module, configured to: acquire the first position, the second position and the shooting data based on at least one embedded point set in the game client, wherein different embedded points are used for acquiring different types of data; and check the first position, the second position and the shooting data to obtain the checked first position, second position and shooting data.
Optionally, the detection module 903 is further configured to: determine the probability that the shooting operation satisfies the target feature based on the first aiming feature, the second aiming feature and the preset weight value; and determine that the shooting operation satisfies the target feature in response to the probability being greater than the preset probability.
It should be noted that each of the above modules may be implemented by software or hardware; for the latter, the following implementations are possible but not limiting: the modules are all located in the same processor, or the modules are located in different processors in any combination.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing a computer program.
Alternatively, in this embodiment, the above-mentioned computer-readable storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for performing the steps of:
Step S20, acquiring a first position and a second position in the graphical user interface, wherein the first position is the position in the graphical user interface to which a target object in the virtual game scene is mapped, and the second position is the position in the graphical user interface where the quasi-center of the virtual weapon is located;
step S22, determining a first aiming feature based on the first position and the second position, wherein the first aiming feature is used for reflecting aiming accuracy of shooting operation;
In step S24, a shooting operation is detected based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature is used for representing that the self-aiming technique is adopted.
Optionally, the above computer-readable storage medium is further configured to store program code for performing the following steps: determining an aiming distance based on the first position and the second position, wherein the aiming distance is the distance between the target object mapped into the graphical user interface and the quasi-center; and determining the first aiming feature based on the aiming distance.
Optionally, the above computer-readable storage medium is further configured to store program code for performing the following steps: determining a radian feature of the shooting operation based on the first position and the second position, wherein the radian feature is the radian value of the included angle between a first target direction and a preset direction in the graphical user interface, the first target direction being the direction of the first position relative to the second position in the graphical user interface; and determining the first aiming feature based on the aiming distance and the radian feature.
Optionally, the above computer-readable storage medium is further configured to store program code for performing the following steps: acquiring shooting data, wherein the shooting data is used to describe the shooting properties of the virtual weapon; determining a second aiming feature based on the shooting data, wherein the second aiming feature is used to reflect the difficulty level of the shooting operation; and detecting the shooting operation based on the first aiming feature, the second aiming feature and a preset weight value.
Optionally, the shooting data comprises aiming data, the aiming data being the angle value between a second target direction in the virtual game scene and a view direction, the second target direction being the direction of the target object in the virtual game scene relative to the quasi-center of the virtual weapon in the virtual game scene, and the view direction being the view-center direction of the current virtual camera; the computer-readable storage medium is further arranged to store program code for performing the following steps: determining an angular velocity and an angular acceleration of the movement of the target object based on the aiming data, wherein the angular velocity is used to reflect the movement velocity of the target object and the angular acceleration is used to reflect the movement acceleration of the target object; and determining a first aiming sub-feature of the shooting operation based on the angular velocity and the angular acceleration, wherein the first aiming sub-feature is used to reflect the relationship between the angular velocity and the angular acceleration and the difficulty level of the shooting operation.
Optionally, the shooting data further comprises scope data for representing the multiple of the scope employed for the shooting operation; the computer-readable storage medium is further arranged to store program code for performing the following step: determining the first aiming sub-feature based on the scope data, the angular velocity and the angular acceleration.
Optionally, the shooting data further comprises recoil data; the computer-readable storage medium is further arranged to store program code for performing the following steps: determining the shaking speed of the quasi-center based on the recoil data; determining a second aiming sub-feature of the shooting operation based on the shaking speed, wherein the second aiming sub-feature is used to reflect the relationship between the shaking speed and the difficulty level of the shooting operation; and determining the second aiming feature based on the first aiming sub-feature and the second aiming sub-feature.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: acquiring a third position of a preset part of a target object in the virtual game scene; mapping the third position to the graphical user interface to obtain the first position.
Optionally, the above computer-readable storage medium is further configured to store program code for performing the following steps: determining a third aiming sub-feature based on the aiming distance, wherein the third aiming sub-feature is used to reflect the relationship between the aiming distance and the aiming accuracy of the shooting operation; determining the radian feature interval in which the radian feature falls within a preset time period; determining a fourth aiming sub-feature based on the radian feature interval, wherein the fourth aiming sub-feature is used to reflect the relationship between the radian feature interval and the aiming accuracy of the shooting operation; and determining the first aiming feature based on the third aiming sub-feature and the fourth aiming sub-feature.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: acquiring shooting operation data, wherein the shooting operation data is used for controlling the virtual weapon to be in a shooting state; determining a third aiming feature based on the shooting operation data, wherein the third aiming feature is used for reflecting the aiming condition of any shooting operation in the corresponding operation duration; an abnormal operation is determined based on the third targeting feature.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: acquiring a first position, a second position and shooting data based on at least one embedded point arranged in a game client, wherein different embedded points are used for acquiring different types of data; and checking the first position, the second position and the shooting data to obtain the checked first position, second position and shooting data.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a probability that the shooting operation meets the target feature based on the first targeting feature, the second targeting feature, and a preset weight value; in response to the probability being greater than the preset probability, it is determined that the firing operation satisfies the target feature.
In the computer-readable storage medium of this embodiment, a technical solution for self-aiming technique detection is provided: a first position and a second position in a graphical user interface are acquired, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the quasi-center of a virtual weapon in the graphical user interface; a first aiming feature is determined based on the first position and the second position, wherein the first aiming feature is used to reflect the aiming accuracy of a shooting operation; and the shooting operation is detected based on the first aiming feature to determine whether it satisfies a target feature, wherein the target feature indicates that a self-aiming technique is adopted. This achieves the purpose of automatically detecting whether a shooting operation in a game adopts a self-aiming technique (i.e. a self-aiming cheat plug-in), effectively improves the detection speed and the accuracy of the detection result, and solves the technical problems in the related art of low accuracy of the judgment result and low detection speed when judging whether a player adopts a self-aiming technique in a game.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a computer readable storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, a computer-readable storage medium stores thereon a program product capable of implementing the method described above in this embodiment. In some possible implementations, the various aspects of the embodiments of the application may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the application as described in the "exemplary methods" section of this embodiment, when the program product is run on the terminal device.
A program product for implementing the above-mentioned method according to an embodiment of the present application may employ a portable compact disc read only memory (CD-ROM) and comprise program code and may be run on a terminal device, such as a personal computer. However, the program product of the embodiments of the present application is not limited thereto, and in the embodiments of the present application, the computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Any combination of one or more computer-readable media may be employed by the program product described above. The computer-readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM) or flash memory, an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer-readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
An embodiment of the application also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
Step S20, acquiring a first position and a second position in the graphical user interface, wherein the first position is the position in the graphical user interface to which a target object in the virtual game scene is mapped, and the second position is the position in the graphical user interface where the quasi-center of the virtual weapon is located;
step S22, determining a first aiming feature based on the first position and the second position, wherein the first aiming feature is used for reflecting aiming accuracy of shooting operation;
In step S24, a shooting operation is detected based on the first aiming feature to determine whether the shooting operation satisfies a target feature, wherein the target feature is used for representing that the self-aiming technique is adopted.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining an aiming distance based on the first position and the second position, wherein the aiming distance is the distance between the target object mapped into the graphical user interface and the quasi-center; and determining the first aiming feature based on the aiming distance.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a radian feature of the shooting operation based on the first position and the second position, wherein the radian feature is the radian value of the included angle between a first target direction and a preset direction in the graphical user interface, the first target direction being the direction of the first position relative to the second position in the graphical user interface; and determining the first aiming feature based on the aiming distance and the radian feature.
Optionally, the above processor may be further configured to perform the following steps by a computer program: acquiring shooting data, wherein the shooting data is used to describe the shooting properties of the virtual weapon; determining a second aiming feature based on the shooting data, wherein the second aiming feature is used to reflect the difficulty level of the shooting operation; and detecting the shooting operation based on the first aiming feature, the second aiming feature and a preset weight value.
Optionally, the shooting data comprises aiming data, the aiming data being the angle value between a second target direction in the virtual game scene and a view direction, the second target direction being the direction of the target object in the virtual game scene relative to the quasi-center of the virtual weapon in the virtual game scene, and the view direction being the view-center direction of the current virtual camera; the processor is further configured to execute the following steps by the computer program: determining an angular velocity and an angular acceleration of the movement of the target object based on the aiming data, wherein the angular velocity is used to reflect the movement velocity of the target object and the angular acceleration is used to reflect the movement acceleration of the target object; and determining a first aiming sub-feature of the shooting operation based on the angular velocity and the angular acceleration, wherein the first aiming sub-feature is used to reflect the relationship between the angular velocity and the angular acceleration and the difficulty level of the shooting operation.
Optionally, the shooting data further comprises scope data for representing the multiple of the scope employed for the shooting operation; the processor is further arranged to perform the following step by the computer program: determining the first aiming sub-feature based on the scope data, the angular velocity and the angular acceleration.
Optionally, the shooting data further comprises recoil data; the processor may be further arranged to perform the following steps by the computer program: determining the shaking speed of the quasi-center based on the recoil data; determining a second aiming sub-feature of the shooting operation based on the shaking speed, wherein the second aiming sub-feature is used to reflect the relationship between the shaking speed and the difficulty level of the shooting operation; and determining the second aiming feature based on the first aiming sub-feature and the second aiming sub-feature.
Optionally, the above processor may be further configured to perform the following steps by a computer program: acquiring a third position of a preset part of a target object in the virtual game scene; mapping the third position to the graphical user interface to obtain the first position.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a third aiming sub-feature based on the aiming distance, wherein the third aiming sub-feature is used to reflect the relationship between the aiming distance and the aiming accuracy of the shooting operation; determining the radian feature interval in which the radian feature falls within a preset time period; determining a fourth aiming sub-feature based on the radian feature interval, wherein the fourth aiming sub-feature is used to reflect the relationship between the radian feature interval and the aiming accuracy of the shooting operation; and determining the first aiming feature based on the third aiming sub-feature and the fourth aiming sub-feature.
Optionally, the above processor may be further configured to perform the following steps by a computer program: acquiring shooting operation data, wherein the shooting operation data is used for controlling the virtual weapon to be in a shooting state; determining a third aiming feature based on the shooting operation data, wherein the third aiming feature is used for reflecting the aiming condition of any shooting operation in the corresponding operation duration; an abnormal operation is determined based on the third targeting feature.
Optionally, the above processor may be further configured to perform the following steps by a computer program: acquiring a first position, a second position and shooting data based on at least one embedded point arranged in a game client, wherein different embedded points are used for acquiring different types of data; and checking the first position, the second position and the shooting data to obtain the checked first position, second position and shooting data.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a probability that the shooting operation meets the target feature based on the first targeting feature, the second targeting feature, and a preset weight value; in response to the probability being greater than the preset probability, it is determined that the firing operation satisfies the target feature.
In the electronic device of this embodiment, a technical solution for self-aiming technique detection is provided: a first position and a second position in a graphical user interface are acquired, wherein the first position is the position to which a target object in a virtual game scene is mapped in the graphical user interface, and the second position is the position of the quasi-center of a virtual weapon in the graphical user interface; a first aiming feature is determined based on the first position and the second position, wherein the first aiming feature is used to reflect the aiming accuracy of a shooting operation; and the shooting operation is detected based on the first aiming feature to determine whether it satisfies a target feature, wherein the target feature indicates that a self-aiming technique is adopted. This achieves the purpose of automatically detecting whether a shooting operation in a game adopts a self-aiming technique (i.e. a self-aiming cheat plug-in), effectively improves the detection speed and the accuracy of the detection result, and solves the technical problems in the related art of low accuracy of the judgment result and low detection speed when judging whether a player adopts a self-aiming technique in a game.
Fig. 10 is a schematic diagram of an electronic device according to an embodiment of the application. As shown in fig. 10, the electronic device 1000 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present application.
As shown in fig. 10, the electronic apparatus 1000 is embodied in the form of a general purpose computing device. Components of electronic device 1000 may include, but are not limited to: the at least one processor 1010, the at least one memory 1020, a bus 1030 connecting the various system components including the memory 1020 and the processor 1010, and a display 1040.
Wherein the memory 1020 stores program code that can be executed by the processor 1010 to cause the processor 1010 to perform the steps according to various exemplary embodiments of the present application described in the above method section of the embodiment of the present application.
Memory 1020 may include readable media in the form of volatile memory units such as Random Access Memory (RAM) 10201 and/or cache memory 10202, and may further include Read Only Memory (ROM) 10203, and may also include non-volatile memory such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
In some examples, memory 1020 may also include a program/utility 10204 having a set (at least one) of program modules 10205, such program modules 10205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Memory 1020 may further include memory located remotely from processor 1010, which may be connected to electronic device 1000 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Bus 1030 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The display 1040 may be, for example, a touch-screen liquid crystal display (Liquid Crystal Display, LCD) that enables a user to interact with the user interface of the electronic device 1000.
Optionally, the electronic apparatus 1000 may also communicate with one or more external devices 1100 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic apparatus 1000, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic apparatus 1000 to communicate with one or more other computing devices. Such communication may occur through an Input/Output (I/O) interface 1050. Also, the electronic apparatus 1000 can communicate with one or more networks such as a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN), and/or a public network such as the Internet through the network adapter 1060. As shown in FIG. 10, the network adapter 1060 communicates with other modules of the electronic apparatus 1000 over the bus 1030. It should be appreciated that although not shown in FIG. 10, other hardware and/or software modules may be used in connection with the electronic apparatus 1000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Array of Independent Disks (RAID) systems, tape drives, data backup storage systems, and the like.
The electronic device 1000 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in FIG. 10 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the electronic device 1000 may include more or fewer components than shown in FIG. 10, or have a different configuration than shown in FIG. 10. The memory 1020 may be used to store a computer program and corresponding data, such as a computer program and data corresponding to the self-aiming technique detection method in an embodiment of the present application. The processor 1010 executes the computer program stored in the memory 1020 to perform various functional applications and data processing, i.e., to implement the self-aiming technique detection method described above.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present application, and such modifications and adaptations are also intended to fall within the protection scope of the present application.

Claims (15)

1. A method of self-aiming technique detection, the method comprising:
Acquiring a first position and a second position in a graphical user interface, wherein the first position is the position in the graphical user interface to which a target object in a virtual game scene is mapped, and the second position is the position in the graphical user interface where the quasi-center of a virtual weapon is located;
Determining a first targeting feature based on the first location and the second location, wherein the first targeting feature is to reflect a targeting accuracy of a firing operation;
The firing operation is detected based on the first targeting feature to determine whether the firing operation satisfies a target feature, wherein the target feature is indicative of employing a self-aiming technique.
2. The method of claim 1, wherein the determining a first targeting feature based on the first location and the second location comprises:
Determining an aiming distance based on the first position and the second position, wherein the aiming distance is a distance between a target object mapped into the graphical user interface and a quasi center;
The first targeting feature is determined based on the aiming distance.
3. The method of claim 2, wherein the determining the first targeting feature based on the targeting distance comprises:
Determining a radian feature of the shooting operation based on the first position and the second position, wherein the radian feature is the radian value of an included angle between a first target direction and a preset direction in the graphical user interface, and the first target direction is a direction of the first position relative to the second position in the graphical user interface;
The first targeting feature is determined based on the aiming distance and the radian feature.
4. A method according to any of claims 1-3, wherein the detecting the firing operation based on a first targeting feature comprises:
acquiring shooting data, wherein the shooting data is used for describing shooting properties of the virtual weapon;
Determining a second targeting feature based on the shooting data, wherein the second targeting feature is used to reflect a difficulty level of the firing operation;
The firing operation is detected based on the first targeting feature, the second targeting feature, and a preset weight value.
5. The method of claim 4, wherein the shooting data includes aiming data, the aiming data being an angle value between a second target direction in the virtual game scene and a field-of-view direction, the second target direction being the direction of the target object in the virtual game scene relative to the quasi-center of a virtual weapon in the virtual game scene, and the field-of-view direction being the view-center direction of the current virtual camera, the determining a second targeting feature based on the shooting data comprising:
Determining an angular velocity and an angular acceleration of the target object movement based on the aiming data, wherein the angular velocity is used for reflecting the movement velocity of the target object, and the angular acceleration is used for reflecting the movement acceleration of the target object;
A first aiming sub-feature of the firing operation is determined based on the angular velocity and the angular acceleration, wherein the first aiming sub-feature is used to reflect a relationship of the angular velocity and the angular acceleration to a difficulty level of the firing operation.
6. The method of claim 5, wherein the shooting data further comprises scope data representing a multiple of a scope employed by the shooting operation, the determining a first aiming sub-feature of the shooting operation based on the angular velocity and the angular acceleration comprising:
the first aiming sub-feature is determined based on the scope data, the angular velocity and the angular acceleration.
7. The method of claim 6, wherein the shooting data further comprises recoil data, the determining a second targeting feature based on the shooting data further comprising:
determining a shaking speed of the quasi-center based on the recoil data;
Determining a second aiming sub-feature of the shooting operation based on the shake speed, wherein the second aiming sub-feature is used for reflecting the relation between the shake speed and the difficulty level of the shooting operation;
the second targeting feature is determined based on the first aiming sub-feature and the second aiming sub-feature.
8. The method of claim 1, wherein obtaining the first location in the graphical user interface comprises:
acquiring a third position of a preset part of the target object in the virtual game scene;
Mapping the third position to the graphical user interface to obtain the first position.
9. The method of claim 3, wherein the determining a first targeting feature based on the aiming distance and the radian feature comprises:
Determining a third targeting sub-feature based on the aiming distance, wherein the third targeting sub-feature is used to reflect a relationship of the aiming distance to a targeting accuracy of the firing operation;
determining the radian feature interval in which the radian feature falls within a preset time period;
Determining a fourth targeting sub-feature based on the radian feature interval, wherein the fourth targeting sub-feature is used to reflect a relationship of the radian feature interval to the targeting accuracy of the firing operation;
the first targeting feature is determined based on the third targeting sub-feature and the fourth targeting sub-feature.
10. The method according to claim 4, wherein the method further comprises:
Acquiring shooting operation data, wherein the shooting operation data are used for controlling the virtual weapon to be in a shooting state;
Determining a third aiming feature based on the shooting operation data, wherein the third aiming feature is used for reflecting the aiming condition of any shooting operation in a corresponding operation duration;
An abnormal operation is determined based on the third targeting feature.
11. The method of claim 4, wherein acquiring the first location, the second location, and the shot data comprises:
Acquiring the first position, the second position and the shooting data based on at least one embedded point set in a game client, wherein different embedded points are used for acquiring different types of data;
And checking the first position, the second position and the shooting data to obtain the checked first position, second position and shooting data.
12. The method of claim 4, wherein the detecting the firing operation based on the first targeting feature, the second targeting feature, and a preset weight value comprises:
Determining a probability that the firing operation satisfies the target feature based on the first targeting feature, the second targeting feature, and the preset weight value;
And determining that the shooting operation meets the target feature in response to the probability being greater than a preset probability.
13. A self-aiming technique detection device, the device comprising:
an acquisition module, configured to acquire a first position and a second position in a graphical user interface, wherein the first position is the position in the graphical user interface to which a target object in a virtual game scene is mapped, and the second position is the position of the quasi-center of a virtual weapon in the graphical user interface;
A determination module for determining a first targeting feature based on the first location and the second location, wherein the first targeting feature is to reflect a targeting accuracy of a firing operation;
A detection module for detecting the firing operation based on the first targeting feature to determine whether the firing operation satisfies a target feature, wherein the target feature is indicative of employing a self-aiming technique.
14. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program, wherein the computer program is arranged to perform the self-aiming technique detection method as claimed in any of the preceding claims 1 to 10 when run on a computer or processor.
15. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the self-aiming technique detection method as claimed in any of the preceding claims 1 to 10.
CN202410041887.2A 2024-01-10 2024-01-10 Self-aiming technology detection method and device, storage medium and electronic device Pending CN117899476A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410041887.2A CN117899476A (en) 2024-01-10 2024-01-10 Self-aiming technology detection method and device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN117899476A true CN117899476A (en) 2024-04-19

Family

ID=90693284

Country Status (1)

Country Link
CN (1) CN117899476A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination