CN113318431A - In-game aiming control method and device


Info

Publication number
CN113318431A
CN113318431A (application CN202110616223.0A; granted as CN113318431B)
Authority
CN
China
Prior art keywords
virtual
touch operation
game
user interface
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110616223.0A
Other languages
Chinese (zh)
Other versions
CN113318431B (en)
Inventor
刘奕凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110616223.0A
Publication of CN113318431A
Application granted
Publication of CN113318431B
Legal status: Active
Anticipated expiration


Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 — Input arrangements for video game devices
    • A63F 13/21 — Input arrangements characterised by their sensors, purposes or types
    • A63F 13/214 — Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 — Input arrangements where the surface is also a display device, e.g. touch screens
    • A63F 13/219 — Input arrangements for aiming at specific areas on the display, e.g. light-guns
    • A63F 13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 — Shooting of targets
    • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 — Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068 — Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075 — Contact detection using a touch screen
    • A63F 2300/30 — Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 — Details of the user interface
    • A63F 2300/80 — Features specially adapted for executing a specific type of game
    • A63F 2300/8076 — Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an in-game aiming control method and device. The method comprises the following steps: displaying a first view picture through a graphical user interface; in response to a first touch operation acting on the graphical user interface, acquiring a first position of the touch point of the first touch operation, and adjusting the first view picture to a second view picture determined according to the first position; and, in response to the stay time of the touch point at the first position reaching a first preset time, opening the virtual sighting telescope configured on the virtual gun. The invention solves the technical problem in existing shooting games that the complex scope-opening action causes the aiming center to drift after the scope is opened.

Description

In-game aiming control method and device
Technical Field
The invention relates to the field of in-game control, in particular to an in-game aiming control method and device.
Background
A shooting game is a type of game that provides a virtual gun in the game interface for the player to shoot with; it generally includes first-person shooting games and third-person shooting games. When shooting with the virtual gun, the user opens a virtual sighting telescope equipped on it. In most current first-person shooting games, the scope is opened with a fixed button, and the first-person camera always moves in the direction the character faces, so opening the scope is a separate step inside the camera-movement flow: the user finds an observation point that calls for the scope while panning the view, but after tapping the scope button must reselect the visual center point on the screen, and after shooting must close the scope again. During these finger lift-and-drop movements, especially at high scope magnification, the aiming center can drift. In the rapid sequence of opening the scope, shooting, and closing the scope, the open and close steps are therefore comparatively cumbersome.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
Embodiments of the invention provide an in-game aiming control method and device, which at least solve the technical problem in existing shooting games that the complex scope-opening action causes the aiming center to drift after the scope is opened.
According to an aspect of an embodiment of the present invention, there is provided an in-game aiming control method in which a graphical user interface is provided through a terminal device, the graphical user interface including at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being equipped with a virtual gun. The method includes: displaying a first view picture through the graphical user interface; in response to a first touch operation acting on the graphical user interface, acquiring a first position of the touch point of the first touch operation, and adjusting the first view picture to a second view picture determined according to the first position; and, in response to the stay time of the touch point at the first position reaching a first preset time, opening the virtual sighting telescope configured on the virtual gun.
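The steps above can be sketched as a small controller. This is a minimal illustration, not code from the patent: the class and parameter names (`AimController`, `open_delay`, `move_tolerance`) are assumptions, and the dwell threshold corresponds to the "first preset time".

```python
class AimController:
    """Sketch of the dwell-to-scope logic: a touch that stays at one
    position for `open_delay` seconds opens the virtual scope."""

    def __init__(self, open_delay=0.5, move_tolerance=8.0):
        self.open_delay = open_delay          # the "first preset time", in seconds
        self.move_tolerance = move_tolerance  # pixels of drift before the dwell resets
        self.touch_pos = None
        self.touch_start = None
        self.scope_open = False

    def on_touch_down(self, x, y, now):
        self.touch_pos = (x, y)
        self.touch_start = now                # begin recording the stay time

    def on_touch_move(self, x, y, now):
        px, py = self.touch_pos
        if (x - px) ** 2 + (y - py) ** 2 > self.move_tolerance ** 2:
            self.touch_pos = (x, y)           # view pans to the new position...
            self.touch_start = now            # ...and the dwell timer restarts

    def update(self, now):
        if self.touch_start is not None and not self.scope_open:
            if now - self.touch_start >= self.open_delay:
                self.scope_open = True        # stay time met: open the scope
        return self.scope_open

    def on_touch_up(self):
        self.touch_start = None               # stop recording the stay time
        self.scope_open = False               # releasing the touch closes the scope
```

Calling `update` once per frame with the current timestamp keeps the logic independent of frame rate; the single continuous touch both pans the view and, by dwelling, opens the scope.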
Optionally, the method further includes: in response to the stay time of the touch point at the first position not reaching the first preset time, detecting whether the first touch operation has left the touch point; and when the first touch operation is detected to have left the touch point, stopping recording the stay time of the first touch operation at the touch point.
Optionally, after adjusting the first view picture to the second view picture determined according to the first position, the method further includes: displaying a timing area, where the timing area shows the relation between the stay time and the first preset time.
Optionally, before displaying the timing area, the method further includes: comparing the stay time with a second preset time; and displaying the timing area when the stay time exceeds the second preset time, where the second preset time is shorter than the first preset time.
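One way to realize the two thresholds is sketched below; the function name and default values are illustrative assumptions, not specified by the patent. The timing area stays hidden until the stay time passes the second preset time, then fills toward the first preset time.

```python
def timing_area_progress(stay_time, t_show=0.15, t_open=0.5):
    """Return None while the timing area is hidden, otherwise the fill
    fraction (0.0..1.0) of the timing area.

    t_show: the "second preset time" (shorter), after which the area appears.
    t_open: the "first preset time", at which the scope opens.
    """
    if stay_time < t_show:
        return None                               # timing area not yet displayed
    return min(1.0, (stay_time - t_show) / (t_open - t_show))
```

Gating the display on the shorter threshold avoids flashing the timing area for ordinary quick pans that never intend to open the scope.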
Optionally, after opening the virtual sighting telescope configured on the virtual gun in response to the stay time of the touch point at the first position reaching the first preset time, the method further includes: when the first touch operation is detected to move away from the touch point, adjusting the view angle of the virtual sighting telescope according to the movement of the first touch operation; and when cancellation of the first touch operation is detected, closing the virtual sighting telescope configured on the virtual gun.
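The post-open behaviour in this optional embodiment can be sketched as an event handler. All names (`handle_after_open`, the event tuples, the `sensitivity` value) are illustrative assumptions: while the scope is open, dragging the still-held touch fine-adjusts the scope's view angle, and lifting the touch closes the scope.

```python
def handle_after_open(event, state, sensitivity=0.1):
    """Apply one touch event to the scoped-in camera state.

    event: ('move', dx, dy) or ('up', 0, 0)
    state: dict with 'scope_open', 'yaw', 'pitch' (degrees) — illustrative names.
    """
    kind, dx, dy = event
    if not state['scope_open']:
        return state
    if kind == 'move':
        state['yaw'] += dx * sensitivity      # horizontal drag turns the scope
        state['pitch'] -= dy * sensitivity    # vertical drag raises/lowers the aim
    elif kind == 'up':
        state['scope_open'] = False           # cancelling the touch closes the scope
    return state
```

A lower sensitivity while scoped (relative to hip-fire panning) is a common design choice, since the magnified view amplifies angular movement; the patent itself does not fix any value.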
According to another aspect of an embodiment of the present invention, there is provided an in-game aiming control method in which a terminal device provides a graphical user interface, the graphical user interface including at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being equipped with a virtual gun. The method includes: displaying a first view picture through the graphical user interface; in response to a first touch operation acting on the graphical user interface, acquiring the current position of the touch point of the first touch operation, and adjusting the first view picture to a second view picture determined according to the current position; and, while the first touch operation continues, in response to the stay time of a preset position in the second view picture on a target area in the game scene reaching the first preset time, opening the virtual sighting telescope configured on the virtual gun.
Optionally, the step of determining that the stay time of the preset position in the second view picture on the target area in the game scene reaches the first preset time includes: determining that the stay time of the preset position in the second view picture on a second virtual object in the game scene reaches the first preset time.
Optionally, the step of determining that the stay time of the preset position in the second view picture on the target area in the game scene reaches the first preset time includes: acquiring the stay time of the preset position in the second view picture on a first scene area in the game scene; and determining that the condition is met when that stay time reaches the first preset time.
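This second embodiment times how long the scene region under a fixed screen position (e.g. the crosshair) stays the same, rather than how long the touch point stays still. The sketch below is an illustration under assumed names (`crosshair_dwell_open`, region identifiers from per-frame hit tests), not code from the patent.

```python
def crosshair_dwell_open(samples, open_delay=0.5):
    """Given per-frame (timestamp, region_id) samples describing what the
    preset screen position currently points at, return True once one
    region has stayed under the crosshair for `open_delay` seconds."""
    start_t = None
    current = None
    for t, region in samples:
        if region != current:
            current, start_t = region, t     # crosshair moved to a new region
        elif t - start_t >= open_delay:
            return True                      # dwell on this region met: open scope
    return False
```

Feeding it the result of a per-frame ray cast from the crosshair would realize the "target area" condition; substituting a check for a specific enemy object realizes the "second virtual object" variant.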
According to another aspect of the embodiments of the present invention, there is also provided an in-game aiming control apparatus in which a graphical user interface is provided through a terminal device, the graphical user interface including at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being equipped with a virtual gun. The apparatus includes: a first display module configured to display a first view picture through the graphical user interface; a first acquisition module configured to, in response to a first touch operation acting on the graphical user interface, acquire a first position of the touch point of the first touch operation and adjust the first view picture to a second view picture determined according to the first position; and a first control module configured to open the virtual sighting telescope configured on the virtual gun in response to the stay time of the touch point at the first position reaching a first preset time.
According to another aspect of the embodiments of the present invention, there is also provided an in-game aiming control apparatus in which a graphical user interface is provided through a terminal device, the graphical user interface including at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being equipped with a virtual gun. The apparatus includes: a second display module configured to display the first view picture through the graphical user interface; a second acquisition module configured to, in response to the first touch operation acting on the graphical user interface, acquire the current position of the touch point of the first touch operation and adjust the first view picture to the second view picture determined according to the current position; and a second control module configured to, while the first touch operation continues, open the virtual sighting telescope configured on the virtual gun in response to the stay time of the preset position in the second view picture on the target area in the game scene reaching the first preset time.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium including a stored program, where, when the program runs, the device on which the computer-readable storage medium resides is controlled to execute any one of the in-game aiming control methods above.
According to another aspect of the embodiments of the present invention, there is also provided a processor configured to run a program, where the program, when running, executes any one of the in-game aiming control methods above.
In the embodiments of the invention, a first view picture is displayed through a graphical user interface; in response to a first touch operation acting on the graphical user interface, a first position of the touch point of the first touch operation is acquired, and the first view picture is adjusted to a second view picture determined according to the first position; and in response to the stay time of the touch point at the first position reaching a first preset time, the virtual sighting telescope configured on the virtual gun is opened. This reduces the touch operations needed to open the scope in a shooting game: the user can open the scope quickly with a single touch operation, and the aiming-center drift caused by extra touch operations is avoided, thereby solving the technical problem in the prior art that the complex scope-opening action causes the aiming center to drift after the scope is opened.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of an in-game aiming control method according to embodiment 1 of the present invention;
FIG. 2 is a schematic illustration of a first-person perspective in accordance with an embodiment of the invention;
FIG. 3 is a schematic illustration of a third-person perspective according to an embodiment of the invention;
FIG. 4 is a schematic illustration of a timing zone according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an open virtual scope according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a graphical user interface according to an embodiment of the present invention;
fig. 7 is a flowchart of an in-game aiming control method according to embodiment 2 of the present invention;
fig. 8 is a schematic view of an in-game aiming control device according to embodiment 3 of the present invention;
fig. 9 is a schematic diagram of an in-game aiming control device according to embodiment 4 of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The following is a description of the terms referred to in this application:
Virtual scene
A virtual scene is a scene that an application displays (or provides) while running on a terminal or server. Optionally, the virtual scene is a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual scene may be two-dimensional or three-dimensional, and the virtual environment may be sky, land, sea, and the like, where the land may include environmental elements such as deserts and cities. For example, in a sandbox-type 3D shooting game, the virtual scene is a 3D game world in which the player controls a virtual object to fight; an exemplary virtual scene may include at least one of a mountain, flat ground, a river, a lake, an ocean, a desert, sky, plants, buildings, and vehicles. For a 2D card game, the virtual scene is a scene for displaying released cards or the virtual objects corresponding to cards; an exemplary virtual scene may include an arena, a battlefield, or other "field" elements capable of displaying the card battle state. For a 2D or 3D multiplayer online tactical sports game, the virtual scene is a 2D or 3D terrain scene in which virtual objects fight; exemplary virtual scenes may include mountains, lines, rivers, classrooms, tables and chairs, a podium, and other elements, e.g., in a canyon-style map.
Virtual object
A virtual object is a dynamic object that can be controlled in the virtual scene. Optionally, the dynamic object may be a virtual character, a virtual animal, an anime character, or the like. The virtual object may be a character controlled by a player through an input device, an artificial intelligence (AI) trained for virtual-environment battles, or a non-player character (NPC) set in the virtual scene. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a match is preset or dynamically determined according to the number of clients joining the match, which is not limited in the embodiments of the present application. In one possible implementation, the user can control the virtual object to move in the virtual scene, for example to run, jump, or crawl, and can also control it to fight other virtual objects using the skills and virtual props provided by the application.
Visual field picture
The virtual scene includes a first virtual object and a second virtual object, which may belong to different teams. The terminal displays the virtual scene through a game picture. Optionally, the game picture is the picture obtained by observing the virtual scene from a specific view angle, where the view angle is the angle at which a camera model in the virtual scene observes the scene.
Optionally, the camera model automatically follows the virtual object in the virtual environment: when the position of the virtual object changes, the camera model moves with it, always staying within a preset distance range of the virtual object. Optionally, the relative positions of the camera model and the virtual object do not change during this automatic following.
The camera model is a three-dimensional model located around the virtual object in the virtual environment. When a first-person perspective is adopted, the camera model is located near or at the head of the virtual object; when a third-person perspective is adopted, the camera model may be located behind and bound to the virtual object, or at any position a preset distance away from it, and the virtual object can be observed from different angles through the camera model. Optionally, when the third-person perspective is an over-the-shoulder perspective, the camera model is located behind the virtual object (e.g., over the shoulders of the virtual character). Optionally, besides the first-person and third-person perspectives, other perspectives such as a top-down perspective may be used: with a top-down perspective, the camera model is located above the head of the virtual object and observes the virtual environment from the air. Optionally, the camera model is not actually displayed in the virtual environment shown by the user interface.
The following describes the case in which the camera model is located at an arbitrary position a preset distance away from the virtual object. Optionally, one virtual object corresponds to one camera model, and the camera model can rotate with the virtual object as the rotation center, for example around any point of the virtual object. During the rotation the camera model both changes angle and shifts in position, while its distance to the rotation center stays constant; that is, the camera model moves on the surface of a sphere whose center is the rotation center. The chosen point may be the head or trunk of the virtual object or any point around it, which is not limited in the embodiments of the present application. Optionally, when the camera model observes the virtual object, the center of its view angle points from the point on the spherical surface where the camera model is located toward the center of the sphere.
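The constant-radius rotation described above can be written directly in spherical coordinates. This is an illustrative sketch (function name, angle convention, and y-up axis are assumptions, not from the patent): only the two angles change while the distance to the rotation center stays fixed.

```python
import math

def orbit_camera_position(center, radius, yaw, pitch):
    """Position of a camera orbiting on a sphere of fixed `radius`
    around `center` (the rotation center, e.g. the virtual object's head).
    Angles are in radians; y is treated as the vertical axis."""
    cx, cy, cz = center
    x = cx + radius * math.cos(pitch) * math.cos(yaw)
    y = cy + radius * math.sin(pitch)
    z = cz + radius * math.cos(pitch) * math.sin(yaw)
    return (x, y, z)
```

Aiming the camera from the returned point toward `center` reproduces the property stated above: the view-angle center points from the point on the sphere toward the sphere's center, and the camera's distance to the object never changes while orbiting.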
Optionally, the camera model may also observe the virtual object at a preset angle in different directions of the virtual object. Optionally, the first virtual object is a virtual object controlled by a user through a terminal, and the second virtual object includes at least one of a virtual object controlled by another user and a virtual object controlled by a background server.
Optionally, the virtual environment picture in this embodiment is a picture for observing the virtual environment from the perspective of the first virtual object.
The in-game aiming control method in an embodiment of the disclosure can run on a terminal device or a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed based on a cloud interactive system, which includes a server and a client device.
In an optional embodiment, various cloud applications, such as cloud games, may run on the cloud interactive system. Taking cloud games as an example: a cloud game is a game mode based on cloud computing. In this mode, the entity that runs the game program is separated from the entity that presents the game picture. The storage and execution of the in-game aiming control method are completed on a cloud game server, while the client device receives and sends data and presents the game picture; for example, the client device may be a display device with data-transmission capability near the user side, such as a mobile terminal, a television, a computer, or a handheld computer, while the terminal device that processes the information is the cloud game server. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to the instructions, encodes and compresses data such as game pictures, and returns them to the client device over the network; finally, the client device decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in various ways: for example, it may be rendered on a display screen of the terminal, or provided through holographic projection. The local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of an in-game targeting control method, it being noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as a set of computer-executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than presented herein.
Fig. 1 is a flowchart of an in-game aiming control method according to embodiment 1 of the present invention. As shown in fig. 1, the method includes the following steps:
step S102, displaying a first view picture through a graphical user interface.
A graphical user interface is provided through a terminal device; the graphical user interface includes at least part of a game scene and at least part of a first virtual object located in the game scene, and the first virtual object is equipped with a virtual gun.
The graphical user interface refers to a computer operation user interface displayed in a graphical manner through a terminal device, and the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
The first visual field picture is a visual field provided by a game scene in the graphical user interface for a user.
Step S104, in response to a first touch operation acting on the graphical user interface, acquiring a first position of the touch point of the first touch operation, and adjusting the first view picture to a second view picture determined according to the first position.
The first visual field picture and the second visual field picture are different visual fields provided by moving the virtual camera in the game.
In an alternative embodiment, the first touch operation may be an operation of moving a virtual camera.
In another alternative embodiment, the first touch operation may be a sliding operation or any other operation capable of adjusting the game field of view. For example, the player performs a sliding operation on the touch screen of the device, which triggers movement of the virtual camera in the game, thereby adjusting the game field of view from the first view frame to the second view frame.
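As an illustration of how such a sliding operation could move the virtual camera, the sketch below maps a touch-drag delta to new camera angles. The function name, the linear sensitivity model, and the pitch clamp are illustrative assumptions, not details taken from this disclosure.

```python
def apply_drag_to_camera(yaw, pitch, dx, dy, sensitivity=0.1, pitch_limit=89.0):
    """Map a touch-drag delta (pixels) to new camera angles (degrees).

    Dragging right (dx > 0) turns the camera right; dragging down
    (dy > 0) tilts it down. Pitch is clamped so the camera cannot
    flip over. The linear model and the constants are assumptions.
    """
    yaw = (yaw + dx * sensitivity) % 360.0
    pitch = max(-pitch_limit, min(pitch_limit, pitch - dy * sensitivity))
    return yaw, pitch
```

For instance, a 100-pixel rightward drag at sensitivity 0.1 turns the camera 10 degrees, moving the field of view from the first view frame to a second one.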
In yet another alternative embodiment, the first touch operation may act on an aiming control in the game scene, on a blank area in the game scene, or on another control capable of adjusting the field of view, such as a "small eye" (free-look) control.
The graphical user interface may be displayed in a first-person perspective or a third-person perspective. The first-person perspective is mainly applied to first-person shooter (FPS) games, which are played mainly from the subjective perspective of the user, as shown in fig. 2. A third-person shooter (TPS) game emphasizes the sense of action, and its protagonist is visible on the game screen, as shown in fig. 3, where 1 denotes the stay duration of the first touch operation on the graphical user interface.
The above scheme places no limitation on the position at which the first touch operation is performed. In an alternative embodiment, the method may respond to a first touch operation applied anywhere in the graphical user interface. In another alternative embodiment, the game view may be adjusted from the first view frame to the second view frame in response to a first touch operation applied to the game interface by the user's finger or a mouse cursor, where the user is the game player performing the first touch operation.
Step S106, in response to the stay duration of the touch point at the first position satisfying a first preset duration, controlling to turn on a virtual sighting telescope configured on the virtual firearm.
Specifically, the touch point is a position point where a finger of a user or a mouse cursor performs a touch operation on the game interface.
In an optional embodiment, in response to a first touch operation applied to the graphical user interface, the game view is adjusted from the first view frame to the second view frame. When the first touch operation is detected to stop moving in the second view frame, the point at which the user's finger rests on the graphical user interface can be acquired and the stay duration of the finger at that touch point recorded. When the stay duration satisfies the first preset duration, the virtual sighting telescope configured on the virtual firearm is controlled to turn on, making it convenient for the user to observe distant enemies in the scene through a high-magnification lens.
The first preset duration can be freely set by the user. The virtual firearm is a virtual weapon used by the user for aiming and shooting in a shooting game scene; the virtual sighting telescope is a virtual aiming lens mounted on the virtual firearm used by the user in the shooting game scene.
In an alternative embodiment, when the stay duration is less than the first preset duration, the user may merely have intended to adjust the first view frame to the second view frame without performing an open-scope operation, or the touch may have been an accidental operation after the view adjustment; in either case it can be determined not to turn on the virtual sighting telescope configured on the virtual firearm, avoiding the operational inconvenience an unwanted open scope would cause in the game. Specifically, when the stay duration is less than the first preset duration, the stay duration of the first touch operation at the touch point continues to be recorded until the finger leaves the game interface or starts to move. If the total stay duration at the touch point remains less than the first preset duration, it indicates that the user only needed to adjust the game view from the first view frame to the second view frame and does not need to open the virtual sighting telescope, so the step of turning on the virtual sighting telescope configured on the virtual firearm is not executed.
Further, when the stay duration satisfies the first preset duration, it indicates that the user not only needs to adjust the first view frame to the second view frame but also needs to open the virtual sighting telescope to search for enemies or shoot accurately. At this time, it can be determined to turn on the virtual sighting telescope configured on the virtual firearm so that the user can search for enemies or shoot accurately through it. Opening the scope directly in this way also avoids the aim-point offset that would result from having to tap the screen again after adjusting the first view frame to the second view frame.
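The dwell-based decision described above can be sketched as a small controller that opens the scope only once the touch point has stayed put for the first preset duration, and abandons the timer when the touch moves or lifts early. The class and method names are illustrative assumptions.

```python
class DwellScopeController:
    """Open the virtual scope after the touch point has remained
    stationary for `open_after` seconds (the first preset duration)."""

    def __init__(self, open_after=3.0):
        self.open_after = open_after
        self.stay_start = None   # timestamp when the touch stopped moving
        self.scope_open = False

    def on_touch_stationary(self, now):
        # Called each frame while the touch point is not moving.
        if self.stay_start is None:
            self.stay_start = now
        if not self.scope_open and now - self.stay_start >= self.open_after:
            self.scope_open = True  # dwell met the first preset duration
        return self.scope_open

    def on_touch_moved_or_lifted(self):
        # A short dwell meant only a view adjustment (or a slip):
        # stop timing and leave the scope closed.
        self.stay_start = None
```

With `open_after=3.0`, a touch that stays put from t=10.0 s opens the scope at t=13.0 s, while any movement before that resets the timer.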
In an optional embodiment, the first preset duration can be set shorter according to the user's operating habits and needs. In this case, when performing the first touch operation, the user can open the virtual sighting telescope configured on the virtual firearm with only a short touch time, which improves the sensitivity of opening the scope and allows quick switching during the shooting process, giving the user a good game experience.
In another optional embodiment, the first preset duration may be set longer according to the user's operating habits and needs. In this case, when performing the first touch operation, the user needs a longer touch time to turn on the virtual sighting telescope configured on the virtual firearm, which gives the user more leeway and prevents accidental operations, so that a novice player can also have a good game experience.
In this embodiment, a first view frame is first displayed through a graphical user interface; in response to a first touch operation applied to the graphical user interface, a first position of a touch point of the first touch operation is acquired, and the first view frame is controlled to be adjusted to a second view frame determined according to the first position; and in response to the stay duration of the touch point at the first position satisfying a first preset duration, the virtual sighting telescope configured on the virtual firearm is controlled to turn on. This reduces the number of touch operations needed to open the scope in a shooting game: the user can open the scope quickly with a single touch operation, and the aim-point offset caused by excessive touch operations is avoided, thereby solving the prior-art technical problem that a complicated open-scope action in a game causes the aim point to be offset after the scope is opened.
Optionally, the method further includes: in response to the stay duration of the touch point at the first position not satisfying the first preset duration, detecting whether the first touch operation leaves the touch point; and when the first touch operation is detected to leave the touch point, stopping recording the stay duration of the first touch operation at the touch point.
Specifically, the first touch operation leaving the touch point covers two cases: in the first case, the user's finger does not leave the game interface but moves on it; in the second case, the user's finger leaves the game interface entirely.
In an optional embodiment, when the stay duration is less than the first preset duration, whether the first touch operation leaves the touch point can be detected. When the first touch operation is detected to leave the touch point, it indicates that the user has finished switching the first view frame to the second view frame and does not need an open-scope operation; by stopping recording the stay duration of the first touch operation at the touch point, the step of turning on the virtual sighting telescope on the virtual firearm can be cancelled, giving the user a buffer before the scope is opened.
Optionally, after controlling the first view frame to be adjusted to the second view frame determined according to the first position, the method further includes: displaying a timing area, where the timing area is used to show the relationship between the stay duration and the first preset duration.
As shown in fig. 4, the timing area 2 can be shown in the form of an arc-shaped timing slot; the timing slot can also take other shapes, such as a circular timing slot or a bar-shaped timing slot. The timing area may also be displayed in numeric form, for example as a countdown in the middle or at the top of the screen.
The timing area shows a relationship between the dwell time and the first preset time, and still taking fig. 4 as an example, the entire arc-shaped timing slot represents the first preset time, the black area represents the dwell time of the first touch operation, and the black area gradually expands as the dwell time of the first touch operation increases.
In an optional embodiment, the timing area may be displayed as a timing progress bar or a countdown, so that the user can decide whether to stop or continue the first touch operation according to the timing shown in the timing area. If the user needs to open the virtual sighting telescope, the user determines whether to keep performing the first touch operation by referring to the timing displayed in the timing area. If the progress bar in the timing area has filled completely, the stay duration has reached the first preset duration; the virtual sighting telescope is then opened and the first touch operation no longer needs to be held. If the progress bar has not yet filled, the stay duration is still less than the first preset duration and the virtual sighting telescope has not been opened; the first touch operation can then be continued until the stay duration is greater than or equal to the first preset duration, whereupon the operation of opening the virtual sighting telescope is executed.
Optionally, before displaying the timing area, the method further includes: comparing the stay duration with a second preset duration; and entering the step of displaying the timing area when the stay duration exceeds the second preset duration, where the second preset duration is less than the first preset duration.
The second preset time period may be set by a user.
In an optional embodiment, the second preset duration can be set shorter according to the user's operating habits and needs. In this case, when performing the first touch operation, the user reaches the step of displaying the timing area after only a short touch time, allowing the user to choose in advance whether to turn on the virtual sighting telescope and to switch quickly during the shooting process, giving the user a good game experience.
In another optional embodiment, the second preset duration may be set longer according to the user's operating habits and needs. In this case, a longer touch time is required before the step of displaying the timing area is entered, which gives the user more time to think and prevents accidental operations, so that a novice player can also have a good game experience.
In an optional embodiment, before the timing area is displayed, the stay duration may be compared with a second preset duration, and the timing area is displayed when the stay duration exceeds the second preset duration, so that the user can decide according to the timing area whether to turn on the virtual sighting telescope. As shown in fig. 4, by displaying the timing area 2, the user can decide according to its timing whether to turn on the virtual sighting telescope equipped on the virtual firearm 3. As shown in fig. 5, when the stay duration of the first touch operation is longer than the first preset duration, the step of turning on the virtual sighting telescope 4 can be performed. This reduces the complexity of turning on the virtual sighting telescope, allows the user to aim at an enemy quickly through it, and improves the user's game experience.
For example, the first preset duration may be set to 3 seconds and the second preset duration to 1 second. When the user's finger has stayed on the game interface for more than 1 second, the timing slot (i.e., the timing area above) is displayed, so that the user can decide whether to keep the finger in place according to the timing area. If the user needs to turn on the virtual sighting telescope, the finger stays until the timing slot is full; the slot being full indicates that the stay duration has reached the first preset duration. If the user does not need to turn on the virtual sighting telescope, the finger must be moved away, or must keep moving, before the timing slot is full.
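The 1-second / 3-second example above can be condensed into one function that reports whether the timing slot is shown, how full it is, and whether the scope should open. The tuple layout and the linear fill model are assumptions for illustration.

```python
def timing_ring_state(stay, show_after=1.0, open_after=3.0):
    """Return (show_slot, fill_fraction, scope_open) for a dwell of
    `stay` seconds. The whole slot represents the first preset
    duration (open_after); the slot appears only after the second
    preset duration (show_after) has passed, per the example."""
    show = stay >= show_after
    fill = min(1.0, stay / open_after) if show else 0.0
    return (show, fill, stay >= open_after)
```

A 0.5 s dwell shows nothing; at 1.5 s the slot is half full; at 3 s it is full and the scope opens.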
Optionally, after controlling to turn on the virtual sighting telescope configured on the virtual firearm in response to the stay duration of the touch point at the first position satisfying the first preset duration, the method further includes: when the first touch operation is detected to move away from the touch point, adjusting the viewing angle of the virtual sighting telescope according to the movement of the first touch operation; and when cancellation of the first touch operation is detected, closing the virtual sighting telescope configured on the virtual firearm.
In an optional embodiment, when the stay duration is greater than or equal to the first preset duration and the virtual sighting telescope configured on the virtual firearm has been opened, the viewing angle of the virtual sighting telescope can be adjusted according to the movement of the first touch operation once it is detected to leave the touch point and move. The user can thus adjust the viewing angle to follow an enemy's movement track, aim more accurately, and land a shot, improving the user's game experience.
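The post-open behaviour (movement fine-tunes the scope's aim; lifting the touch closes the scope) could be handled by a small event function such as the sketch below. The event names, state layout, and sensitivity constant are illustrative assumptions.

```python
def scoped_touch_event(state, event, dx=0.0, dy=0.0, sens=0.05):
    """Handle touch events while the virtual scope is open.

    'move' adjusts the scope's aim angles from the drag delta so the
    user can track an enemy; 'lift' cancels the touch and closes the
    scope. `state` holds 'open', 'yaw', and 'pitch'.
    """
    if not state['open']:
        return state  # nothing to do before the scope is opened
    if event == 'move':
        state['yaw'] += dx * sens
        state['pitch'] -= dy * sens   # screen y grows downward
    elif event == 'lift':
        state['open'] = False
    return state
```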
An application scenario of the present embodiment is provided below. A schematic diagram of a graphical user interface is shown in fig. 6. The interface includes a game scene and a virtual object located in the game scene, which are not shown in fig. 6. The player controls the movement of the virtual object in the game scene. The interface in FIG. 6 shows a variety of controls that are displayed over a game scene.
Here, 1000 is a backpack control. A touch operation acting on this control triggers display of a backpack interface in the graphical user interface for viewing the item props in the backpack, which may include but are not limited to props such as medicine, weapons, accessories, and protective gear. In an alternative embodiment, the edge of the backpack control is provided with an edge line for indicating the quantity of items in the backpack. The touch operation acting on the backpack control may be a click, a double click, a long press, a slide, or the like.
1002 is a movement control, and the shaded filled circle is the joystick of the movement control. In an optional embodiment, the movement control is configured with a movement response area, which may be the same size as the movement control; that is, the area of the graphical user interface corresponding to the movement control is the movement response area. In another optional embodiment, the movement response area may be larger than the movement control, that is, the movement control is disposed inside the movement response area; for example, the graphical user interface may be divided down the middle into left and right areas, with the left area serving as the movement response area of the movement control. A movement control instruction is triggered by a touch operation acting on the movement response area of the movement control, and the movement direction of the virtual object is controlled according to the movement control instruction. The touch operation acting on the movement response area may be a click, a double click, a long press, a slide, or the like.
When the touch operation triggering the movement control instruction is detected to satisfy a preset condition, a "run" instruction is triggered, and the virtual object is controlled to run automatically and continuously in the game scene. As shown in fig. 5, in an alternative embodiment, the preset condition is satisfied when the touch operation acting on the joystick area slides in a preset direction to a preset distance or into a preset area (e.g., the position indicated by the arrow above the joystick). For example, when a touch operation drags the joystick upward, a sprint control 1006 appears, and when the sprint control is triggered, the virtual object enters a sprint mode. The 1030 control is likewise a sprint control: when it is clicked, the virtual object enters a fast-running mode. In another optional embodiment, the preset condition is satisfied when the touch operation acting on the joystick area lasts for a preset duration, reaches a preset pressure, or the like.
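One of the preset conditions above, dragging the joystick far enough in a preset direction, could be checked as in this sketch. The upward direction, the pixel threshold, and the screen-coordinate convention are assumptions, not taken from the disclosure.

```python
def sprint_triggered(drag, threshold=80.0):
    """True when the joystick drag vector reaches the preset upward
    distance. Screen y grows downward, so an upward drag has dy < 0."""
    dx, dy = drag
    return -dy >= threshold
```

A drag of (0, -100) crosses the 80-pixel threshold and would trigger the "run" instruction; a shorter or sideways drag would not.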
1004 and 1014 are attack controls. An attack instruction is triggered by a touch operation acting on the attack response area of an attack control, and the virtual character is controlled to perform an attack operation in the game scene according to the attack instruction. For example, when the virtual object is equipped with a ranged weapon, the attack behavior triggered by the attack control is a shooting behavior; that is, the left hand can perform a shooting operation by triggering the 1004 control and the right hand by triggering the 1014 control. When the virtual character is equipped with a melee weapon, the attack behavior triggered by the attack control is a slashing behavior or the like. In an alternative embodiment, only one of attack control 1004 and attack control 1014 is displayed. The display position of an attack control can be adjusted according to a position-setting instruction, which may be triggered in a settings interface or during a match. The touch operation acting on an attack control may be a click, a double click, a long press, a slide, or the like.
1008 and 1010 are posture controls. A posture adjustment instruction is triggered by a touch operation acting on a posture control to adjust the posture of the virtual object in the game scene. Triggering the 1008 control can move the weapon to the left-arm position of the virtual object and make the virtual object stand in a posture peeking out to the left; triggering the 1010 control can move the weapon to the right-arm position and make the virtual object stand in a posture peeking out to the right. The 1008 and 1010 controls are typically used to control the posture of the virtual object when it takes cover behind an obstacle such as a wall or a tree. The touch operation acting on a posture control may be a click, a double click, a long press, a slide, or the like.
1012 is the weapon slot; it can be triggered to switch the weapon used by the player. The "burst" label on the left indicates the current firing mode, in which multiple rounds are fired in succession. Besides the "burst" (continuous-fire) mode, it is also possible to switch to a "single-shot" mode, in which one bullet is fired at a time. The shaded bar below shows the quantity of bullets in the weapon. The bullet quantity decreases continuously while shooting; when it reaches zero, the weapon reloads automatically and the bullet quantity returns to the maximum. In addition, the 1016 control is a reload control: triggering it adds bullets to the currently used weapon, and after reloading the bullet quantity is at its maximum. The touch operation acting on the weapon slot may be a click, a double click, a long press, a slide, or the like.
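The ammunition behaviour described for the weapon slot, where firing decrements the count and reaching zero triggers an automatic reload back to the maximum, can be sketched as follows. The function name and magazine size are assumptions.

```python
def fire_once(ammo, magazine_size=30):
    """Fire one bullet: decrement the count, and auto-reload to the
    maximum when it hits zero. The magazine size is an assumption."""
    ammo -= 1
    if ammo == 0:
        ammo = magazine_size
    return ammo
```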
The 1018 and 1020 controls are action controls; a touch operation acting on an action control makes the virtual object perform the corresponding action in the game scene. For example, triggering 1018 can make the virtual object crouch, and triggering 1020 can make the virtual object lie prone. The 1022 control can make the virtual object perform a jump action. The touch operation acting on an action control may be a click, a double click, a long press, a slide, or the like.
The 1024 control is used to put the current game scene into an open-scope mode. When a first touch operation acting on the 1024 control is detected, the current game scene is controlled to enter the open-scope mode, in which the sight lens is displayed in the game scene together with the game scene as seen by the virtual object through the sight lens; shooting accuracy can be improved in this mode. When a second touch operation acting on the 1024 control is detected, the game scene is controlled to exit the open-scope mode. In an optional embodiment, the first and second touch operations are independent touch operations: for example, clicking the 1024 control puts the current game scene into the open-scope mode, and clicking it again exits the mode, after which the game scene shot by the virtual camera following the virtual object is displayed again. In another optional embodiment, the first and second touch operations may be the start and end of a single touch operation: when a touch on the 1024 control is detected, the game scene enters the open-scope mode; while the press continues, the scene remains in the open-scope mode; and when the touch point of the operation is detected to leave the touch detection area, the scene exits the open-scope mode.
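The two control schemes just described, independent taps toggling the mode versus a single press-and-hold, could be modelled by one update function. The scheme and event names are illustrative assumptions.

```python
def scope_mode_update(is_open, scheme, event):
    """Update the open-scope state for a scope control.

    scheme 'toggle': a tap opens the mode, a second tap exits it.
    scheme 'hold'  : the mode lasts exactly as long as the press.
    """
    if scheme == 'toggle' and event == 'tap':
        return not is_open
    if scheme == 'hold':
        if event == 'press':
            return True
        if event == 'release':
            return False
    return is_open  # unrelated events leave the state unchanged
```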
The 1026 control is used to control the shooting parameters of the virtual camera presenting the game scene, thereby adjusting the view of the game scene displayed in the graphical user interface. In an optional embodiment, when a touch operation acting on the 1026 control is detected, the shooting parameters of the virtual camera are adjusted according to the touch operation; the shooting parameters include, but are not limited to, the position, orientation, and field-of-view angle of the virtual camera. The touch operation acting on the 1026 control may be a click, a double click, a long press, a slide, or the like.
The 1028 control is a marking control; a touch operation acting on it marks a virtual object, a virtual item, or the like in the game scene. The 1032 control is a settings control; clicking it displays a settings menu for configuring the basic functions of the current game. The touch operation acting on the 1028 control may be a click, a double click, a long press, a slide, or the like.
The 1034 and 1036 controls are message controls: 1034 can be used to view system notifications, and 1036 to view or send messages to teammates. 1038 is a mini-map that displays the position of the virtual object controlled by the player and may also display the positions of some other players' virtual objects.
On the basis of the application scenario, the player may trigger execution of the in-game aiming control method in this embodiment through the first touch operation.
Example 2
There is also provided, in accordance with an embodiment of the present invention, an embodiment of an in-game targeting control method, it should be noted that the steps illustrated in the flowchart of the accompanying drawings may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than that described herein.
Fig. 7 is an in-game aiming control method according to embodiment 2 of the present invention, as shown in fig. 7, the method including the steps of:
step S702, displaying a first view frame through a graphical user interface.
A graphical user interface is provided through a terminal device, the graphical user interface including at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm.
Step S704, in response to the first touch operation applied to the graphical user interface, obtains a current position of a touch point of the first touch operation, and controls to adjust the first view frame to a second view frame determined according to the current position.
The current position of the touch point of the first touch operation may be a position that changes in real time.
Step S706, during the continuation of the first touch operation, in response to the stay duration of the preset position in the second view frame on a target area in the game scene satisfying a first preset duration, controlling to turn on the virtual sighting telescope configured on the virtual firearm.
The preset position may be a position where a sighting telescope of the virtual gun is aligned, and the target area may be an area where other virtual objects are located.
Optionally, the step of responding that the stay duration of the target area in the game scene corresponding to the preset position in the second view frame satisfies the first preset duration includes: responding that the stay duration of the preset position in the second view frame on a second virtual object in the game scene satisfies the first preset duration.
The second virtual object described above may be an enemy of the first virtual object in the game scene.
In an optional embodiment, when the stay duration for which the sight of the first touch operation in the second view frame remains aligned with the position of an enemy satisfies the first preset duration, it indicates that the user is aiming at that enemy. The virtual sighting telescope configured on the virtual firearm can then be controlled to open so that the enemy can be examined closely, sparing the user from opening the virtual sighting telescope manually and improving the user's game experience.
Optionally, the step of responding that the stay duration of the target area in the game scene corresponding to the preset position in the second view frame satisfies the first preset duration includes: acquiring the stay duration of the preset position in the second view frame on a first scene area in the game scene; and, when that stay duration satisfies the first preset duration, responding that the stay duration of the first scene area corresponding to the preset position satisfies the first preset duration.
The first scene area may be other than enemies in the game scene, such as a building area, a grass area, a sky area, a water surface area, and the like.
In an optional embodiment, the stay duration of the position aligned by the sight of the virtual firearm in the second view frame on the first scene area in the game scene can be acquired; from this stay duration it can be judged whether the user needs to view that area more clearly.
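Embodiment 2's check, the preset (sight) position dwelling on a target area such as an enemy, could be sketched with a caller-supplied classifier that maps the sight position to a region tag. All names and the tag vocabulary are assumptions for illustration.

```python
class CrosshairDwellScope:
    """Open the scope when the region under the sight's preset
    position has been an enemy for the first preset duration."""

    def __init__(self, classify_at, open_after=1.0):
        self.classify_at = classify_at  # position -> region tag, e.g. 'enemy'
        self.open_after = open_after
        self.since = None
        self.open = False

    def update(self, sight_pos, now):
        if self.classify_at(sight_pos) == 'enemy':
            if self.since is None:
                self.since = now   # dwell on the target starts here
            if now - self.since >= self.open_after:
                self.open = True
        else:
            self.since = None      # left the target: restart the dwell
        return self.open
```

Moving the sight off the enemy resets the dwell timer, so only sustained aiming at the target opens the scope.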
Example 3
According to an embodiment of the present invention, an in-game aiming control device is further provided, where the device may execute the in-game aiming control method in embodiment 1, and a specific implementation manner and a preferred application scenario are the same as those in the embodiment, and are not described herein again.
Fig. 8 is a schematic diagram of an in-game aiming control device according to embodiment 3 of the present invention, as shown in fig. 8, the device including:
a first display module 82, configured to display a first view frame through a graphical user interface;
a first obtaining module 84, configured to obtain a first position of a touch point of a first touch operation in response to the first touch operation applied to the graphical user interface, and control to adjust the first view frame to a second view frame determined according to the first position;
the first control module 86 is configured to control to turn on the virtual sighting telescope configured on the virtual firearm in response to that the staying time of the touch point at the first position meets a first preset time.
Optionally, the apparatus further comprises: the first detection module is used for responding that the stay time of the touch point at the first position does not meet a first preset time, and detecting whether the first touch operation leaves the touch point; and the stopping module is used for stopping recording the stay time of the first touch operation at the touch point when the first touch operation is detected to leave the touch point.
Optionally, the first display module is further configured to display a timing area, where the timing area is used to show a relationship between the staying time and a first preset time.
Optionally, the apparatus further comprises: the comparison module is used for comparing the stay time with a second preset time; the first display module is further used for entering a step of displaying a timing area under the condition that the stay time length exceeds a second preset time length, wherein the second preset time length is smaller than the first preset time length.
Optionally, the apparatus further comprises: the adjusting module is used for adjusting the visual angle of the virtual sighting telescope according to the movement of the first touch operation when the first touch operation is detected to move away from the touch point; and the second detection module is used for closing the virtual sighting telescope configured on the virtual gun when the first touch operation is detected to be cancelled.
Example 4
According to an embodiment of the present invention, an in-game aiming control device is further provided, where the device may execute the in-game aiming control method in embodiment 2, and a specific implementation manner and a preferred application scenario are the same as those in the above embodiment, and are not described herein again.
Fig. 9 is a schematic diagram of an in-game aiming control apparatus according to an embodiment of the present invention. As shown in Fig. 9, the apparatus includes:
a second display module 92, configured to display a first view frame through a graphical user interface;
a second obtaining module 94, configured to, in response to a first touch operation acting on the graphical user interface, obtain a current position of a touch point of the first touch operation and control adjustment of the first view frame to a second view frame determined according to the current position;
and a second control module 96, configured to, while the first touch operation continues, control turning on of the virtual scope configured on the virtual firearm in response to a dwell time for which a preset position in the second view frame corresponds to a target area in the game scene reaching a first preset duration.
Optionally, the second control module comprises: a first control unit, configured to respond when the dwell time for which the preset position in the second view frame corresponds to a second virtual object in the game scene reaches the first preset duration.
Optionally, the second control module comprises: an obtaining unit, configured to obtain the dwell time for which the preset position in the second view frame corresponds to a first scene area in the game scene; and a second control unit, configured to respond when that dwell time reaches the first preset duration.
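The second control module's check can be sketched as per-frame dwell accumulation: map the preset position (e.g. a screen-centre crosshair) into the game scene, test whether it falls on the target area (a second virtual object or a first scene area), and open the scope once the accumulated dwell reaches the first preset duration. The sketch below is a simplifying assumption that models the target area as an axis-aligned rectangle; `hit_test` and `update_dwell` are hypothetical names, not part of this disclosure.

```python
# Illustrative per-frame dwell check for the crosshair-over-target condition.
FIRST_PRESET = 1.0  # assumed dwell (seconds) required to open the scope

def hit_test(crosshair, target_area):
    """True if the preset position currently corresponds to the target area
    (here simplified to a point-in-rectangle test)."""
    (x, y), (x0, y0, x1, y1) = crosshair, target_area
    return x0 <= x <= x1 and y0 <= y <= y1

def update_dwell(dwell, dt, crosshair, target_area):
    """One frame of the check performed while the first touch operation
    continues; returns (new_dwell, scope_should_open)."""
    if hit_test(crosshair, target_area):
        dwell += dt
    else:
        dwell = 0.0  # correspondence was broken; restart the timer
    return dwell, dwell >= FIRST_PRESET
```

Calling `update_dwell` once per rendered frame with the frame delta time gives the "dwell reaches the first preset duration" trigger described above.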
Example 5
According to an embodiment of the present invention, there is also provided a storage medium comprising a stored program, where the program, when run, controls a device on which the storage medium is located to execute the in-game aiming control method of the above embodiments.
Example 6
According to an embodiment of the present invention, there is also provided a processor configured to run a program, where the program, when running, executes the in-game aiming control method of the above embodiments.
The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units may be a logical functional division; in actual implementation there may be other divisions, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, units, or modules, and may be electrical or take another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and these improvements and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (12)

1. An in-game aiming control method, characterized in that a graphical user interface is provided through a terminal device, the graphical user interface comprising at least a part of a game scene and at least a part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm, the method comprising the following steps:
displaying a first view frame through the graphical user interface;
in response to a first touch operation acting on the graphical user interface, acquiring a first position of a touch point of the first touch operation, and controlling adjustment of the first view frame to a second view frame determined according to the first position;
and in response to a dwell time of the touch point at the first position reaching a first preset duration, controlling turning on of a virtual scope configured on the virtual firearm.
2. The method of claim 1, further comprising:
when the dwell time of the touch point at the first position has not yet reached the first preset duration, detecting whether the first touch operation has left the touch point;
and when the first touch operation is detected to have left the touch point, stopping recording the dwell time of the first touch operation at the touch point.
3. The method of claim 1, wherein after controlling adjustment of the first view frame to the second view frame determined according to the first position, the method further comprises:
displaying a timing area, wherein the timing area shows the relationship between the dwell time and the first preset duration.
4. The method of claim 3, wherein before displaying the timing area, the method further comprises:
comparing the dwell time with a second preset duration;
and proceeding to display the timing area when the dwell time exceeds the second preset duration, wherein the second preset duration is less than the first preset duration.
5. The method of claim 2, wherein after controlling turning on of the virtual scope configured on the virtual firearm in response to the dwell time of the touch point at the first position reaching the first preset duration, the method further comprises:
when the first touch operation is detected to move away from the touch point, adjusting the viewing angle of the virtual scope according to the movement of the first touch operation;
and when cancellation of the first touch operation is detected, closing the virtual scope configured on the virtual firearm.
6. An in-game aiming control method, characterized in that a graphical user interface is provided through a terminal device, the graphical user interface comprising at least a part of a game scene and at least a part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm, the method comprising the following steps:
displaying a first view frame through the graphical user interface;
in response to a first touch operation acting on the graphical user interface, acquiring a current position of a touch point of the first touch operation, and controlling adjustment of the first view frame to a second view frame determined according to the current position;
and, while the first touch operation continues, in response to a dwell time for which a preset position in the second view frame corresponds to a target area in the game scene reaching a first preset duration, controlling turning on of the virtual scope configured on the virtual firearm.
7. The method of claim 6, wherein the graphical user interface includes a second virtual object, the second virtual object being a virtual object in a different camp from the first virtual object, and responding that the dwell time for which the preset position in the second view frame corresponds to the target area in the game scene reaches the first preset duration comprises: responding that the dwell time for which the preset position in the second view frame corresponds to the second virtual object in the game scene reaches the first preset duration.
8. The method of claim 6, wherein responding that the dwell time for which the preset position in the second view frame corresponds to the target area in the game scene reaches the first preset duration comprises:
acquiring the dwell time for which the preset position in the second view frame corresponds to a first scene area in the game scene;
and when the dwell time for which the preset position in the second view frame corresponds to the first scene area in the game scene reaches the first preset duration, determining that the response condition is satisfied.
9. An in-game aiming control apparatus, characterized in that a graphical user interface is provided through a terminal device, the graphical user interface comprising at least a part of a game scene and at least a part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm, the apparatus comprising:
a first display module, configured to display a first view frame through the graphical user interface;
a first obtaining module, configured to, in response to a first touch operation acting on the graphical user interface, acquire a first position of a touch point of the first touch operation and control adjustment of the first view frame to a second view frame determined according to the first position;
and a first control module, configured to control turning on of a virtual scope configured on the virtual firearm in response to a dwell time of the touch point at the first position reaching a first preset duration.
10. An in-game aiming control apparatus, characterized in that a graphical user interface is provided through a terminal device, the graphical user interface comprising at least a part of a game scene and at least a part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm, the apparatus comprising:
a second display module, configured to display a first view frame through the graphical user interface;
a second obtaining module, configured to, in response to a first touch operation acting on the graphical user interface, acquire a current position of a touch point of the first touch operation and control adjustment of the first view frame to a second view frame determined according to the current position;
and a second control module, configured to, while the first touch operation continues, control turning on of the virtual scope configured on the virtual firearm in response to a dwell time for which a preset position in the second view frame corresponds to a target area in the game scene reaching a first preset duration.
11. A storage medium comprising a stored program, wherein the program, when run, controls a device on which the storage medium is located to execute the in-game aiming control method of any one of claims 1 to 8.
12. A processor configured to run a program, wherein the program, when running, executes the in-game aiming control method of any one of claims 1 to 8.
CN202110616223.0A 2021-06-02 2021-06-02 In-game aiming control method and device Active CN113318431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110616223.0A CN113318431B (en) 2021-06-02 2021-06-02 In-game aiming control method and device

Publications (2)

Publication Number Publication Date
CN113318431A true CN113318431A (en) 2021-08-31
CN113318431B CN113318431B (en) 2023-12-19

Family

ID=77423253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110616223.0A Active CN113318431B (en) 2021-06-02 2021-06-02 In-game aiming control method and device

Country Status (1)

Country Link
CN (1) CN113318431B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170007941A1 (en) * 2014-01-31 2017-01-12 Bandai Co., Ltd. Information providing system and information providing program
CN107678647A (en) * 2017-09-26 2018-02-09 网易(杭州)网络有限公司 Virtual shooting main body control method, apparatus, electronic equipment and storage medium
CN109499061A (en) * 2018-11-19 2019-03-22 网易(杭州)网络有限公司 Method of adjustment, device, mobile terminal and the storage medium of scene of game picture
CN109589601A (en) * 2018-12-10 2019-04-09 网易(杭州)网络有限公司 Virtual aim mirror control method and device, electronic equipment and storage medium
CN110597449A (en) * 2019-09-12 2019-12-20 腾讯科技(深圳)有限公司 Prop using method, device, terminal and storage medium based on virtual environment
WO2020168680A1 (en) * 2019-02-22 2020-08-27 网易(杭州)网络有限公司 Game role control method, apparatus and device and storage medium
CN112354181A (en) * 2020-11-30 2021-02-12 腾讯科技(深圳)有限公司 Open mirror picture display method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN113318431B (en) 2023-12-19

Similar Documents

Publication Publication Date Title
US20230347254A1 (en) Method and apparatus for providing online shooting game
KR102619439B1 (en) Methods and related devices for controlling virtual objects
JP7334347B2 (en) Virtual environment screen display method, device, equipment and program
CN113398601B (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
CN110917623B (en) Interactive information display method, device, terminal and storage medium
WO2023045375A1 (en) Method and apparatus for spectating game after character is killed, and electronic device and storage medium
CN110975283A (en) Processing method and device of virtual shooting prop, storage medium and electronic device
US20230364502A1 (en) Method and apparatus for controlling front sight in virtual scenario, electronic device, and storage medium
CN113559507A (en) Information processing method, information processing apparatus, storage medium, and electronic device
CN111249726B (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN112316429A (en) Virtual object control method, device, terminal and storage medium
CN113318431B (en) In-game aiming control method and device
CN112755524B (en) Virtual target display method and device, electronic equipment and storage medium
US9616340B2 (en) Computer device, storage medium, and method of controlling computer device
CN113996051A (en) Method, device, equipment and storage medium for canceling and releasing game skills
CN113384883B (en) Display control method and device in game, electronic equipment and storage medium
CN115645923A (en) Game interaction method and device, terminal equipment and computer-readable storage medium
CN111589129B (en) Virtual object control method, device, equipment and medium
CN114225393A (en) Game resource acquisition method, device, medium, device and program product
CN113687761A (en) Game control method and device, electronic equipment and storage medium
CN112121433A (en) Method, device and equipment for processing virtual prop and computer readable storage medium
CN113730909B (en) Aiming position display method and device, electronic equipment and storage medium
WO2024051422A1 (en) Method and apparatus for displaying virtual prop, and device, medium and program product
Quek et al. Obscura: A mobile game with camera based mechanics
CN112843708B (en) Configuration method and device of game equipment, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant