CN113318431B - In-game aiming control method and device - Google Patents


Info

Publication number
CN113318431B
CN113318431B (application CN202110616223.0A)
Authority
CN
China
Prior art keywords
virtual
touch operation
game
user interface
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110616223.0A
Other languages
Chinese (zh)
Other versions
CN113318431A (en)
Inventor
刘奕凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110616223.0A
Publication of CN113318431A
Application granted
Publication of CN113318431B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements where the contact surface is also a display device, e.g. touch screens
    • A63F13/219 Input arrangements for aiming at specific areas on the display, e.g. light-guns
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions
    • A63F2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Input arrangements using a touch screen
    • A63F2300/30 Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A63F2300/80 Features specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Abstract

The invention discloses an in-game aiming control method and device. The method includes the following steps: displaying a first field-of-view picture through a graphical user interface; in response to a first touch operation acting on the graphical user interface, acquiring a first position of the touch point of the first touch operation and adjusting the first field-of-view picture to a second field-of-view picture determined from the first position; and, in response to the dwell time of the touch point at the first position reaching a first preset duration, opening a virtual scope mounted on the virtual firearm. The invention solves the prior-art problem in shooting games that a complex scope-opening action causes the post-opening aim point to drift.

Description

In-game aiming control method and device
Technical Field
The invention relates to the field of in-game control, in particular to an in-game aiming control method and device.
Background
Shooting games are games that provide a virtual firearm in the game interface for the player to shoot with, and generally include first-person shooters and third-person shooters. "Opening the scope" means activating the virtual scope mounted on the virtual firearm when shooting with it. In most current first-person shooters the scope is opened with a fixed on-screen button, and the first-person camera always moves with the character's facing direction, so opening the scope is one step in a longer camera-movement flow: the player may spot a target worth scoping while panning the view, but after tapping the scope button must re-select the view's centre point on the screen, shoot, and then close the scope again. In addition, under a high-magnification view, the finger lifting off and landing on the phone screen during this sequence can shift the aim point. For the quick open-shoot-close sequence, the scope opening and closing procedure is therefore comparatively cumbersome.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide an in-game aiming control method and device, which at least solve the prior-art problem in shooting games that a complex scope-opening action causes the post-opening aim point to drift.
According to one aspect of an embodiment of the present invention, an in-game aiming control method is provided. A graphical user interface is provided through a terminal device; the graphical user interface includes at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm. The method includes: displaying a first field-of-view picture through the graphical user interface; in response to a first touch operation acting on the graphical user interface, acquiring a first position of the touch point of the first touch operation and adjusting the first field-of-view picture to a second field-of-view picture determined from the first position; and, in response to the dwell time of the touch point at the first position reaching a first preset duration, opening the virtual scope mounted on the virtual firearm.
Optionally, the method further includes: when the dwell time of the touch point at the first position has not yet reached the first preset duration, detecting whether the first touch operation has left the touch point; and, when the first touch operation is detected to have left the touch point, stopping the timing of the first touch operation's dwell at the touch point.
Optionally, after the first field-of-view picture is adjusted to the second field-of-view picture determined from the first position, the method further includes: displaying a timing area, where the timing area shows the relation between the dwell time and the first preset duration.
Optionally, before displaying the timing area, the method further includes: comparing the dwell time with a second preset duration; and entering the step of displaying the timing area when the dwell time exceeds the second preset duration, where the second preset duration is shorter than the first preset duration.
Optionally, after the virtual scope mounted on the virtual firearm is opened in response to the dwell time of the touch point at the first position reaching the first preset duration, the method further includes: when the first touch operation is detected to move away from the touch point, adjusting the viewing angle of the virtual scope according to the movement of the first touch operation; and closing the virtual scope mounted on the virtual firearm when the first touch operation is detected to end.
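The optional behaviours above can be read as a small touch-state machine: a touch that comes to rest starts a dwell timer, a timing area appears once the shorter second preset duration passes, the scope opens at the first preset duration, dragging afterwards fine-tunes aim, and lifting the finger closes the scope. A minimal sketch follows; it is not the patent's code, and all names and threshold values are illustrative assumptions.

```python
import time

class ScopeController:
    """Sketch of the dwell-to-open-scope flow described above.
    Thresholds and attribute names are assumptions for illustration."""

    OPEN_DELAY = 0.6      # "first preset duration", seconds (assumed)
    TIMER_UI_DELAY = 0.2  # "second preset duration" < first (assumed)

    def __init__(self, now=time.monotonic):
        self.now = now
        self.scope_open = False
        self.showing_timer = False   # whether the "timing area" is on screen
        self.dwell_start = None      # when the touch point came to rest
        self.scope_angle = (0.0, 0.0)

    def on_touch_down(self, pos):
        self.dwell_start = self.now()

    def on_touch_move(self, pos):
        if self.scope_open:
            self.scope_angle = pos        # adjust the scope's viewing angle
        else:
            self.dwell_start = self.now() # moving resets the dwell timer

    def on_tick(self):
        # Called every frame while the touch is held.
        if self.scope_open or self.dwell_start is None:
            return
        dwell = self.now() - self.dwell_start
        self.showing_timer = dwell >= self.TIMER_UI_DELAY
        if dwell >= self.OPEN_DELAY:
            self.scope_open = True        # open the virtual scope

    def on_touch_up(self):
        # Lifting the finger stops the dwell timing and closes the scope.
        self.dwell_start = None
        self.showing_timer = False
        self.scope_open = False
```

Injecting the clock (`now`) keeps the sketch testable; a game engine would drive `on_tick` from its frame loop instead.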
According to another aspect of an embodiment of the present invention, another in-game aiming control method is provided. A graphical user interface is provided through a terminal device; the graphical user interface includes at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm. The method includes: displaying a first field-of-view picture through the graphical user interface; in response to a first touch operation acting on the graphical user interface, acquiring the current position of the touch point of the first touch operation and adjusting the first field-of-view picture to a second field-of-view picture determined from the current position; and, while the first touch operation continues, opening the virtual scope mounted on the virtual firearm in response to the dwell time of the target area in the game scene corresponding to a preset position in the second field-of-view picture reaching the first preset duration.
Optionally, the graphical user interface includes a second virtual object, the second virtual object being a virtual object in a different camp from the first virtual object, and the responding step includes: responding when the second virtual object has stayed in the game-scene region corresponding to the preset position in the second field-of-view picture for the first preset duration.
Optionally, the responding step includes: acquiring the dwell time of a first scene area in the game scene corresponding to the preset position in the second field-of-view picture; and responding when that dwell time reaches the first preset duration.
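In this second method the timer tracks not the touch point but whatever the preset screen position (for example the crosshair centre) maps to in the game scene, whether an enemy virtual object or a scene area. A minimal sketch, with `resolve_target` standing in as a hypothetical ray-cast hook and the threshold an assumption:

```python
import time

class CrosshairDwellScope:
    """Sketch of the crosshair-dwell variant: while the first touch
    operation continues, the scope opens once the target under the
    preset screen position has stayed the same for the first preset
    duration. Names and threshold are illustrative assumptions."""

    OPEN_DELAY = 0.6  # "first preset duration", seconds (assumed)

    def __init__(self, resolve_target, now=time.monotonic):
        self.resolve_target = resolve_target  # screen pos -> target id or None
        self.now = now
        self.scope_open = False
        self.current_target = None
        self.target_since = None

    def on_frame(self, crosshair_pos):
        target = self.resolve_target(crosshair_pos)
        if target != self.current_target:
            self.current_target = target   # target changed: restart the dwell
            self.target_since = self.now()
        elif target is not None and self.now() - self.target_since >= self.OPEN_DELAY:
            self.scope_open = True         # open the virtual scope
```

Resetting the dwell whenever the resolved target changes matches the claim that the dwell is measured per target area, not per touch position.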
According to another aspect of an embodiment of the present invention, an in-game aiming control device is also provided. A graphical user interface is provided through a terminal device; the graphical user interface includes at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm. The device includes: a first display module for displaying a first field-of-view picture through the graphical user interface; a first acquisition module for acquiring, in response to a first touch operation acting on the graphical user interface, a first position of the touch point of the first touch operation and adjusting the first field-of-view picture to a second field-of-view picture determined from the first position; and a first control module for opening the virtual scope mounted on the virtual firearm in response to the dwell time of the touch point at the first position reaching a first preset duration.
According to another aspect of an embodiment of the present invention, another in-game aiming control device is also provided. A graphical user interface is provided through a terminal device; the graphical user interface includes at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm. The device includes: a second display module for displaying the first field-of-view picture through the graphical user interface; a second acquisition module for acquiring, in response to a first touch operation acting on the graphical user interface, the current position of the touch point of the first touch operation and adjusting the first field-of-view picture to a second field-of-view picture determined from the current position; and a second control module for opening the virtual scope mounted on the virtual firearm when, while the first touch operation continues, the dwell time of the target area in the game scene corresponding to a preset position in the second field-of-view picture reaches the first preset duration.
According to another aspect of the embodiments of the present invention, a computer-readable storage medium is also provided, including a stored program, where, when the program runs, it controls the device on which the computer-readable storage medium resides to execute any of the above in-game aiming control methods.
According to another aspect of the embodiments of the present invention, a processor is also provided for running a program, where the program executes any of the above in-game aiming control methods.
In the embodiments of the invention, a first field-of-view picture is displayed through a graphical user interface; in response to a first touch operation acting on the graphical user interface, a first position of the touch point of the first touch operation is acquired and the first field-of-view picture is adjusted to a second field-of-view picture determined from the first position; and, in response to the dwell time of the touch point at the first position reaching a first preset duration, the virtual scope mounted on the virtual firearm is opened. This reduces the number of touch operations in a shooting game: the user can open the scope quickly with a single touch operation, and drift of the aim point after the scope opens, caused by excessive touch operations, is avoided. This solves the prior-art problem that a complex scope-opening action in a game causes the post-opening aim point to drift.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of an in-game aiming control method according to embodiment 1 of the present invention;
FIG. 2 is a schematic diagram of a first person perspective in accordance with an embodiment of the present invention;
FIG. 3 is a schematic diagram of a third-person perspective according to an embodiment of the present invention;
FIG. 4 is a schematic illustration of a timing area according to an embodiment of the invention;
FIG. 5 is a schematic diagram of an opened virtual scope according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a graphical user interface according to an embodiment of the invention;
FIG. 7 is a flow chart of an in-game aiming control method according to embodiment 2 of the present invention;
FIG. 8 is a schematic view of an in-game aiming control device according to embodiment 3 of the present invention;
fig. 9 is a schematic diagram of an in-game aiming control device according to embodiment 4 of the present invention.
Detailed Description
So that those skilled in the art will better understand the present invention, the technical solution in the embodiments of the invention is described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The following is an explanation of the terms involved in this application:
virtual scene
A virtual scene is the scene that an application displays (or provides) when running on a terminal or server. Optionally, the virtual scene is a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be two-dimensional or three-dimensional, and the virtual environment may be sky, land, ocean, and the like, where the land includes environmental elements such as deserts and cities. The virtual scene is the scene in which the complete game logic of the user-controlled virtual object plays out. For example, in a sandbox 3D shooting game, the virtual scene is a 3D game world in which the player controls a virtual object to fight, and an exemplary virtual scene may include at least one of: mountains, flat land, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles. In a 2D card game, the virtual scene is a scene for showing played cards or the virtual objects corresponding to them, and an exemplary virtual scene may include an arena, a battlefield, or other "field" elements that can display the state of card play. For a 2D or 3D multiplayer online tactical game, the virtual scene is a 2D or 3D terrain scene in which virtual objects fight, and an exemplary virtual scene may include mountains, lanes, rivers, classrooms, tables and chairs, podiums, and the like.
Virtual object
A virtual object is a dynamic object that can be controlled in the virtual scene. Optionally, the dynamic object may be a virtual character, a virtual animal, a cartoon character, or the like. The virtual object is a character that a player controls through an input device, an artificial intelligence (AI) trained for combat in the virtual environment, or a non-player character (NPC) placed in the virtual-environment battle. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a battle is preset or dynamically determined according to the number of clients joining the battle; this is not limited in the embodiments of the present application. In one possible implementation, the user can control a virtual object to move in the virtual scene, for example to run, jump, or crawl, and can also control it to fight other virtual objects using the skills, virtual props, and so on provided by the application.
Field-of-view picture
The virtual scene includes a first virtual object and a second virtual object, which may belong to different teams. The terminal displays the virtual scene, reflecting the game scene, through the game picture. Optionally, the game picture is the picture of the virtual scene observed from a specific viewing angle, the viewing angle being the angle at which the camera model in the virtual scene observes it.
Optionally, the camera model automatically follows the virtual object in the virtual environment; that is, when the position of the virtual object in the virtual environment changes, the camera model changes with it and always stays within a preset distance of the virtual object. Optionally, the relative positions of the camera model and the virtual object do not change during this automatic following.
The camera model is a three-dimensional model located around the virtual object in the virtual environment. When a first-person perspective is used, it is located near or at the head of the virtual object; when a third-person perspective is used, it can be located behind the virtual object and bound to it, or at any position a preset distance away, from which the virtual object in the virtual environment can be observed from different angles. Optionally, when the third-person perspective is an over-the-shoulder view, the camera model is positioned behind the virtual object (for example, at the head and shoulders of a virtual character). Optionally, besides the first-person and third-person perspectives, other viewing angles such as a top-down view can be used; with a top-down view, the camera model may be located above the head of the virtual object and observe the virtual environment from the air. Optionally, the camera model is not actually rendered, i.e. it does not appear in the virtual environment displayed in the user interface.
The camera model is described here as being located at an arbitrary position a preset distance from the virtual object. Optionally, one virtual object corresponds to one camera model, and the camera model can rotate with the virtual object as the rotation centre, for example around any point of the virtual object. During this rotation the camera model both changes angle and shifts position, while its distance to the rotation centre remains unchanged; that is, the camera model moves on the surface of a sphere centred on the rotation centre. The rotation centre may be any point of the virtual object, such as the head or the torso, or any point around it; the embodiments of the present application do not limit this. Optionally, when the camera model observes the virtual object, the centre of its viewing angle points from the camera model's point on the sphere toward the sphere's centre.
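The sphere-of-fixed-radius camera described above can be computed directly with spherical coordinates. A minimal illustrative sketch (angles in radians, y axis up; not the patent's implementation):

```python
import math

def orbit_camera(center, radius, yaw, pitch):
    """Place a camera on a sphere of fixed radius around the rotation
    centre: rotating changes both the camera's angle and its position
    while its distance to the centre stays constant, and the view
    direction points from the camera's point on the sphere toward the
    sphere's centre."""
    cx, cy, cz = center
    position = (
        cx + radius * math.cos(pitch) * math.sin(yaw),
        cy + radius * math.sin(pitch),
        cz + radius * math.cos(pitch) * math.cos(yaw),
    )
    # Unit vector from the camera toward the rotation centre.
    look_dir = tuple((c - p) / radius for c, p in zip(center, position))
    return position, look_dir
```

For example, with yaw and pitch both zero the camera sits one radius behind the centre along +z, looking back along -z; any yaw/pitch pair keeps the camera-to-centre distance equal to the radius.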
Optionally, the camera model may also observe the virtual object at a preset angle in different directions of the virtual object. Optionally, the first virtual object is a virtual object controlled by a user through a terminal, and the second virtual object includes at least one of a virtual object controlled by another user and a virtual object controlled by a background server.
Optionally, the virtual environment screen in the embodiment of the present application is a screen for observing the virtual environment from the perspective of the first virtual object.
The in-game aiming control method in one embodiment of the present disclosure may run on a terminal device or on a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed on a cloud interaction system, which includes the server and a client device.
In an alternative embodiment, various cloud applications, such as cloud games, may run on the cloud interaction system. Taking a cloud game as an example, cloud gaming is a game mode based on cloud computing. In cloud-game operation, the body that runs the game program and the body that presents the game picture are separated: the storage and execution of the in-game aiming control method are completed on the cloud game server, while the client device only sends and receives data and presents the game picture. The client device may, for example, be a display device with data-transmission capability close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer, but the terminal device performing the information processing is the cloud game server. When playing, the player operates the client device to send an operation instruction to the cloud game server; the server runs the game according to the instruction, encodes and compresses data such as the game picture, and returns it to the client device over the network; finally the client device decodes the data and outputs the game picture.
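The cloud-game round trip described above (client sends an operation instruction, server runs the game and encodes the picture, client decodes and displays it) can be sketched as a toy pipeline. Everything here, including the `CloudGameServer` stand-in and the use of zlib as the codec, is an illustrative assumption, not the patent's protocol:

```python
import zlib

def encode(frame: bytes) -> bytes:
    """Stand-in for the server-side encode/compress step."""
    return zlib.compress(frame)

def decode(packet: bytes) -> bytes:
    """Stand-in for the client-side decode step."""
    return zlib.decompress(packet)

class CloudGameServer:
    """Toy cloud game server: holds the game state, applies client
    operation instructions, and renders a 'frame' (here, a string)."""
    def __init__(self):
        self.state = {"x": 0}

    def apply(self, op):
        if op == "move_right":
            self.state["x"] += 1

    def render(self) -> bytes:
        return f"frame x={self.state['x']}".encode()

def cloud_game_frame(op, server):
    """One client-server round trip of the flow described above."""
    server.apply(op)                  # client sends an operation instruction
    packet = encode(server.render())  # server runs the game, encodes the picture
    return decode(packet)             # client decodes and displays the frame
```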
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game picture. It interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in various ways; for example, the interface may be rendered on the terminal's display screen or provided by holographic projection. The local terminal device may, for example, include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the screen.
Example 1
In accordance with an embodiment of the present invention, an embodiment of an in-game aiming control method is provided. Note that the steps shown in the flowcharts in the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in a different order from the one given here.
Fig. 1 is a flow chart of an in-game aiming control method according to embodiment 1 of the present invention. As shown in Fig. 1, the method includes the following steps:
step S102, displaying a first view screen through a graphical user interface.
The graphical user interface is provided by a terminal device and includes at least part of the game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm.
The graphical user interface is a computer user interface displayed graphically by a terminal device; the terminal device may be the aforementioned local terminal device or the client device in the aforementioned cloud interaction system.
The first field-of-view picture is the view of the game scene that the graphical user interface provides to the user.
Step S104: in response to a first touch operation acting on the graphical user interface, acquire a first position of the touch point of the first touch operation, and adjust the first field-of-view picture to a second field-of-view picture determined from the first position.
The first field-of-view picture and the second field-of-view picture are different views produced by moving the virtual camera in the game.
In an alternative embodiment, the first touch operation may be an operation of moving the virtual camera.
In another alternative embodiment, the first touch operation may be a sliding operation or any other operation capable of adjusting the game's field of view. For example, the player performs a sliding operation on the device's touch screen, which moves the in-game virtual camera and thereby adjusts the field of view from the first field-of-view picture to the second field-of-view picture.
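The slide-to-adjust-view step typically maps the touch delta to a change in camera yaw and pitch. A minimal sketch; the sensitivity value and the pitch clamp are assumptions, as real games tune these per device and user setting:

```python
import math

def apply_touch_delta(yaw, pitch, dx, dy, sensitivity=0.005):
    """Map a touch-slide delta (in pixels) to new camera yaw/pitch
    angles, turning the first field-of-view picture into the second.
    Yaw wraps around a full circle; pitch is clamped so the camera
    cannot flip over the top."""
    yaw = (yaw + dx * sensitivity) % (2 * math.pi)
    pitch = max(-math.pi / 2, min(math.pi / 2, pitch + dy * sensitivity))
    return yaw, pitch
```

Feeding the result to a camera-placement routine (such as a spherical orbit around the character) yields the adjusted field-of-view picture.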
In yet another alternative embodiment, the first touch operation may act on an aiming control in the game scene, on a blank area in the game scene, or on another control capable of adjusting the field of view, such as an eye-shaped free-look control.
The graphical user interface may be presented from a first-person perspective or a third-person perspective. The first-person perspective is mainly used in first-person shooting (FPS) games, which are played from the user's own subjective viewpoint, as shown in fig. 2. A third-person shooting (TPS) game places more emphasis on the sense of action, with the controlled character visible on the game screen, as shown in fig. 3, where reference numeral 1 indicates the stay of the first touch operation on the graphical user interface.
The present solution does not limit the position at which the first touch operation is performed. In an alternative embodiment, the first touch operation may act on any position in the graphical user interface. In another alternative embodiment, the game field of view may be adjusted from the first view picture to the second view picture in response to a first touch operation performed on the game interface by the user's finger or a mouse cursor, where the user is the game player performing the first touch operation.
Step S106, in response to the stay duration of the touch point at the first position reaching a first preset duration, controlling the virtual sighting telescope configured on the virtual firearm to be turned on.
Specifically, the touch point is the position at which the user's finger or mouse cursor performs the touch operation on the game interface.
In an alternative embodiment, in response to a first touch operation acting on the graphical user interface, the game view is adjusted from the first view picture to the second view picture. When the first touch operation stops moving under the second view picture, the point at which the user's finger stays on the graphical user interface can be obtained and the stay duration of the finger at that touch point recorded. If the stay duration reaches the first preset duration, the virtual sighting telescope configured on the virtual firearm is turned on, making it convenient for the user to observe distant enemies in the scene through a high-magnification lens.
The first preset duration can be freely set by the user. The virtual firearm is a virtual weapon used for aimed shooting in the shooting-game scene; the virtual sighting telescope is a virtual aiming lens mounted on the virtual firearm used by the user in the shooting-game scene.
In an alternative embodiment, when the stay duration is less than the first preset duration, the user may merely have wanted to adjust the first view picture to the second view picture, or may have touched by mistake, and does not wish to open the scope; in that case it can be determined that the virtual sighting telescope configured on the virtual firearm is not turned on, avoiding the inconvenience that an unwanted scope would cause in the game. Specifically, when the stay duration is less than the first preset duration, the stay duration of the first touch operation at the touch point continues to be recorded until the finger leaves the game interface or starts to move. If the total stay duration at the touch point is still less than the first preset duration, the user only needed to adjust the game view from the first view picture to the second view picture, and the step of opening the virtual sighting telescope is not performed.
Further, when the stay duration reaches the first preset duration, the user needs not only to adjust the first view picture to the second view picture but also to open the virtual sighting telescope to search for enemies or shoot accurately; in that case it can be determined that the virtual sighting telescope configured on the virtual firearm should be turned on. Opening the scope automatically in this situation avoids requiring the user to tap the screen again after adjusting the view, which would cause the aiming point to shift. By comparing the stay duration with the first preset duration, the user can perform the scope-opening operation simply by keeping the press held after adjusting to the desired viewing angle, which reduces the steps needed to use the virtual sighting telescope and improves the user's game experience.
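The dwell condition of step S106 can be sketched over a recorded touch trace; a minimal illustration with assumed names (`scope_opens`, a 0.5 s default) rather than the patent's implementation:

```python
FIRST_PRESET = 0.5  # seconds; an assumed value — the patent leaves it user-configurable

def scope_opens(samples, first_preset=FIRST_PRESET):
    """samples: list of (timestamp, x, y) points of one touch operation.

    Returns True once the touch point has stayed at its final position for
    at least first_preset seconds, i.e. the condition of step S106.
    """
    if not samples:
        return False
    last_t, last_x, last_y = samples[0]
    dwell_start = last_t
    for t, x, y in samples[1:]:
        if (x, y) != (last_x, last_y):  # touch point moved: restart the dwell
            dwell_start = t
            last_x, last_y = x, y
        last_t = t
    return (last_t - dwell_start) >= first_preset
```

If the touch keeps moving until it lifts, the dwell never accumulates and the scope stays closed, matching the misoperation case above.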
In an alternative embodiment, the first preset duration can be set shorter according to the user's operating habits and needs. In that case, the user can open the virtual sighting telescope configured on the virtual firearm with only a short touch time, which makes opening the scope more responsive and allows the user to switch quickly while shooting, giving a good game experience.
In another alternative embodiment, the first preset duration may be set longer according to the user's operating habits and needs. In that case, a longer touch time is required to open the virtual sighting telescope configured on the virtual firearm, which gives the user more time to react, prevents accidental operation, and lets novice players enjoy a good game experience.
In this embodiment, a first view picture is displayed through a graphical user interface; in response to a first touch operation acting on the graphical user interface, a first position of the touch point of the first touch operation is obtained and the first view picture is adjusted to a second view picture determined according to the first position; and in response to the stay duration of the touch point at the first position reaching the first preset duration, the virtual sighting telescope configured on the virtual firearm is turned on. This reduces the number of touch operations in a shooting game, allowing the user to open the scope quickly with a single touch operation, and avoids the aiming point shifting after the scope is opened due to excessive touch operations, thereby solving the prior-art technical problem that complex scope-opening actions in a game cause the post-opening aiming point to deviate.
Optionally, the method further comprises: when the stay duration of the touch point at the first position does not reach the first preset duration, detecting whether the first touch operation has left the touch point; and when the first touch operation is detected to have left the touch point, stopping recording the stay duration of the first touch operation at the touch point.
Specifically, the first touch operation leaving the touch point covers two cases: in the first, the user's finger does not leave the game interface but merely moves across it; in the second, the user's finger leaves the game interface entirely.
In an alternative embodiment, when the stay duration is less than the first preset duration, it can be detected whether the first touch operation has left the touch point. When it has, the user has finished switching the first view picture to the second view picture and does not need to open the scope, so recording of the stay duration at the touch point is stopped and the step of opening the virtual sighting telescope on the virtual firearm is canceled. This gives the user a way to back out of opening the scope: when the user does not want the virtual sighting telescope opened, the opening operation can be aborted simply by ending the first touch operation on the graphical user interface, improving the user's game experience.
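The record-then-cancel behavior can be sketched as a small timer object; `DwellTimer` and its `tick` interface are hypothetical names for this illustration, not the patent's API:

```python
class DwellTimer:
    """Tracks how long the first touch operation stays at one touch point."""

    def __init__(self, first_preset):
        self.first_preset = first_preset
        self.elapsed = 0.0
        self.recording = True

    def tick(self, dt, touch_left_point):
        """Advance time by dt seconds; return True when the scope should open."""
        if touch_left_point:
            # The touch moved away or lifted: stop recording the stay
            # duration, which cancels the scope-opening step.
            self.recording = False
        elif self.recording:
            self.elapsed += dt
        return self.recording and self.elapsed >= self.first_preset
```

Once `touch_left_point` is seen, later ticks can never open the scope, matching the cancellation described above.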
Optionally, after controlling the first field of view screen to be adjusted to the second field of view screen determined from the first position, the method further comprises: and displaying a timing area, wherein the timing area is used for displaying the relation between the stay time and the first preset time.
As shown in fig. 4, the timing area 2 may be displayed as an arc-shaped timing slot; the timing slot may also take other shapes, such as a circular or bar-shaped timing slot. The timing area may also be displayed in digital form, for example as a countdown in the middle of or directly above the screen.
The timing area shows the relationship between the stay duration and the first preset duration. Taking fig. 4 as an example again, the whole arc-shaped timing slot represents the first preset duration, the black area represents the stay duration of the first touch operation, and the black area gradually grows as the stay duration of the first touch operation increases.
In an alternative embodiment, the timing area may be displayed as a timing progress bar or as a countdown, so that the user can decide, from the timing shown in the area, whether to stop the first touch operation or continue it. If the user wants to open the virtual sighting telescope, the timing shown in the area indicates whether the first touch operation still needs to be held: when the progress bar of the timing area has completed, the stay duration has reached the first preset duration and the virtual sighting telescope is opened, so the first touch operation no longer needs to be held; when the progress bar has not completed, the stay duration is still less than the first preset duration and the virtual sighting telescope has not been opened, so the first touch operation can be held until the stay duration reaches the first preset duration and the scope-opening operation is performed.
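The fill level of the arc-shaped timing slot is simply the ratio of the stay duration to the first preset duration; a small sketch with an assumed helper name:

```python
def timing_fill(stay, first_preset):
    """Fraction (0.0 to 1.0) of the timing slot to blacken for a given stay duration."""
    if first_preset <= 0:
        return 1.0  # degenerate configuration: treat the slot as already full
    return min(1.0, stay / first_preset)
```

A renderer would map this fraction onto the sweep angle of the arc (or the length of a bar-shaped slot).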
Optionally, before displaying the timing area, the method further comprises: comparing the residence time with a second preset time; and when the residence time length exceeds a second preset time length, entering a step of displaying a timing area, wherein the second preset time length is smaller than the first preset time length.
The second preset time period may be set by the user.
In an alternative embodiment, the second preset duration may be set shorter according to the user's operating habits and needs. In that case, only a short touch time is needed to reach the step of displaying the timing area, so the user can decide in advance whether the virtual sighting telescope needs to be opened and can switch quickly while shooting, enjoying a good game experience.
In another alternative embodiment, the second preset duration may be set longer. A longer touch time is then needed before the timing area is displayed, which gives the user more time to think, prevents accidental operation, and lets novice players enjoy a good game experience.
In an alternative embodiment, before the timing area is displayed, the stay duration is compared with the second preset duration, and the timing area is displayed once the stay duration exceeds the second preset duration, so that the user can decide from the timing area whether to open the virtual sighting telescope. As shown in fig. 4, displaying the timing area 2 lets the user conveniently choose, according to its timing, whether to open the virtual sighting telescope mounted on the virtual firearm 3. As shown in fig. 5, when the stay duration of the first touch operation exceeds the first preset duration, the step of opening the virtual sighting telescope 4 is performed. This process reduces the complexity of opening the virtual sighting telescope, allowing the user to aim at an enemy quickly through it and improving the user's game experience.
For example, the first preset duration may be set to 3 seconds and the second preset duration to 1 second. When the user's finger has stayed on the game interface for more than 1 second, a timing slot (i.e. the timing area) is displayed, and the user decides from it whether to keep holding. If the user wants to open the virtual sighting telescope, the finger stays until the timing slot is full; the slot being full means the stay duration has reached the first preset duration. If the user does not want to open the virtual sighting telescope, the finger should leave the screen or start moving before the timing slot is full.
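The two thresholds of this worked example (1 s to show the slot, 3 s to open the scope) partition the stay duration into three stages; the function and stage names below are assumptions for illustration:

```python
FIRST_PRESET = 3.0   # seconds until the scope opens (value from the example above)
SECOND_PRESET = 1.0  # seconds until the timing slot appears (value from the example)

def dwell_stage(stay, first=FIRST_PRESET, second=SECOND_PRESET):
    """Return which UI stage the current stay duration has reached."""
    if stay >= first:
        return "scope_open"          # timing slot full: open the virtual scope
    if stay >= second:
        return "timing_slot_shown"   # let the user decide whether to keep holding
    return "view_adjust_only"        # too short: treat as a plain view change
```

Checking the longer threshold first keeps the stages mutually exclusive since the second preset duration is smaller than the first.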
Optionally, after controlling the virtual sighting telescope configured on the virtual firearm to be turned on in response to the stay duration of the touch point at the first position reaching the first preset duration, the method further includes: when the first touch operation is detected to move away from the touch point, adjusting the viewing angle of the virtual sighting telescope according to the movement of the first touch operation; and when the first touch operation is detected to be canceled, closing the virtual sighting telescope configured on the virtual firearm.
In an alternative embodiment, once the stay duration has reached the first preset duration and the virtual sighting telescope configured on the virtual firearm has been opened, when the first touch operation is detected to move away from the touch point, the viewing angle of the virtual sighting telescope can be adjusted according to that movement. This lets the user follow an enemy's movement with the scope, aim more precisely, shoot successfully, and thus enjoy a better game experience.
An application scenario of the present embodiment is provided below. A schematic of a graphical user interface is shown in fig. 6. The interface includes a game scene and virtual objects located in the game scene, which are not shown in fig. 6. The player controls the virtual object to move in the game scene. The interface in FIG. 6 shows a variety of controls that are displayed over the game scene.
Here, 1000 is a backpack control; a touch operation acting on this control triggers the display of the backpack interface in the graphical user interface. The backpack interface is used to view the item props in the backpack, which may include, but are not limited to, props such as medicines, weapons, accessories, and armor. In an alternative embodiment, the edge of the backpack control carries an edge line indicating how many items the backpack holds. The touch operation on the backpack control may be a click, double click, long press, slide, or other operation.
1002 is a movement control, and the shaded circle is the joystick in the movement control. In an alternative embodiment, the movement control is configured with a movement response area, which may be the same size as the movement control, that is, the area of the graphical user interface corresponding to the movement control is the movement response area. In another alternative embodiment, the movement response area may be larger than the movement control, that is, the movement control sits inside the movement response area; for example, the graphical user interface is split down the middle into left and right halves, with the left half serving as the movement response area of the movement control. A movement control instruction is triggered by a touch operation acting on the movement response area, and the movement direction of the virtual object is controlled according to that instruction. The touch operation acting on the movement response area may be a click, double click, long press, slide, or other operation.
When the touch operation triggering the movement control instruction is detected to meet a preset condition, a "run" instruction is triggered, and the virtual object is controlled to keep running automatically in the game scene. In an alternative embodiment, as shown in fig. 5, the preset condition is met when the touch operation acting on the joystick area slides along a predetermined direction for a predetermined distance or into a predetermined area (e.g., the position indicated by the arrow above the joystick). For example, when the touch operation drags the joystick upward, a run control 1006 appears, and when the run control is triggered, the virtual object enters the fast-running mode. The 1030 control is also a running control: when it is clicked, the virtual object likewise enters the fast-running mode. In another alternative embodiment, the preset condition is that the touch operation acting on the joystick area lasts for a preset duration, reaches a preset pressure, or the like.
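The distance-based variant of the run condition can be sketched in a few lines; the threshold value and the screen-coordinate convention are assumptions of this illustration:

```python
RUN_DISTANCE = 60.0  # pixels the joystick must be dragged upward (assumed value)

def run_triggered(drag_dx, drag_dy, threshold=RUN_DISTANCE):
    """True when the joystick drag reaches the run control above it.

    Screen coordinates are assumed to grow downward, so an upward drag
    has a negative dy.
    """
    return -drag_dy >= threshold
```

A duration- or pressure-based condition would replace the distance test with a check on the press time or force.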
1004 and 1014 are both attack controls. An attack instruction is triggered by a touch operation acting on the attack response area of an attack control, and the virtual character is controlled to perform the attack in the game scene according to that instruction. The attack behavior triggered by an attack control corresponds to the current equipment state of the virtual object: when the virtual object carries a ranged weapon, the attack behavior triggered is shooting, i.e. the left hand can shoot by triggering the 1004 control and the right hand by triggering the 1014 control; when the virtual character carries a melee weapon, the attack behavior triggered is a slash, and so on. In an alternative embodiment, only one of attack control 1004 and attack control 1014 is displayed. The display position of an attack control can be adjusted by a position-setting instruction, which may be triggered on the settings interface or during a match. The touch operation acting on an attack control may be a click, double click, long press, slide, or other operation.
1008 and 1010 are posture controls; posture adjustment instructions are triggered by touch operations acting on them to adjust the posture of the virtual object in the game scene. Triggering the 1008 control positions the weapon at the left arm of the virtual object and makes the virtual object lean out to the left; triggering the 1010 control positions the weapon at the right arm and makes the virtual object lean out to the right. 1008 and 1010 are typically used to control the virtual object's posture when it is taking cover behind an obstacle such as a wall or a large tree. The touch operation acting on a posture control may be a click, double click, long press, slide, or other operation.
1012 is a weapon slot; triggering it switches the weapon the player is using. The word "burst" on the left indicates the current firing mode, in which multiple rounds are fired in succession. Besides the "burst" mode, it is also possible to switch to a "single-shot" mode in which one bullet is fired at a time. The shaded bar below shows the number of rounds left in the weapon. When the number of rounds reaches zero, the weapon can reload automatically, restoring the ammunition to its maximum. In addition, the 1016 control is a reload control: when triggered, rounds are added to the weapon currently in use, and after reloading the ammunition is at its maximum. The touch operation acting on the weapon slot may be a click, double click, long press, slide, or other operation.
The 1018 and 1020 controls are action controls; touch operations acting on them make the virtual object perform the corresponding actions in the game scene. For example, triggering 1018 makes the virtual object crouch, and triggering 1020 makes it go prone. The 1022 control makes the virtual object perform a jump. The touch operation acting on an action control may be a click, double click, long press, slide, or other operation.
The 1024 control is used to put the current game scene into scope-open mode. When a first touch operation acting on the 1024 control is detected, the scene enters scope-open mode, in which the aiming lens is displayed together with the game scene the virtual object sees through it; shooting accuracy can be improved in this mode. When a second touch operation on the 1024 control is detected, the scene exits scope-open mode. In an alternative embodiment, the first and second touch operations are independent touch operations: for example, clicking the 1024 control puts the current game scene into scope-open mode, and clicking it again exits the mode, restoring the display to the game scene in which the virtual camera follows the virtual object. In another alternative embodiment, the first and second touch operations may be the start and end of a single touch operation: for example, when a touch is detected on the 1024 control, the scene enters scope-open mode; while the press is held, the scene stays in scope-open mode; and when the touch triggering the operation is detected to leave the touch detection area, the scene exits scope-open mode.
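The two activation styles of the 1024 control — tap-to-toggle and press-and-hold — differ only in how press/release events map to the mode flag; a sketch with assumed names:

```python
def scope_open_after(events, mode):
    """Replay a sequence of 'press'/'release' events on the 1024 control.

    mode 'toggle': each independent tap flips scope-open mode.
    mode 'hold':   scope-open mode lasts only while the press is held.
    Returns whether scope-open mode is active afterwards.
    """
    open_ = False
    for e in events:
        if mode == 'toggle':
            if e == 'press':
                open_ = not open_  # release is ignored in toggle style
        elif mode == 'hold':
            open_ = (e == 'press')
    return open_
```
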
The 1026 control is used to control the shooting parameters of the virtual camera that renders the game picture, thereby adjusting the view picture of the game scene displayed in the graphical user interface. In an alternative embodiment, when a touch operation is detected on the 1026 control, the shooting parameters of the virtual camera are adjusted according to it; the shooting parameters include, but are not limited to, the position, orientation, and viewing angle of the virtual camera. The touch operation acting on the 1026 control may be a click, double click, long press, slide, or other operation.
The 1028 control is a marking control; touch operations acting on it mark virtual items, virtual objects, and the like in the game scene. The 1032 control is a settings control; clicking it displays a settings menu for configuring the basic functions of the current game. The touch operation acting on the 1028 control may be a click, double click, long press, slide, or other operation.
The 1034 and 1036 controls are message controls: 1034 can be used to view system notifications, and 1036 to view messages sent by teammates or to send messages to them. 1038 is a minimap that displays the position of the virtual object controlled by the player and may also display the positions of some other players' virtual objects.
On the basis of this application scenario, the player can trigger the in-game aiming control method of this embodiment through the first touch operation.
Example 2
There is also provided, in accordance with an embodiment of the present invention, an in-game aiming control method. The steps shown in the flowcharts of the figures may be performed in a computer system, such as a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in a different order than the one shown or described herein.
Fig. 7 is a flowchart of an in-game aiming control method according to embodiment 2 of the present invention. As shown in fig. 7, the method comprises the following steps:
step S702, displaying the first view screen through the graphical user interface.
The graphical user interface is provided by the terminal device and comprises at least a portion of a game scene and at least a portion of a first virtual object located in the game scene, the first virtual object being equipped with a virtual firearm.
In step S704, in response to the first touch operation applied to the graphical user interface, the current position of the touch point of the first touch operation is acquired, and the first view is controlled to be adjusted to the second view determined according to the current position.
The current position of the touch point of the first touch operation may be a position that changes in real time.
In step S706, during the first touch operation, in response to the duration for which the preset position in the second view picture stays on a target area in the game scene reaching the first preset duration, the virtual sighting telescope configured on the virtual firearm is controlled to be turned on.
The preset position may be the position at which the sight of the virtual firearm is aimed, and the target area may be the area in which other virtual objects are located.
Optionally, the graphical user interface includes a second virtual object, the second virtual object being a virtual object in a different camp from the first virtual object, and the step of responding to the stay duration of the target area in the game scene corresponding to the preset position in the second view picture reaching the first preset duration includes: responding when the duration for which the preset position in the second view picture stays on the second virtual object in the game scene reaches the first preset duration.
The second virtual object may be an enemy of the first virtual object in the game scene.
In an optional embodiment, when, during the first touch operation, the sight of the virtual firearm stays aimed at an enemy's position in the second view picture for the first preset duration, this indicates that the user is aiming at the enemy. The virtual sighting telescope configured on the virtual firearm can then be opened automatically so the enemy can be examined closely, sparing the user a manual opening step and improving the user's game experience.
Optionally, the step of responding to the stay duration of the target area reaching the first preset duration includes: acquiring the duration for which the preset position in the second view picture stays on a first scene area in the game scene; and responding when that stay duration reaches the first preset duration.
The first scene area described above may be other areas than enemies in the game scene, for example, a building area, a grass area, a sky area, a water surface area, and the like.
In an alternative embodiment, the duration for which the position corresponding to the sight of the virtual firearm in the second view picture stays on a first scene area in the game scene can be acquired, and from it whether the user needs a clear view of that area can be judged. When the stay duration on the first scene area reaches the first preset duration, the user needs to examine the first scene area through the magnified lens, so the virtual sighting telescope configured on the virtual firearm can be opened automatically for viewing, sparing the user a manual opening step and improving the user's game experience.
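Embodiment 2's condition — the aimed position staying inside a target area for the first preset duration — can be sketched over sampled aim positions; the function name, the sampling scheme, and the reset-on-exit behavior are assumptions of this illustration:

```python
def auto_scope(aim_samples, dt, in_target, first_preset):
    """aim_samples: aimed world positions sampled every dt seconds.
    in_target: predicate saying whether a position lies in the target area
    (an enemy's location, or a first scene area such as a building or grass)."""
    stay = 0.0
    for p in aim_samples:
        if in_target(p):
            stay += dt
            if stay >= first_preset:
                return True   # stay duration reached: open the virtual scope
        else:
            stay = 0.0        # the aim left the area: restart the stay duration
    return False
```

The same routine covers both the second-virtual-object case and the first-scene-area case; only the `in_target` predicate changes.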
Example 3
According to the embodiment of the present invention, there is further provided an in-game aiming control device, which can execute the in-game aiming control method in the above embodiment 1, and the specific implementation manner and the preferred application scenario are the same as those in the above embodiment, and are not described herein.
Fig. 8 is a schematic view of an in-game aiming control device according to embodiment 3 of the present invention, as shown in fig. 8, the device including:
a first display module 82 for displaying a first field of view screen through a graphical user interface;
a first obtaining module 84, configured to obtain a first position of a touch point of a first touch operation in response to the first touch operation acting on the graphical user interface, and control to adjust the first view to a second view determined according to the first position;
the first control module 86 is configured to control to turn on the virtual sighting telescope configured on the virtual firearm in response to the stay time of the touch point at the first position meeting a first preset time.
Optionally, the apparatus further comprises: a first detection module, configured to detect, when the stay duration of the touch point at the first position does not reach the first preset duration, whether the first touch operation has left the touch point; and a stopping module, configured to stop recording the stay duration of the first touch operation at the touch point when the first touch operation is detected to have left the touch point.
Optionally, the first display module is further configured to display a timing area, where the timing area is used to display the relationship between the dwell duration and the first preset duration.
Optionally, the apparatus further comprises: a comparison module, configured to compare the dwell duration with a second preset duration; the first display module is further configured to proceed to the step of displaying the timing area when the dwell duration exceeds the second preset duration, where the second preset duration is less than the first preset duration.
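The relationship between the second preset duration (when the timing area appears) and the first preset duration (when the scope opens) can be expressed as a simple pure function. The threshold values below are assumptions for illustration only:

```python
SECOND_PRESET = 0.3  # seconds of dwell before the timing area is shown (assumed)
FIRST_PRESET = 1.0   # seconds of dwell at which the scope opens (assumed)


def timing_area_state(dwell):
    """Returns (visible, progress) for the timing area: hidden while the
    dwell is below the second preset duration, otherwise a 0..1 progress
    toward the first preset duration (SECOND_PRESET < FIRST_PRESET)."""
    if dwell < SECOND_PRESET:
        return (False, 0.0)
    progress = min(dwell / FIRST_PRESET, 1.0)
    return (True, progress)
```

Gating the timing area behind the shorter second preset duration keeps the interface clean during brief taps while still previewing the countdown once the user lingers.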
Optionally, the apparatus further comprises: an adjustment module, configured to adjust the view angle of the virtual sighting telescope according to the movement of the first touch operation when it is detected that the first touch operation moves away from the touch point; and a second detection module, configured to close the virtual sighting telescope configured on the virtual firearm when it is detected that the first touch operation is canceled.
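After the virtual sighting telescope opens, the same continuous touch operation is reused: its movement adjusts the scope's view angle, and lifting the finger (canceling the touch) closes the scope. The following sketch is illustrative only; the per-pixel sensitivity and class interface are assumptions:

```python
SENSITIVITY = 0.1  # degrees of scope rotation per pixel of touch movement (assumed)


class ScopeController:
    def __init__(self):
        self.open = False
        self.yaw = 0.0
        self.pitch = 0.0
        self.last_pos = None

    def open_scope(self, pos):
        # called when the dwell condition is satisfied; pos is the touch point
        self.open = True
        self.last_pos = pos

    def on_touch_move(self, pos):
        """While the scope is open, movement of the same touch adjusts the
        scope's view angle instead of starting a new gesture."""
        if not self.open:
            return
        self.yaw += (pos[0] - self.last_pos[0]) * SENSITIVITY
        self.pitch += (pos[1] - self.last_pos[1]) * SENSITIVITY
        self.last_pos = pos

    def on_touch_up(self):
        # lifting the finger cancels the touch operation and closes the scope
        self.open = False
        self.last_pos = None
```

Deltas are taken against the last touch position, so view-angle changes accumulate smoothly over the drag rather than jumping to absolute screen coordinates.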
Example 4
According to an embodiment of the present invention, there is further provided an in-game aiming control device, which can execute the in-game aiming control method of Embodiment 2 above. The specific implementation and preferred application scenarios are the same as those of the above embodiment and are not repeated here.
Fig. 9 is a schematic diagram of an in-game aiming control device according to Embodiment 4 of the present invention. As shown in Fig. 9, the device includes:
a second display module 92, configured to display a first field-of-view picture through the graphical user interface;
a second obtaining module 94, configured to, in response to a first touch operation acting on the graphical user interface, obtain the current position of a touch point of the first touch operation and control adjustment of the first field-of-view picture to a second field-of-view picture determined according to the current position;
a second control module 96, configured to, during the duration of the first touch operation, control turning on of a virtual sighting telescope configured on the virtual firearm in response to the dwell duration of a target area of the game scene at a preset position in the second field-of-view picture satisfying a first preset duration.
Optionally, the second control module includes: a first control unit, configured to determine that the dwell condition is satisfied when the dwell duration of a second virtual object of the game scene at the preset position in the second field-of-view picture satisfies the first preset duration, where the second virtual object is a virtual object in a camp different from that of the first virtual object.
Optionally, the second control module includes: an obtaining unit, configured to obtain the dwell duration of a first scene area of the game scene at the preset position in the second field-of-view picture; and a second control unit, configured to determine that the dwell condition is satisfied when the dwell duration of the first scene area of the game scene at the preset position in the second field-of-view picture satisfies the first preset duration.
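The first control unit's condition, a hostile second virtual object staying at the preset position for the first preset duration, can be sketched in the same dwell-timer style. The hostility test, class name, and threshold value below are assumptions for illustration, not the patented implementation:

```python
FIRST_PRESET_DURATION = 0.5  # seconds; hypothetical threshold


class CrosshairTargetWatcher:
    """Signals that the scope should open when the virtual object under a
    preset screen position (e.g. the screen centre) is hostile and stays
    there for the first preset duration."""

    def __init__(self, threshold=FIRST_PRESET_DURATION):
        self.threshold = threshold
        self.dwell = 0.0

    def update(self, target_is_hostile, dt):
        # called once per frame with whether a hostile object sits under the
        # preset position, plus the elapsed frame time
        if not target_is_hostile:
            self.dwell = 0.0  # the preset position left the target: reset
            return False
        self.dwell += dt
        return self.dwell >= self.threshold
```

Requiring a sustained dwell rather than a single hit-test filters out frames where the crosshair merely sweeps past an enemy.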
Example 5
According to an embodiment of the present invention, there is also provided a storage medium comprising a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the in-game aiming control method of the above embodiments.
Example 6
According to an embodiment of the present invention, there is also provided a processor configured to run a program, wherein the program, when run, executes the in-game aiming control method of the above embodiments.
The foregoing embodiment numbers of the present invention are merely for description and do not represent the relative merits of the embodiments.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for any part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the division of the units may be a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (12)

1. An in-game aiming control method, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprising at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm, the method comprising:
displaying a first field-of-view picture through the graphical user interface;
in response to a first touch operation acting on the graphical user interface, obtaining a first position of a touch point of the first touch operation, and controlling adjustment of the first field-of-view picture to a second field-of-view picture determined according to the first position;
in response to a dwell duration of the touch point at the first position satisfying a first preset duration, controlling turning on of a virtual sighting telescope configured on the virtual firearm;
wherein the first field-of-view picture and the second field-of-view picture are different views provided by adjusting a virtual camera in the game scene.
2. The method according to claim 1, wherein the method further comprises:
in response to the dwell duration of the touch point at the first position not satisfying the first preset duration, detecting whether the first touch operation has left the touch point;
and stopping recording the dwell duration of the first touch operation at the touch point when it is detected that the first touch operation has left the touch point.
3. The method according to claim 1, wherein after controlling adjustment of the first field-of-view picture to the second field-of-view picture determined according to the first position, the method further comprises:
displaying a timing area, wherein the timing area is used to display the relationship between the dwell duration and the first preset duration.
4. The method according to claim 3, wherein before displaying the timing area, the method further comprises:
comparing the dwell duration with a second preset duration;
and proceeding to the step of displaying the timing area when the dwell duration exceeds the second preset duration, wherein the second preset duration is less than the first preset duration.
5. The method according to claim 2, wherein after controlling turning on of the virtual sighting telescope configured on the virtual firearm in response to the dwell duration of the touch point at the first position satisfying the first preset duration, the method further comprises:
when it is detected that the first touch operation moves away from the touch point, adjusting the view angle of the virtual sighting telescope according to the movement of the first touch operation;
and closing the virtual sighting telescope configured on the virtual firearm when it is detected that the first touch operation is canceled.
6. An in-game aiming control method, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprising at least part of a game scene and at least part of virtual objects located in the game scene, a first virtual object being configured with a virtual firearm, the method comprising:
displaying a first field-of-view picture through the graphical user interface;
in response to a first touch operation acting on the graphical user interface, obtaining a current position of a touch point of the first touch operation, and controlling adjustment of the first field-of-view picture to a second field-of-view picture determined according to the current position;
during the duration of the first touch operation, in response to a dwell duration of a target area of the game scene at a preset position in the second field-of-view picture satisfying a first preset duration, controlling turning on of a virtual sighting telescope configured on the virtual firearm;
wherein the first field-of-view picture and the second field-of-view picture are different views provided by adjusting a virtual camera in the game scene.
7. The method according to claim 6, wherein the graphical user interface includes a second virtual object, the second virtual object being a virtual object in a camp different from that of the first virtual object, and the dwell duration of the target area of the game scene at the preset position in the second field-of-view picture satisfying the first preset duration comprises: the dwell duration of the second virtual object of the game scene at the preset position in the second field-of-view picture satisfying the first preset duration.
8. The method according to claim 6, wherein the dwell duration of the target area of the game scene at the preset position in the second field-of-view picture satisfying the first preset duration comprises:
obtaining a dwell duration of a first scene area of the game scene at the preset position in the second field-of-view picture;
and determining that the condition is satisfied when the dwell duration of the first scene area of the game scene at the preset position in the second field-of-view picture satisfies the first preset duration.
9. An in-game aiming control device, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprising at least part of a game scene and at least part of a first virtual object located in the game scene, the first virtual object being configured with a virtual firearm, the device comprising:
a first display module, configured to display a first field-of-view picture through the graphical user interface;
a first obtaining module, configured to, in response to a first touch operation acting on the graphical user interface, obtain a first position of a touch point of the first touch operation and control adjustment of the first field-of-view picture to a second field-of-view picture determined according to the first position;
a first control module, configured to, in response to a dwell duration of the touch point at the first position satisfying a first preset duration, control turning on of a virtual sighting telescope configured on the virtual firearm;
wherein the first field-of-view picture and the second field-of-view picture are different views provided by adjusting a virtual camera in the game scene.
10. An in-game aiming control device, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprising at least part of a game scene and at least part of virtual objects located in the game scene, a first virtual object being configured with a virtual firearm, the device comprising:
a second display module, configured to display a first field-of-view picture through the graphical user interface;
a second obtaining module, configured to, in response to a first touch operation acting on the graphical user interface, obtain the current position of a touch point of the first touch operation and control adjustment of the first field-of-view picture to a second field-of-view picture determined according to the current position;
a second control module, configured to, during the duration of the first touch operation, control turning on of a virtual sighting telescope configured on the virtual firearm in response to a dwell duration of a target area of the game scene at a preset position in the second field-of-view picture satisfying a first preset duration;
wherein the first field-of-view picture and the second field-of-view picture are different views provided by adjusting a virtual camera in the game scene.
11. A storage medium comprising a stored program, wherein the program, when run, controls a device on which the storage medium is located to perform the in-game aiming control method according to any one of claims 1 to 8.
12. A processor configured to run a program, wherein the program, when run, performs the in-game aiming control method according to any one of claims 1 to 8.
CN202110616223.0A 2021-06-02 2021-06-02 In-game aiming control method and device Active CN113318431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110616223.0A CN113318431B (en) 2021-06-02 2021-06-02 In-game aiming control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110616223.0A CN113318431B (en) 2021-06-02 2021-06-02 In-game aiming control method and device

Publications (2)

Publication Number Publication Date
CN113318431A CN113318431A (en) 2021-08-31
CN113318431B true CN113318431B (en) 2023-12-19

Family

ID=77423253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110616223.0A Active CN113318431B (en) 2021-06-02 2021-06-02 In-game aiming control method and device

Country Status (1)

Country Link
CN (1) CN113318431B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107678647A (en) * 2017-09-26 2018-02-09 网易(杭州)网络有限公司 Virtual shooting main body control method, apparatus, electronic equipment and storage medium
CN109499061A (en) * 2018-11-19 2019-03-22 网易(杭州)网络有限公司 Method of adjustment, device, mobile terminal and the storage medium of scene of game picture
CN109589601A (en) * 2018-12-10 2019-04-09 网易(杭州)网络有限公司 Virtual aim mirror control method and device, electronic equipment and storage medium
CN110597449A (en) * 2019-09-12 2019-12-20 腾讯科技(深圳)有限公司 Prop using method, device, terminal and storage medium based on virtual environment
WO2020168680A1 (en) * 2019-02-22 2020-08-27 网易(杭州)网络有限公司 Game role control method, apparatus and device and storage medium
CN112354181A (en) * 2020-11-30 2021-02-12 腾讯科技(深圳)有限公司 Open mirror picture display method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6207415B2 (en) * 2014-01-31 2017-10-04 株式会社バンダイ Information providing system and information providing program


Also Published As

Publication number Publication date
CN113318431A (en) 2021-08-31

Similar Documents

Publication Publication Date Title
US11325036B2 (en) Interface display method and apparatus, electronic device, and computer-readable storage medium
KR102619439B1 (en) Methods and related devices for controlling virtual objects
US20200376381A1 (en) Posture adjustment method and apparatus, storage medium, and electronic device
CN113398601B (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
CN111437601B (en) Game playback control method and device, electronic equipment and storage medium
JP7334347B2 (en) Virtual environment screen display method, device, equipment and program
JP7447296B2 (en) Interactive processing method, device, electronic device and computer program for virtual tools
WO2022252905A1 (en) Control method and apparatus for call object in virtual scene, device, storage medium, and program product
WO2023045375A1 (en) Method and apparatus for spectating game after character is killed, and electronic device and storage medium
WO2022237420A1 (en) Control method and apparatus for virtual object, device, storage medium, and program product
US20230364502A1 (en) Method and apparatus for controlling front sight in virtual scenario, electronic device, and storage medium
CN113559507A (en) Information processing method, information processing apparatus, storage medium, and electronic device
CN112316429A (en) Virtual object control method, device, terminal and storage medium
CN113713393A (en) Control method and device of virtual prop, storage medium and electronic equipment
CN113318431B (en) In-game aiming control method and device
CN112755524B (en) Virtual target display method and device, electronic equipment and storage medium
CN113893540B (en) Information prompting method and device, storage medium and electronic equipment
CN113384883B (en) Display control method and device in game, electronic equipment and storage medium
CN112156472B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN112121433B (en) Virtual prop processing method, device, equipment and computer readable storage medium
CN111589129B (en) Virtual object control method, device, equipment and medium
CN113996051A (en) Method, device, equipment and storage medium for canceling and releasing game skills
CN113730909B (en) Aiming position display method and device, electronic equipment and storage medium
CN112891930B (en) Information display method, device, equipment and storage medium in virtual scene
CN113750532B (en) Track display method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant