CN107913520B - Information processing method, information processing device, electronic equipment and storage medium - Google Patents


Info

Publication number: CN107913520B
Application number: CN201711337585.6A
Authority: CN (China)
Prior art keywords: touch event, touch, auxiliary control, amplitude, skill
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN107913520A
Inventors: 郑继峰, 蒋慧冲
Assignee (original and current): Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd, with priority to CN201711337585.6A

Classifications

    • A63F13/525: Changing parameters of virtual cameras
    • A63F13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device (touch screens)
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene
    • A63F13/822: Strategy games; role-playing games
    • A63F13/837: Shooting of targets
    • A63F2300/1075: Input arrangements adapted to detect the point of contact of the player, using a touch screen
    • A63F2300/807: Role playing or strategy games
    • A63F2300/8076: Shooting

Abstract

The disclosure provides an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium, relating to the technical field of human-computer interaction. The method includes the following steps: when a first touch event acting on a skill control is detected, providing an auxiliary control on the operation interface; adjusting the presentation field of view of the game scene picture by a first amplitude or a second amplitude according to the positional relationship between the current position of the touch point of a second touch event and the auxiliary control; and when the end of the second touch event is detected, determining the release position of the skill corresponding to the skill control according to the presentation field of view of the game scene picture. By adjusting the presentation field of view of the game scene picture, the method and apparatus improve the precision of skill release.

Description

Information processing method, information processing device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of human-computer interaction, and in particular, to an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of mobile communication technology, a large number of game applications are emerging on terminals. In various shooting-type game applications, it is often necessary to aim at a virtual object and perform a shooting operation thereon or release a corresponding skill.
In related-art third-person-perspective game applications, a player can freely adjust the angle and position of the virtual camera. In the typical adjustment mode, a skill or weapon button is pressed to enter an aiming state, the presentation field of view of the game picture is then adjusted according to the mapping between the vector generated by sliding the finger that pressed the button and the rotation direction of the camera lens, and the weapon or skill is released when the finger is lifted. In the shooting-type mobile game application A shown in Fig. 1, a weapon control must be clicked to enter a preparation state, an aiming control presented on the operation interface is then dragged to enter the aiming state, the aiming direction is adjusted through a sliding operation, and finally the right hand is lifted to release the weapon. In the mobile game application B shown in Fig. 2, the steps of entering the aiming state, adjusting the field of view of the game picture, and releasing the skill are similar to those in Fig. 1.
The manner of adjusting the presentation field of view of the game picture provided in related game applications has the following problems. In the mode of Fig. 1, the operation interface limits the angle of rotation toward a preset direction; for example, aiming to the right requires exiting the aiming state, adjusting the lens direction, and then re-entering the aiming state, or adjusting the direction by sliding another finger on the right half of the screen while in the aiming state. Moreover, the skill button must be clicked first and the aiming control then dragged to aim; when the skill control and the aiming control are far apart, these cumbersome operation steps may disrupt the game, resulting in a poor user experience. Although the left-hand adjustment in Fig. 2 removes the restriction that the operation interface of Fig. 1 places on adjusting the virtual camera direction, the movement of the virtual object cannot be controlled at the same time, so the skill release position may deviate.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium, which overcome one or more of the problems due to the limitations and disadvantages of the related art, at least to some extent.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, an information processing method is provided, applied to a touch terminal capable of presenting an operation interface containing a game scene picture, the method including:
providing an auxiliary control on the operation interface when a first touch event acting on a skill control is detected;
adjusting the presentation field of view of the game scene picture by a first amplitude or a second amplitude according to the positional relationship between the current position of the touch point of a second touch event and the auxiliary control; and
when the end of the second touch event is detected, determining the release position of the skill corresponding to the skill control according to the presentation field of view of the game scene picture.
In an exemplary embodiment of the present disclosure, adjusting the presentation field of view of the game scene picture by a first amplitude or a second amplitude according to the positional relationship between the current position of the touch point of the second touch event and the auxiliary control includes:
when the touch point of the second touch event is detected to be within the auxiliary control, adjusting the presentation field of view of the game scene picture by the first amplitude according to the current position of the touch point; and
when the touch point of the second touch event is detected to be outside the auxiliary control, adjusting the presentation field of view of the game scene picture by the second amplitude according to the current position of the touch point.
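As an illustration of this inside/outside decision, the following sketch assumes a circular auxiliary control; the centre coordinates, radius, and amplitude values are hypothetical tuning constants, not values taken from the disclosure:

```python
import math

# Assumed geometry of the auxiliary control, in screen coordinates.
AUX_ORIGIN = (850.0, 400.0)  # hypothetical centre (origin position) of the control
AUX_RADIUS = 120.0           # hypothetical radius in pixels

def touch_inside_aux(touch_pos):
    """True when the second-touch-event touch point lies within the control."""
    dx = touch_pos[0] - AUX_ORIGIN[0]
    dy = touch_pos[1] - AUX_ORIGIN[1]
    return math.hypot(dx, dy) <= AUX_RADIUS

def select_amplitude(touch_pos, first_amplitude=0.2, second_amplitude=1.0):
    """Pick the fine (first) or coarse (second) view-adjustment amplitude."""
    return first_amplitude if touch_inside_aux(touch_pos) else second_amplitude
```

A touch point near the control centre thus yields the small first amplitude for fine aiming, while a point outside the control yields the larger second amplitude for coarse turning.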
In an exemplary embodiment of the present disclosure, adjusting the presentation field of view of the game scene picture by the first amplitude according to the current position of the touch point of the second touch event includes:
calculating the offset direction of the current position of the touch point of the second touch event relative to the origin position of the auxiliary control; and
determining the rotation angle of the virtual camera that presents the game scene picture according to the offset direction, so as to adjust the presentation field of view of the game scene picture.
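One possible reading of this first-amplitude adjustment, sketched with an assumed fixed per-frame rotation step; the step size and the yaw/pitch mapping are illustrative assumptions rather than values from the disclosure:

```python
import math

def first_amplitude_rotation(touch_pos, aux_origin, degrees_per_frame=0.5):
    """Fine adjustment: rotate the virtual camera by a small fixed step per
    frame in the offset direction of the touch point from the control origin.
    Returns an assumed (yaw, pitch) increment in degrees."""
    dx = touch_pos[0] - aux_origin[0]
    dy = touch_pos[1] - aux_origin[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return (0.0, 0.0)  # touch exactly at the origin: no rotation
    # Unit offset direction scaled by the per-frame step.
    return (degrees_per_frame * dx / length, degrees_per_frame * dy / length)
```

Because only the direction of the offset is used, the rotation speed stays constant everywhere inside the control, which is what makes the adjustment "fine".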
In an exemplary embodiment of the present disclosure, adjusting the presentation field of view of the game scene picture by the second amplitude according to the current position of the touch point of the second touch event includes:
determining, according to the current position of the touch point of the second touch event and the origin position of the auxiliary control, the intersection point between the boundary of the auxiliary control and the line from the origin position to the current position of the touch point; and
determining the vector from the origin position of the auxiliary control to the intersection point, and adjusting the presentation field of view of the game scene picture according to the vector.
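Assuming a circular auxiliary control (one of the shapes the description mentions), the intersection point is simply the boundary point in the touch direction, so the origin-to-intersection vector has radius length; a sketch under that assumption:

```python
import math

def second_amplitude_vector(touch_pos, aux_origin, aux_radius):
    """Coarse adjustment: intersect the ray from the control origin through
    the (outside) touch point with the circular control boundary and return
    the origin-to-intersection vector used to turn the camera continuously."""
    dx = touch_pos[0] - aux_origin[0]
    dy = touch_pos[1] - aux_origin[1]
    # The touch point is outside the control, so length > aux_radius > 0.
    length = math.hypot(dx, dy)
    # For a circle, the boundary intersection is the radius-length vector
    # in the touch direction.
    return (aux_radius * dx / length, aux_radius * dy / length)
```

The returned vector saturates at the control radius, so a touch far outside the control turns the camera at a capped, constant coarse rate rather than an unbounded one.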
In an exemplary embodiment of the present disclosure, determining the release position of the skill corresponding to the skill control includes:
acquiring the intersection point of the view-center ray of the virtual camera and a virtual plane, and determining the intersection point as the release position of the skill.
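The view-centre/virtual-plane intersection is a standard ray-plane intersection; a minimal sketch (the 3-tuple vector layout and function names are assumptions, as the disclosure does not prescribe a representation):

```python
def release_position(cam_pos, cam_forward, plane_point, plane_normal):
    """Intersect the camera's view-centre ray with the virtual plane and
    return the intersection as the skill release position (or None when the
    view direction is parallel to the plane)."""
    denom = sum(f * n for f, n in zip(cam_forward, plane_normal))
    if abs(denom) < 1e-9:
        return None  # no intersection: ray parallel to the plane
    t = sum((p - c) * n for p, c, n in zip(plane_point, cam_pos, plane_normal))
    t /= denom
    return tuple(c + t * f for c, f in zip(cam_pos, cam_forward))
```

A production version would also reject intersections behind the camera (negative t) and clamp the release position to the skill's maximum range.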
In an exemplary embodiment of the present disclosure, when the touch point of the second touch event is outside the auxiliary control, the method further includes:
when a third touch event continuous with the second touch event is detected and the touch point of the third touch event is within the auxiliary control, adjusting the presentation field of view of the game scene picture by the first amplitude according to the current position of the touch point of the third touch event.
In an exemplary embodiment of the present disclosure, the second touch event is a sliding operation that is continuous with the first touch event.
In an exemplary embodiment of the present disclosure, the second amplitude is greater than the first amplitude.
In an exemplary embodiment of the disclosure, upon detecting a first touch event acting on a skill control, the method further comprises:
and triggering the display visual field of the game scene picture to be adjusted from the first visual field mode to the second visual field mode.
According to an aspect of the present disclosure, an information processing apparatus is provided, applied to a touch terminal capable of presenting an operation interface containing a game scene picture, the apparatus including:
a control providing module, configured to provide an auxiliary control on the operation interface when a first touch event acting on a skill control is detected;
a field-of-view adjusting module, configured to adjust the presentation field of view of the game scene picture by a first amplitude or a second amplitude according to the positional relationship between the current position of the touch point of a second touch event and the auxiliary control; and
a position determining module, configured to determine, when the end of the second touch event is detected, the release position of the skill corresponding to the skill control according to the presentation field of view of the game scene picture.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any one of the information processing methods described above via execution of the executable instructions.
In the information processing method, information processing apparatus, electronic device, and computer-readable storage medium provided by the exemplary embodiments of the present disclosure, when a first touch event acting on a skill control is detected, an auxiliary control is provided on the operation interface; when the touch point of a second touch event is detected to be within the auxiliary control, the presentation field of view of the game scene picture is adjusted by a first amplitude according to the current position of the touch point; and when the end of the second touch event is detected, the release position of the skill corresponding to the skill control is determined according to the presentation field of view of the game scene picture. On the one hand, providing the auxiliary control enables the user to conveniently and freely adjust the presentation field of view of the game scene picture in any direction, and determining the release position of the skill according to that field of view allows the skill to be released more accurately. On the other hand, determining the skill release position according to the presentation field of view of the game scene picture simplifies the operation steps and improves the efficiency of determining the release position, so the game flows more smoothly and the user experience is better.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a schematic diagram of an operation interface of a mobile game application A in the related art;
FIG. 2 is a diagram illustrating an operation interface of a mobile game application B in the related art;
FIG. 3 schematically illustrates an information processing method in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a schematic diagram of providing a secondary control in an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic diagram illustrating an operation interface for adjusting the presentation field of view within the auxiliary control in an exemplary embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating an operation interface for adjusting the presentation field of view outside the auxiliary control in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates the determination of a skill release position in an exemplary embodiment of the disclosure;
FIG. 8 schematically illustrates a block diagram of an information processing apparatus in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 10 schematically illustrates a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
This exemplary embodiment first discloses an information processing method, which may be applied to a touch terminal capable of presenting an operation interface containing a game scene picture of a full-perspective shooting-type mobile game application. The operation interface may be obtained by executing a software application on a processor of the touch terminal and rendering it on a display of the touch terminal. When the game application is a third-person shooting game, the operation interface may include a virtual shooting subject, a virtual joystick control, and a skill control 402, and may further include a virtual object corresponding to the user and one or more virtual shooting targets. That is, from the third-person perspective, the virtual object corresponding to the user can be presented in the operation interface as a whole, which reduces the user's operation difficulty and sense of vertigo and emphasizes the sense of action on the screen. The specific manner of displaying the virtual object may be chosen by the developer according to the content of the game, and the disclosure is not limited in this respect.
In addition, the operation interface may further include a virtual battle scene, a virtual natural environment, a map control area, an information panel control area, and blank areas outside the control areas. The virtual shooting subject may be, for example, a virtual gun controlled by the virtual object, or another virtual weapon such as a grenade. The virtual joystick control may be located anywhere on the left side of the operation interface, and the virtual object may be configured to move under the control of the virtual joystick control. The touch terminal may be any of various electronic devices with a touch screen, such as a mobile phone, a tablet computer, a notebook computer, a game machine, or a PDA. Referring to Fig. 3, the information processing method may include the following steps:
s310, when a first touch event acting on a skill control is detected, providing an auxiliary control on the operation interface;
s320, adjusting the display visual field of the game scene picture in a first amplitude or a second amplitude according to the position relation between the current position of the touch point of the second touch event and the auxiliary control;
and S330, when the second touch event is detected to be finished, determining the release position of the skill control corresponding to the skill according to the display visual field of the game scene picture.
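The three steps above can be sketched as a minimal state machine; all class, method, and event names here are assumptions for illustration, not an API from the disclosure:

```python
class SkillAimSession:
    """Tracks one aiming interaction across steps S310-S330."""

    def __init__(self):
        self.aiming = False
        self.events = []  # record of the actions the interface would take

    def on_skill_press(self):
        # S310: first touch event on the skill control shows the auxiliary control.
        self.aiming = True
        self.events.append("show_auxiliary_control")

    def on_drag(self, inside_aux):
        # S320: second touch event adjusts the view by the first or second amplitude,
        # depending on whether the touch point is inside the auxiliary control.
        if not self.aiming:
            return
        amp = "first_amplitude" if inside_aux else "second_amplitude"
        self.events.append("adjust_view:" + amp)

    def on_release(self):
        # S330: end of the second touch event releases the skill at the view centre.
        if self.aiming:
            self.events.append("release_skill_at_view_centre")
            self.aiming = False
```

A single press-drag-lift gesture thus moves through all three steps without the finger ever leaving the screen, which is the interaction simplification the method claims over the two-control schemes of Figs. 1 and 2.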
According to the information processing method in this exemplary embodiment, on the one hand, providing the auxiliary control enables the user to freely adjust the presentation field of view of the game scene picture in any direction by the first amplitude or the second amplitude, and then to determine the release position of the skill corresponding to the skill control according to that field of view, so the release of the skill can be controlled more accurately. On the other hand, determining the release position of the skill according to the presentation field of view of the game scene picture simplifies the operation steps and improves the efficiency of determining the release position, so the game flows more smoothly and the user experience is better.
Next, each step in the information processing method is further explained with reference to fig. 3 to 7.
In step S310, when a first touch event acting on a skill control is detected, an auxiliary control is provided on the operation interface.
In this exemplary embodiment, the skill control may be, for example, a shooting control, a control corresponding to another virtual weapon, or another function control for releasing a preset operation. The skill control may be arranged at any position on the operation interface; for example, it may be placed on the opposite side of the operation interface from the virtual joystick control, or on the same side. The first touch event may be one operation or a combination of several operations, such as a click, a double click, a long press, or a press, and the long-press operation is taken here as the example of the first touch event. Specifically, whether a long-press operation is received at any position on the operation interface can be detected by combining a position sensor and a timer; when a long-press operation is detected, whether it acts on the skill control can be judged by the position sensor, and when it is judged to act on the skill control, the auxiliary control can be provided on the operation interface.
In this exemplary embodiment, one auxiliary control 403 may be provided and presented on the operation interface. The shape of the auxiliary control may be circular, square, or irregular. It should be noted that the auxiliary control is arranged on the same side of the operation interface as the skill control and may be used to assist in releasing the skill or in adjusting the presentation field of view of the game scene picture. The auxiliary control may be arranged at a preset position on the operation interface, where the preset position may be in any direction and at any position within a certain range around the skill control, for example directly below the skill control or in any direction within a preset range around it.
In addition, to make the auxiliary control easier to identify, it may be displayed distinctively; for example, it may be rendered as a touch area with a visual indication. The auxiliary control may be a touch area filled with color, a touch area with a preset transparency, or any other area whose extent is visually indicated, or the boundary of the auxiliary control may be emphasized in a bold or highlighted manner.
It should be added that the auxiliary control can be presented only in a certain state while remaining clickable, and hidden in other states, which saves screen space to a certain extent, improves screen utilization, and improves operation convenience. For example, the auxiliary control may be controlled to be presented only in the aiming state, and to remain clickable while being hidden in the moving state.
In addition, when a first touch event acting on a skill control is detected, the presentation field of view of the game scene picture can be triggered to be adjusted from a first view mode to a second view mode; at this point, the game scene picture can be considered to have entered the aiming state. The first view mode may be a panoramic mode, in which the virtual battle scene can be observed in all directions and the direction of the field of view presented by the operation interface can be adjusted correspondingly according to, for example, a sliding operation applied to the operation interface. The second view mode may be set according to actual requirements and may be, for example, a local view. In the special case of observing a wide field of view, however, the second view mode may be kept the same as the first view mode, or may be set to a view mode with a larger viewing range than the first view mode; all of these fall within the scope of the present disclosure. It should be noted that switching from the first view mode to the second view mode when the first touch event is detected allows the release of the skill to be controlled more precisely. Keeping the first view mode unchanged when the first touch event is detected also falls within the protection scope of the present disclosure; switching to the second view mode is taken as the example in this description.
When a long-press operation acting on a skill control is detected, for example when the user needs to aim the virtual shooting subject at a distant virtual shooting target and shoot it, or needs to release the skill corresponding to the skill control toward such a target, the auxiliary control is presented within a preset range of the skill control on the right side of the operation interface, and the user can then adjust the presentation field of view of the game scene picture through a sliding operation. To prevent the sliding operation from being limited by the screen in the second view mode, this example also achieves wide-range free adjustment of the presentation field of view of the game scene picture by means of the presented auxiliary control. In detail:
in step S320, a display view of the game scene image is adjusted by a first amplitude or a second amplitude according to a position relationship between a current position of the touch point of the second touch event and the auxiliary control.
In this exemplary embodiment, whether the touch point of the second touch event is within the auxiliary control may be detected, and the presentation view of the game scene picture may then be adjusted in a corresponding manner, by the corresponding first amplitude or second amplitude, according to the position relationship between the position of the touch point of the second touch event and the auxiliary control. The first amplitude and the second amplitude may be applied individually or in combination, as long as the presentation view of the game scene picture can be adjusted. The second touch event in this example may be a sliding operation, and the sliding operation may be continuous with the first touch event. In detail, adjusting the presentation view of the game scene picture by a first amplitude or a second amplitude according to the position relationship between the current position of the touch point of the second touch event and the auxiliary control may include:
when a second touch event touch point is detected to be in the auxiliary control, adjusting the presenting visual field of the game scene picture by a first amplitude according to the current position of the second touch event touch point;
and when detecting that the touch point of a second touch event is outside the auxiliary control, adjusting the presenting view of the game scene picture by a second amplitude according to the current position of the touch point of the second touch event.
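The two branches above can be expressed compactly. The following Python sketch is purely illustrative and not part of the original disclosure; it assumes a circular auxiliary control described by an origin and a radius, and uses placeholder values for the first and second amplitudes:

```python
import math

def is_inside_auxiliary_control(touch, origin, radius):
    """Return True when the touch point lies within the circular auxiliary control."""
    dx, dy = touch[0] - origin[0], touch[1] - origin[1]
    return math.hypot(dx, dy) <= radius

def select_amplitude(touch, origin, radius, first_amplitude=1.0, second_amplitude=4.0):
    """Pick the fine (first) or coarse (second) adjustment amplitude
    according to whether the touch point is inside the auxiliary control."""
    if is_inside_auxiliary_control(touch, origin, radius):
        return first_amplitude
    return second_amplitude
```

In practice the selected amplitude would scale the rotation of the virtual camera computed from the touch position.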
Combining the above steps: when a long-press operation acting on a certain skill control is detected, a circular auxiliary control is immediately presented in any direction within a preset range of the skill control on the operation interface; then, as long as the touch point of the sliding operation is judged by the position sensor to be within the range of the auxiliary control, the presentation view of the game scene picture can be accurately adjusted by the first amplitude according to the current position of the sliding operation.
Specifically, in this example embodiment, adjusting the rendering field of view of the game scene screen by the first amplitude according to the current position of the second touch event touch point may include:
calculating the offset direction of the current position of the touch point of the second touch event relative to the origin position of the auxiliary control;
and determining the rotation angle of a virtual camera for presenting the game scene picture according to the offset direction so as to adjust the presentation visual field of the game scene picture according to the rotation angle of the virtual camera.
In the third-person shooting game of the present exemplary embodiment, the virtual camera may be disposed above and behind the virtual object and may capture the entire game scene. A mapping relationship may be set between the offset direction of the touch point of the second touch event relative to the origin position of the auxiliary control and the rotation angle of the virtual camera, so as to control the virtual camera to rotate. When the position of the sliding operation moves multiple times, the offset direction of the current position of the sliding-operation touch point relative to the origin position of the auxiliary control may be calculated, or the offset direction of the current position of the touch point relative to the position of the first sliding-operation touch point may be calculated.
Specifically, an offset direction of the current position of the sliding event touch point relative to the origin position of the auxiliary control may be obtained. For example, when the current position of the sliding event touch point is in the auxiliary control and the current position of the sliding event touch point is above, below, left and right of the origin position of the auxiliary control, the virtual camera may be controlled to rotate according to the mapping relationship between the offset direction of the current position of the sliding event touch point relative to the origin position of the auxiliary control and the rotation angle of the virtual camera.
The coordinates within the auxiliary control may be mapped to a pitch angle and a yaw angle of the virtual camera, and the rotation direction and rotation angle of the virtual camera may be controlled according to the pitch angle and the yaw angle. A coordinate system may be established with the origin of the auxiliary control as the center, and the mapping relationship between the sliding-event touch point and the rotation angle of the virtual camera may be: when the abscissa of the touch point is negative, the virtual camera rotates to the left; when the abscissa is positive, the virtual camera rotates to the right; when the ordinate of the touch point is negative, the virtual camera rotates downward; when the ordinate is positive, the virtual camera rotates upward. In addition, preset limiting conditions may be added to the Euler angles of the virtual camera so that the virtual camera cannot move beyond a set range. For example, when the camera view angle rotates upward or downward, the angle between the camera's line of sight and the ground does not exceed 90 degrees, that is, the pitch angle is kept within plus or minus 90 degrees, so that the virtual object can at most see the sky or its own feet but cannot rotate further, preventing the lens-reversal phenomenon. Referring to fig. 5, if the current position of the touch point F of the sliding operation is at the 8 o'clock position relative to the auxiliary control origin O, the virtual camera may be controlled to rotate correspondingly, so as to present on the game scene picture a view corresponding to the 8 o'clock position.
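The coordinate-to-angle mapping and the pitch clamp just described can be sketched as follows. This is an illustrative Python sketch; the sensitivity factor and the exact clamp threshold are assumptions, not values from the disclosure:

```python
def touch_to_camera_angles(touch, origin, sensitivity=0.5, max_pitch=89.0):
    """Map a touch point's offset from the auxiliary-control origin to yaw/pitch deltas.

    Negative x -> camera rotates left, positive x -> rotates right;
    negative y -> rotates down, positive y -> rotates up.
    Pitch is clamped below +/-90 degrees to prevent the lens-reversal phenomenon.
    """
    dx = touch[0] - origin[0]
    dy = touch[1] - origin[1]
    yaw = dx * sensitivity
    pitch = max(-max_pitch, min(max_pitch, dy * sensitivity))
    return yaw, pitch
```

The sign convention mirrors the mapping in the paragraph above; an engine-specific implementation would apply these deltas to the virtual camera's Euler angles each frame.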
In addition, if the current position of the touch point of the second touch event is determined to be outside the range of the auxiliary control by the position sensor, the rotation direction of the virtual camera can be adjusted in a large range in a second amplitude according to the current position of the second touch event, so as to adjust the display field of view of the game scene picture in a large range.
Specifically, in this example embodiment, adjusting the rendering field of view of the game scene screen by the second amplitude according to the current position of the second touch event touch point may include:
determining an intersection point of the touch point of the second touch event and the auxiliary control according to the current position of the touch point of the second touch event and the origin position of the auxiliary control;
and determining a vector between the origin position of the auxiliary control and the intersection point, and adjusting the presenting view of the game scene picture according to the vector.
In this exemplary embodiment, the intersection point of the touch point of the second touch event and the auxiliary control may be determined according to the current position of the sliding-operation touch point and the origin position of the auxiliary control. For example, referring to fig. 6, the current position of the touch point F of the sliding operation is outside the auxiliary control A, in the 2 o'clock direction of the origin O of the auxiliary control; a vector OF is formed between the touch point F and the origin O, and the intersection point of this vector with the boundary of the auxiliary control A is P, forming a vector OP. It should be noted that during the adjustment of the virtual camera's field of view, instead of rotating directly to the 2 o'clock orientation in one step, the field of view may pass through any orientation before the target direction while continuing to rotate at a constant speed toward the target direction. The virtual camera can thus rotate to realize wide-range adjustment of the presentation view at any angle, solving the problem in the related art that screen control operations cannot quickly adjust the view of the game picture over a large range.
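The construction of the boundary intersection point P and the constant-speed rotation toward the target direction can be sketched as follows (illustrative Python, not from the disclosure; it assumes a circular auxiliary control and yaw angles in degrees):

```python
import math

def boundary_intersection(origin, touch, radius):
    """Intersection P of the vector from origin O to touch point F
    with the auxiliary control's circular boundary."""
    dx, dy = touch[0] - origin[0], touch[1] - origin[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return origin
    scale = radius / dist
    return (origin[0] + dx * scale, origin[1] + dy * scale)

def rotate_toward(current_angle, target_angle, step):
    """Rotate at constant speed toward the target direction instead of
    snapping there in one step; angles in degrees, wrap-around handled."""
    diff = (target_angle - current_angle + 180.0) % 360.0 - 180.0
    if abs(diff) <= step:
        return target_angle
    return current_angle + math.copysign(step, diff)
```

Calling `rotate_toward` once per frame reproduces the described behavior of the view passing through intermediate orientations on its way to the target direction.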
It should be added that the second amplitude in this example is larger than the first amplitude; for example, the adjustment angle and adjustment distance under the second amplitude may be larger than those under the first amplitude. In addition, both the process of accurately adjusting the presentation view of the game scene picture by the first amplitude and the process of roughly adjusting it over a large range by the second amplitude can be completed with one hand. This solves the problem in the related art shown in fig. 2 that large-range adjustment of the presentation view requires both of the user's hands at once, making it impossible to control the movement of the virtual object, so the game process can be smoother.
Regarding the method of adjusting the presentation view of the game scene picture in this example: if the skill control is located at the edge of the screen, for example the right edge, and the auxiliary control A were generated directly to the right of the skill control, the auxiliary control A might fall outside the screen range, leaving the user unable to slide in the right direction. Therefore, the generated auxiliary control A may be shifted horizontally or vertically relative to the skill control so that the distance between the region A and the screen edge is larger than a certain value, which can be set according to actual needs. For example, when the auxiliary control A would otherwise be generated on the right side of the skill control, the generated auxiliary control is shifted to the left side of the skill control, preventing the screen boundary from limiting or affecting the sliding operation and the process of adjusting the presentation view of the game scene picture.
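The edge-avoidance placement described above can be sketched as follows (illustrative Python; the parameter names, fixed offset, and margin value are assumptions, since the disclosure says only that the distance threshold is set according to actual needs):

```python
def place_auxiliary_control(skill_pos, offset, radius, screen_w, screen_h, margin=20):
    """Place the circular auxiliary control near the skill control, shifting it
    horizontally/vertically so it stays fully on screen with a minimum margin."""
    x = skill_pos[0] + offset
    y = skill_pos[1]
    # Clamp so the control's full circle sits at least `margin` px inside
    # every screen edge; a control near the right edge is thus shifted left.
    x = max(radius + margin, min(screen_w - radius - margin, x))
    y = max(radius + margin, min(screen_h - radius - margin, y))
    return (x, y)
```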
In addition, when the current position of the touch point of the second touch event is outside the auxiliary control, the method may further include:
when a third touch event continuous with the second touch event is detected and a touch point of the third touch event is in the auxiliary control, adjusting the display visual field of the game scene picture according to the current position of the touch point of the third touch event.
In this exemplary embodiment, when the touch point of the second touch event is outside the auxiliary control to adjust the rendering field of the game scene picture in a large range, if the rendering field needs to be accurately adjusted, the corresponding condition needs to be satisfied. Specifically, whether a third touch event continuous with the second touch event is received or not may be detected, and when the third touch event is detected, whether a touch point of the third touch event is in the auxiliary control or not may be determined. The third touch event may be a sliding operation, and the sliding direction of the third touch event may be different from that of the second sliding operation, that is, the end point of the second sliding operation may be used as the starting point of the third sliding event. If the touch point of the third touch event is in the auxiliary control, the display view of the game scene picture can be adjusted according to the third touch event. The specific step of accurately adjusting the view field of the game scene picture according to the current position of the touch point of the third touch event is the same as the step of adjusting the view field according to the touch point of the second touch event, and is not repeated here.
Similarly, when the current position of the touch point of the second touch event is outside the auxiliary control, and the touch point of the third touch event is outside the auxiliary control, since the position of the touch point still does not satisfy the condition for accurately adjusting the game scene picture, the rendering field of view of the game scene picture can be adjusted in a second range according to the intersection point of the current position of the touch point of the third touch event and the auxiliary control. The process of adjusting the presented view is similar to the above steps, and is not described herein again.
In step S330, when it is detected that the second touch event is ended, a release position of the skill control corresponding to the skill is determined according to the presentation field of view of the game scene picture.
In this exemplary embodiment, it may be first detected whether the second touch event acting on the auxiliary control is ended. The ending of the second touch event may be understood as an operation of a finger leaving an operation interface or an action of lifting the finger, and whether the second touch event is ended may be determined by detecting a position or pressure of a touch point of the second touch event or other characteristics. For example, when the second touch event is a sliding operation, whether the sliding operation is finished may be determined by determining whether a touch point exists at a position where the sliding operation end point is located, or whether a pressure value at the position where the touch point is located is zero. When the second touch event is judged to be finished, namely the operation that the finger leaves the operation interface is detected, a skill release position can be determined according to the display visual field of the game scene, and the corresponding skill is released according to the skill release position.
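The end-of-touch detection described above can be sketched as follows (illustrative Python; the two signals checked — absence of a touch point at the slide end and a zero pressure value — follow the paragraph above, while the callback structure is an assumption):

```python
def second_touch_event_ended(has_touch_point, pressure):
    """Heuristic from the text: the event ends when no touch point remains
    at the slide end position, or the pressure there reads zero (finger lifted)."""
    return (not has_touch_point) or pressure == 0.0

def on_touch_update(has_touch_point, pressure, release_skill):
    """Invoke the skill-release callback once the second touch event ends."""
    if second_touch_event_ended(has_touch_point, pressure):
        release_skill()
        return True
    return False
```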
Specifically, in this exemplary embodiment, determining a release position of the skill control for the skill according to the presentation field of view of the game scene screen may include:
and acquiring an intersection point of the visual field center of the virtual camera and a virtual plane, and determining the intersection point as the release position of the skill.
In the exemplary embodiment, after detecting a long press operation acting on the skill control, the ray emitted from the current view center of the virtual camera may form an intersection point with the virtual plane, the intersection point may be used as a skill release position, and the skill corresponding to the skill control may be released to the position. Wherein the virtual plane may be the ground in the game scene. It should be added that, when the view field of the game scene picture is accurately adjusted or adjusted in a wide range according to the above steps, the center of the view field of the virtual camera changes due to the corresponding up-down rotation or left-right rotation of the virtual camera, so the position of the intersection between the center of the view field of the virtual camera and the virtual plane changes correspondingly, and the higher the lens is, the farther the position of the intersection between the virtual camera and the virtual plane is, and the lower the lens is, the closer the position of the intersection between the virtual camera and the virtual plane is.
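The intersection of the view-center ray with the virtual plane can be sketched as follows (illustrative Python, assuming the ground is the plane y = 0 and the camera's view direction is given as a vector; none of these names come from the disclosure):

```python
def view_center_ground_intersection(cam_pos, view_dir, ground_y=0.0):
    """Intersection X of the ray from the camera's view center with the
    ground plane y = ground_y; None if the ray never reaches the ground."""
    px, py, pz = cam_pos
    dx, dy, dz = view_dir
    if dy >= 0:          # looking level or upward: no ground hit
        return None
    t = (ground_y - py) / dy
    return (px + t * dx, ground_y, pz + t * dz)
```

Consistent with the text, for the same downward view direction a higher camera position yields a farther intersection point, and a lower one a closer intersection point.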
Referring to fig. 7, if the intersection point of the ray emitted from the current view center C of the virtual camera and the ground in the game scene is X, the position at X may be used as the skill release position, and the skill or function corresponding to the skill control is released at the object corresponding to X, for example another virtual shooting object or the virtual natural environment. In this example, by using the intersection point of the virtual camera's view center and the ground as the skill release position, the release position can be determined more accurately than in the related art, which can only determine the skill release direction or shooting direction; misoperation can be avoided to a certain extent, thereby improving the efficiency and accuracy of skill release and improving the user experience.
It should be noted that the user can control the virtual joystick control provided on the left side of the operation interface with the left hand to control the movement of the virtual object; the processes in this example of adjusting the presentation view of the game scene picture, releasing the skill, and so on do not affect the user's control of the virtual joystick control, which is more conducive to gameplay.
With the method provided in this example, the entire process of aiming, view adjustment, and skill release can be completed merely by long-pressing the skill control, sliding to any position, and lifting the finger. This simplifies the steps of performing the skill release operation while aiming, enables the user to complete skill release quickly during the game, and improves release efficiency. The whole operation can be completed with a single finger, so the other hand can control the virtual object's movement through the virtual joystick or control the virtual object to perform other operations; the progress of the game is not affected, more operation combinations become available, and game diversity is improved. In addition, using the intersection point of the view center of the virtual camera presenting the game scene picture and the virtual plane as the release position of the skill corresponding to the skill control improves the accuracy of skill release.
In an exemplary embodiment of the present disclosure, there is also provided an information processing apparatus, which may be applied to a touch terminal capable of presenting an operation interface of a game scene screen, as shown in fig. 8, the apparatus 800 may include:
the control providing module 801 may be configured to provide an auxiliary control on the operation interface when a first touch event acting on a skill control is detected;
the view adjusting module 802 may be configured to adjust a rendering view of the game scene picture by a first amplitude or a second amplitude according to a position relationship between a current position of a second touch event touch point and the auxiliary control;
the position determining module 803 may be configured to determine, when it is detected that the second touch event is ended, a release position of the skill control corresponding to the skill according to the presentation field of view of the game scene picture.
The details of each module in the information processing apparatus have been described in detail in the corresponding information processing method, and therefore are not described herein again.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system."
An electronic device 900 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 900 is embodied in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, a bus 930 connecting different system components (including the storage unit 920 and the processing unit 910), and a display unit 940.
Wherein the storage unit stores program code that is executable by the processing unit 910 to cause the processing unit 910 to perform steps according to various exemplary embodiments of the present invention described in the above section "exemplary methods" of the present specification. For example, the processing unit 910 may perform the steps as shown in fig. 3.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, and may further include a read-only memory unit (ROM) 9203.
Storage unit 920 may also include a program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 930 can be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 970 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 10, a program product 1000 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, as well as conventional procedural programming languages such as the "C" programming language or similar. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (8)

1. An information processing method is applied to a touch terminal capable of presenting an operation interface of a game scene picture, and is characterized by comprising the following steps:
when a first touch event acting on a skill control is detected, providing an auxiliary control on the operation interface, and triggering a presentation visual field of the game scene picture to be adjusted from a first visual field mode to a second visual field mode;
in the second visual field mode, according to the position relation between the current position of the touch point of the second touch event and the auxiliary control, adjusting the rotation angle of the virtual camera by a first amplitude or a second amplitude, so as to adjust the display visual field of the game scene picture according to the rotation angle of the virtual camera; wherein the virtual camera is configured to present the game scene picture, and the second amplitude is greater than the first amplitude;
when the second touch event is detected to be finished, determining the release position of the skill control corresponding to the skill according to the display visual field of the game scene picture;
wherein adjusting the rotation angle of the virtual camera by the first amplitude or the second amplitude comprises:
when a second touch event touch point is detected to be in the auxiliary control, calculating the offset direction of the current position of the second touch event touch point relative to the origin position of the auxiliary control, determining the rotation angle of the virtual camera according to the offset direction, and adjusting the rotation angle of the virtual camera by a first amplitude; or,
when a second touch event touch point is detected to be outside the auxiliary control, determining an intersection point of the current position of the second touch event touch point and the auxiliary control according to the current position of the second touch event touch point and the origin position of the auxiliary control, determining a vector between the origin position of the auxiliary control and the intersection point, determining a target direction of the virtual camera according to the vector, and adjusting the rotation angle of the virtual camera by a second amplitude.
2. The information processing method according to claim 1, wherein determining a release position of the skill control for the skill comprises:
and acquiring an intersection point of the visual field center of the virtual camera and a virtual plane, and determining the intersection point as the release position of the skill.
3. The information processing method according to claim 1, wherein when the touch point of the second touch event is outside the auxiliary control, the method further comprises:
when a third touch event continuous with the second touch event is detected, and a touch point of the third touch event is in the auxiliary control, adjusting the rotation angle of the virtual camera by a first amplitude according to the current position of the touch point of the third touch event.
4. The information processing method according to claim 1, wherein the second touch event is a sliding operation, and the second touch event is continuous with the first touch event.
5. The information processing method of claim 1, wherein upon detecting a first touch event acting on a skill control, the method further comprises:
triggering the display visual field of the game scene picture to be adjusted from the first visual field mode to the second visual field mode.
6. An information processing apparatus applied to a touch terminal capable of presenting an operation interface of a game scene picture, the apparatus comprising:
a control providing module configured to provide an auxiliary control on the operation interface and to trigger the display visual field of the game scene picture to be adjusted from a first visual field mode to a second visual field mode when a first touch event acting on a skill control is detected;
a visual field adjusting module configured to adjust, in the second visual field mode, the rotation angle of the virtual camera by a first amplitude or a second amplitude according to a positional relationship between the current position of the touch point of a second touch event and the auxiliary control, so as to adjust the display visual field of the game scene picture according to the rotation angle of the virtual camera; wherein the virtual camera is configured to present the game scene picture, and the second amplitude is greater than the first amplitude; and
a position determining module configured to determine, when detecting that the second touch event ends, a release position of the skill corresponding to the skill control according to the display visual field of the game scene picture;
wherein adjusting the rotation angle of the virtual camera by the first amplitude or the second amplitude comprises:
when the touch point of the second touch event is detected to be within the auxiliary control, calculating an offset direction of the current position of the touch point relative to the origin position of the auxiliary control, determining the rotation angle of the virtual camera according to the offset direction, and adjusting the rotation angle of the virtual camera by the first amplitude; or
when the touch point of the second touch event is detected to be outside the auxiliary control, determining an intersection point between the auxiliary control and a line defined by the current position of the touch point and the origin position of the auxiliary control, determining a vector from the origin position of the auxiliary control to the intersection point, determining a target direction of the virtual camera according to the vector, and adjusting the rotation angle of the virtual camera by the second amplitude.
7. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the information processing method of any one of claims 1 to 5.
8. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information processing method of any one of claims 1 to 5 via execution of the executable instructions.
CN201711337585.6A 2017-12-14 2017-12-14 Information processing method, information processing device, electronic equipment and storage medium Active CN107913520B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711337585.6A CN107913520B (en) 2017-12-14 2017-12-14 Information processing method, information processing device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711337585.6A CN107913520B (en) 2017-12-14 2017-12-14 Information processing method, information processing device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107913520A CN107913520A (en) 2018-04-17
CN107913520B true CN107913520B (en) 2021-04-23

Family

ID=61893333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711337585.6A Active CN107913520B (en) 2017-12-14 2017-12-14 Information processing method, information processing device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107913520B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108595010B (en) * 2018-04-27 2021-06-18 网易(杭州)网络有限公司 Interaction method and device for virtual objects in virtual reality
CN108635850B (en) * 2018-05-17 2022-04-15 网易(杭州)网络有限公司 Information processing method, device and storage medium
CN108771863B (en) * 2018-06-11 2022-04-15 网易(杭州)网络有限公司 Control method and device for shooting game
CN109032493A (en) * 2018-08-03 2018-12-18 网易(杭州)网络有限公司 Information processing method, device and electronic equipment
CN109045695B (en) * 2018-08-08 2020-09-29 腾讯科技(深圳)有限公司 Accessory selection method, device and storage medium in virtual environment
CN109550234A (en) * 2018-09-20 2019-04-02 厦门吉比特网络技术股份有限公司 A kind of character control method and device of game
CN109550241B (en) * 2018-09-20 2023-04-07 厦门吉比特网络技术股份有限公司 Single rocker control method and system
CN109550240A (en) * 2018-09-20 2019-04-02 厦门吉比特网络技术股份有限公司 A kind of the technical ability method for releasing and device of game
CN109513209B (en) * 2018-11-22 2020-04-17 网易(杭州)网络有限公司 Virtual object processing method and device, electronic device and storage medium
CN109758764B (en) * 2018-12-11 2023-02-03 网易(杭州)网络有限公司 Game skill control method and device, electronic equipment and storage medium
CN109718548B (en) * 2018-12-19 2019-11-26 网易(杭州)网络有限公司 The method and device of virtual lens control in a kind of game
CN109568957B (en) * 2019-01-10 2020-02-07 网易(杭州)网络有限公司 In-game display control method, device, storage medium, processor and terminal
CN109675308A (en) 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game
CN109800047B (en) * 2019-01-18 2022-03-18 网易(杭州)网络有限公司 Game skill switching method and device, storage medium and electronic equipment
CN111124226B (en) * 2019-12-17 2021-07-30 网易(杭州)网络有限公司 Game screen display control method and device, electronic equipment and storage medium
CN111481934B (en) 2020-04-09 2023-02-10 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and storage medium
CN111467802B (en) * 2020-04-09 2022-02-22 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying picture of virtual environment
CN111760287B (en) * 2020-06-30 2024-02-02 网易(杭州)网络有限公司 Game skill control method, game skill control device, electronic equipment and computer readable medium
CN112044064A (en) * 2020-09-02 2020-12-08 完美世界(北京)软件科技发展有限公司 Game skill display method, device, equipment and storage medium
CN112148176B (en) * 2020-09-30 2022-11-15 维沃移动通信有限公司 Pendant control method and device, electronic equipment and readable storage medium
CN112402967B (en) * 2020-12-04 2024-04-12 网易(杭州)网络有限公司 Game control method, game control device, terminal equipment and medium
CN112843713B (en) * 2020-12-31 2023-04-18 上海米哈游天命科技有限公司 Method, device, equipment and medium for determining center point of visual field
CN112791410A (en) * 2021-01-25 2021-05-14 网易(杭州)网络有限公司 Game control method and device, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105760076A (en) * 2016-02-03 2016-07-13 网易(杭州)网络有限公司 Game control method and device
CN107168611A (en) * 2017-06-16 2017-09-15 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6581341B2 (en) * 2014-10-15 2019-09-25 任天堂株式会社 Information processing apparatus, information processing program, information processing method, and information processing system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105760076A (en) * 2016-02-03 2016-07-13 网易(杭州)网络有限公司 Game control method and device
CN107168611A (en) * 2017-06-16 2017-09-15 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN107913520A (en) 2018-04-17

Similar Documents

Publication Publication Date Title
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108905212B (en) Game screen display control method and device, storage medium and electronic equipment
US10716995B2 (en) Information processing method and apparatus, storage medium, and electronic device
CN108780360B (en) Virtual reality navigation
US10191612B2 (en) Three-dimensional virtualization
JP2019058659A (en) Method for controlling virtual shooting character, apparatus, program, storage medium, and electronic device
CN108536374B (en) Virtual object direction control method and device, electronic equipment and storage medium
US20160092080A1 (en) Touch interface for precise rotation of an object
CN107329690B (en) Virtual object control method and device, storage medium and electronic equipment
CN107562201B (en) Directional interaction method and device, electronic equipment and storage medium
CN107832001B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108211350B (en) Information processing method, electronic device, and storage medium
US20140317576A1 (en) Method and system for responding to user's selection gesture of object displayed in three dimensions
CN108553894B (en) Display control method and device, electronic equipment and storage medium
CN110215685B (en) Method, device, equipment and storage medium for controlling virtual object in game
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN108355352B (en) Virtual object control method and device, electronic device and storage medium
CN113559501B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN108170295A (en) Virtual camera condition control method, device, electronic equipment and storage medium
US9665232B2 (en) Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device
CN112987924A (en) Method, apparatus, device and storage medium for device interaction
CN112965773A (en) Method, apparatus, device and storage medium for information display
US10379639B2 (en) Single-hand, full-screen interaction on a mobile device
CN107982916B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108079572B (en) Information processing method, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant