CN111921196A - Game visual angle control method and device, storage medium and electronic equipment

Game visual angle control method and device, storage medium and electronic equipment

Info

Publication number
CN111921196A
Authority
CN
China
Prior art keywords
controlling
game
touch
visual angle
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010902824.3A
Other languages
Chinese (zh)
Inventor
余村
黄志雄
凃益民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010902824.3A priority Critical patent/CN111921196A/en
Publication of CN111921196A publication Critical patent/CN111921196A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5258 Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5255 Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Abstract

The present disclosure relates to the technical field of information display, and in particular to a method and an apparatus for controlling a game view angle, a computer-readable storage medium, and an electronic device. The method includes: detecting a touch operation for adjusting the view angle and acquiring a touch trajectory of the touch operation; adjusting the game view angle according to the touch trajectory; determining touch parameters of the touch operation in response to the end of the touch operation; controlling the game view angle to continue being adjusted when the touch parameters satisfy a touch parameter threshold; and controlling the adjustment of the game view angle to stop, or controlling the game view angle to return to its initial state, when the touch parameters do not satisfy the threshold. The technical solution of the embodiments of the present disclosure allows a wide-range adjustment of the view angle to be completed continuously in a single operation while improving the user experience.

Description

Game visual angle control method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of information display technologies, and in particular, to a method and an apparatus for controlling a game view angle, a computer-readable storage medium, and an electronic device.
Background
In 3D games on touch-screen devices, a sliding operation is generally used to rotate the camera, and when the user stops sliding, the virtual camera stops rotating.
In the prior art, the physical range over which a finger can move on the screen is limited. Once the user has slid to the limit of that range, continuing to rotate the view angle in the same direction requires lifting the finger, moving it back, and sliding again; a wide-range adjustment of the view angle cannot be completed continuously in a single operation.
Therefore, it is necessary to design a new control method for the game perspective.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a method and an apparatus for controlling a game view angle, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the problem in the prior art that a wide-range adjustment of the view angle cannot be completed continuously in a single operation.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, a method for controlling a game view angle is provided, in which a terminal device displays a graphical user interface, a game screen including at least a part of a game scene is displayed on the graphical user interface, and the game screen is determined by a game view angle corresponding to a virtual camera in the game. The method includes:
detecting touch operation for adjusting a visual angle, and acquiring a touch track of the touch operation;
adjusting the game visual angle according to the touch track;
responding to the end of the touch operation, and determining touch parameters of the touch operation;
when the touch parameter meets a touch parameter threshold value, controlling to continuously adjust the game visual angle;
and when the touch parameter does not meet the touch parameter threshold, controlling to stop adjusting the game visual angle or controlling the game visual angle to return to the initial state.
In an exemplary embodiment of the present disclosure, the touch parameter of the touch operation is a touch parameter of the touch operation in a preset time period.
In an exemplary embodiment of the present disclosure, the preset duration is a preset duration counted backward from the end of the touch operation.
In an exemplary embodiment of the present disclosure, the touch operation is a sliding operation.
In an exemplary embodiment of the present disclosure, the step of controlling to continue adjusting the game perspective includes:
determining the current adjustment direction and the current adjustment speed of the game visual angle at the end of touch operation;
and continuously adjusting the game visual angle according to the current adjusting direction and the current adjusting speed.
In an exemplary embodiment of the present disclosure, continuously adjusting the game perspective according to the current adjustment direction and the current adjustment speed includes:
and controlling the virtual camera to continue rotating according to the current adjusting direction and the current adjusting speed.
In an exemplary embodiment of the present disclosure, the step of controlling to continue adjusting the game perspective includes:
and detecting speed data of preset duration before the touch operation is finished, and controlling the virtual camera to continue rotating according to the speed data after the sliding operation is finished.
In an exemplary embodiment of the present disclosure, the detecting speed data of a preset duration before the end of the sliding operation, and controlling the virtual camera to continue rotating according to the speed data after the end of the sliding operation includes:
detecting the displacement of a preset time length before the sliding operation is finished, and calculating a target direction and a target speed according to the displacement and the preset time length;
and after the sliding operation is finished, controlling the virtual camera to continuously rotate towards the target direction at a rotation speed corresponding to the target speed.
In an exemplary embodiment of the present disclosure, controlling the virtual camera to continue to rotate in the target direction at a rotation rate corresponding to the target rate includes:
and controlling the virtual camera to rotate in a decelerating way towards the target direction at the rotating speed corresponding to the target speed as the initial speed.
In an exemplary embodiment of the present disclosure, controlling the virtual camera to perform deceleration rotation in a target direction at a rotation speed corresponding to a target speed as an initial speed includes:
and controlling the virtual camera to perform uniform deceleration rotation by taking the target speed as an initial speed.
In an exemplary embodiment of the present disclosure, an absolute value of the acceleration in the first coordinate direction in the decelerating motion is larger than an absolute value of the acceleration in the second coordinate direction.
In an exemplary embodiment of the present disclosure, controlling the virtual camera to perform deceleration rotation in a target direction at a rotation speed corresponding to a target speed as an initial speed includes:
and controlling the virtual camera to perform deceleration rotation in a mode of multiplying the current speed by a coefficient smaller than 1 every other preset time period by taking the target speed as an initial speed.
In an exemplary embodiment of the present disclosure, detecting speed data of a preset duration before a sliding operation is ended, and controlling the virtual camera to continue rotating according to the speed data after the sliding operation is ended includes:
detecting the speeds of a plurality of sub-periods in a preset time length before the sliding operation is finished, and carrying out weighted average on the speeds to obtain a target speed;
and when the target speed is greater than a touch parameter threshold, controlling the virtual camera to continue rotating according to the speed data.
In an exemplary embodiment of the present disclosure, the method further comprises:
and when the target speed is less than or equal to the touch parameter threshold, controlling the virtual camera to stop rotating.
In an exemplary embodiment of the present disclosure, the speed data includes a direction of a last sub-period and a direction of a second last sub-period of the sliding operation, the speed data of a preset duration before the sliding operation is ended is detected, and after the sliding operation is ended, the controlling the virtual camera to continue to rotate according to the speed data includes:
and if the direction of the last sub-period is the same as the direction of the penultimate sub-period, controlling the virtual camera to continue rotating according to the speed data.
In an exemplary embodiment of the present disclosure, after controlling to continue adjusting the game perspective, the method further comprises:
and responding to the touch operation of the user on the graphical user interface, and controlling to stop adjusting the game visual angle or controlling the game visual angle to return to the initial state.
According to an aspect of the present disclosure, there is provided a virtual camera rotating apparatus including:
the detection module is used for detecting a touch operation for adjusting the visual angle and acquiring a touch track of the touch operation;
the adjusting module is used for adjusting the game visual angle according to the touch track;
a response module for responding to the end of the touch operation and determining the touch parameters of the touch operation;
a first control module for controlling to continue adjusting the game visual angle when the touch parameters satisfy a touch parameter threshold;
And the second control module is used for controlling to stop adjusting the game visual angle or controlling the game visual angle to recover to an initial state when the touch parameter does not meet the touch parameter threshold value.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of controlling a perspective of a game as described in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of controlling a game view angle as described in any one of the above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
In the method for controlling a game view angle provided by an embodiment of the present disclosure, a touch trajectory of a touch operation is acquired by detecting the touch operation for adjusting the view angle; the game view angle is adjusted according to the touch trajectory; touch parameters of the touch operation are determined in response to the end of the touch operation; when the touch parameters satisfy a touch parameter threshold, the game view angle continues to be adjusted; and when the touch parameters do not satisfy the threshold, the adjustment of the game view angle is stopped or the game view angle is restored to its initial state. Compared with the prior art, the game view angle can be controlled to continue rotating after the touch operation ends, so that, without increasing the sensitivity of the view-angle control, the user can complete a wide-range adjustment of the view angle continuously in a single operation by sliding only a small distance on the graphical user interface, which also improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 schematically illustrates a flow chart of a control method of a game perspective in an exemplary embodiment of the present disclosure;
fig. 2 schematically illustrates a flowchart of controlling the virtual camera to rotate after the sliding operation is ended in an exemplary embodiment of the present disclosure;
fig. 3 schematically illustrates a composition diagram of a virtual camera rotating apparatus in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a structural diagram of a computer system suitable for use with an electronic device that implements an exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates a schematic diagram of a computer-readable storage medium, according to some embodiments of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the exemplary embodiment, first, a method for controlling a game view is provided, in which a terminal device displays a graphical user interface, a game screen including at least a part of a game scene is displayed on the graphical user interface, the game screen is determined by a game view corresponding to a virtual camera in a game, and the method can be applied to view control in a three-dimensional scene, for example, in a 3D game. Referring to fig. 1, the method for controlling a game perspective may include the steps of:
s110, detecting touch operation for adjusting a visual angle, and acquiring a touch track of the touch operation;
s120, adjusting the game visual angle according to the touch track;
s130, responding to the end of the touch operation, and determining touch parameters of the touch operation;
s140, when the touch parameter meets a touch parameter threshold value, controlling to continuously adjust the game visual angle;
s150, when the touch parameter does not meet the touch parameter threshold, controlling to stop adjusting the game visual angle or controlling the game visual angle to return to an initial state.
Compared with the prior art, the method for controlling a game view angle provided in this exemplary embodiment can control the game view angle to continue rotating after the touch operation ends, so that, without increasing the sensitivity of the view-angle control, the user can complete a wide-range adjustment of the view angle continuously in a single operation by sliding only a small distance on the graphical user interface, which also improves the user experience.
Hereinafter, the steps of the control method of the game perspective in the present exemplary embodiment will be described in more detail with reference to the drawings and the embodiments.
In step S110, a touch operation for adjusting a viewing angle is detected, and a touch trajectory of the touch operation is acquired;
in step S120, the game viewing angle is adjusted according to the touch trajectory.
In an example embodiment of the present disclosure, the touch operation may be a sliding operation, and the sliding track of the sliding operation may be detected. The server may detect the user's sliding operation on the graphical user interface; the sliding operation may be performed by the user's finger on the graphical user interface, or by the user controlling a cursor on the graphical user interface with a mouse, which is not specifically limited in this example embodiment.
In the present exemplary embodiment, the server controls the virtual camera to rotate according to the direction and speed of the sliding operation and the sensitivity of the virtual camera.
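As a rough illustration of this mapping (not the patent's actual implementation), the per-frame slide delta could drive the camera rotation roughly as in the following Python sketch; the Camera class, the sensitivity parameter, and the yaw/pitch decomposition are assumptions introduced for illustration.

```python
# Minimal sketch: map a touch-slide delta to a virtual-camera rotation.
# Camera, sensitivity and the yaw/pitch split are illustrative assumptions,
# not the patent's actual API.

class Camera:
    def __init__(self):
        self.yaw = 0.0    # rotation about the vertical axis, in degrees (X direction of the slide)
        self.pitch = 0.0  # rotation about the horizontal axis, in degrees (Y direction of the slide)

    def rotate(self, d_yaw, d_pitch):
        self.yaw = (self.yaw + d_yaw) % 360.0
        # Clamp pitch so the view cannot flip past straight up or straight down.
        self.pitch = max(-90.0, min(90.0, self.pitch + d_pitch))


def apply_slide(camera, dx_pixels, dy_pixels, sensitivity=0.1):
    """Rotate the camera by an amount proportional to the slide delta and sensitivity."""
    camera.rotate(dx_pixels * sensitivity, dy_pixels * sensitivity)
```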
In step S130, determining a touch parameter of the touch operation in response to the end of the touch operation;
in step S140, when the touch parameter satisfies the touch parameter threshold, controlling to continuously adjust the game viewing angle.
In an example embodiment of the present disclosure, the direction of the last sub-period and the direction of the penultimate sub-period of the touch operation may be detected and compared. If the two directions are the same, it is determined that the touch parameter satisfies the touch parameter threshold; if they are different, it is determined that the touch parameter does not satisfy the touch parameter threshold.
In the present exemplary embodiment, each sub-period may be 0.05 second, 0.1 second, or the like, and the sum of the durations of the last sub-period and the penultimate sub-period is less than or equal to the preset duration.
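A possible way to implement this direction comparison, assuming each sub-period is summarized by its 2D displacement vector and that a positive dot product counts as "the same direction" (both the representation and the test are assumptions), is sketched below.

```python
# Sketch of the direction test on the last two sub-periods of the slide.
# Each sub-period is assumed to be summarized by its displacement vector (dx, dy).

def directions_agree(last, penultimate):
    """Return True when the last two sub-period displacements point the same way
    (positive dot product), which stands in here for 'same direction'."""
    dot = last[0] * penultimate[0] + last[1] * penultimate[1]
    return dot > 0.0

# Example: a rightward flick in both sub-periods satisfies the check,
# while a reversal at the end does not.
print(directions_agree((12.0, 1.0), (9.0, 0.5)))   # True  -> keep adjusting the view
print(directions_agree((12.0, 1.0), (-8.0, 0.0)))  # False -> stop or restore
```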
In an example embodiment of the present disclosure, referring to fig. 2, detecting speed data of a preset time period before the end of the sliding operation, and controlling the virtual camera to continue rotating according to the speed data after the end of the sliding operation may include steps S210 to S220, where the steps S210 to S220 are described in detail below.
In step S210, a displacement of a preset duration before the sliding operation is finished is detected, and a target direction and a target speed are calculated according to the displacement and the preset duration.
In an example embodiment of the present disclosure, the server may detect the speeds of the user for a plurality of sub-periods within a preset time period before the sliding operation of the graphical user interface is finished, where the preset time period may be 0.05 second, 0.1 second, or the like, or may be a percentage of a total time period of the sliding operation, for example, 10% or 11% of the total time period, and may also be customized according to a requirement, which is not specifically limited in this example embodiment.
In the present exemplary embodiment, the preset time period may be divided into a plurality of sub-periods, the number of the sub-periods may be 5, 6, 10, and the like, and the specific number of the sub-periods is not specifically limited in the present exemplary embodiment.
The speed of each sub-period may be obtained by calculating the speed data corresponding to that sub-period, and the speed here includes both a magnitude and a direction. In the present exemplary embodiment, weights may first be assigned to the speeds of the sub-periods, with a higher weight given to the sub-period nearest the end of the sliding operation. For example, the preset duration may be divided into three sub-periods with weights of 0.6, 0.3, and 0.1, respectively, where the sub-period with the weight of 0.6 is the one nearest the end of the sliding operation. The way the weights are assigned may also be customized as required and is not specifically limited in this exemplary embodiment. A weighted average of the speeds of the sub-periods can then be taken to obtain the target direction and the target speed.
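For instance, the weighted average over the three sub-periods with weights 0.6, 0.3, and 0.1 mentioned above could be computed as follows; the per-sub-period velocity vectors and their ordering are assumptions made for this sketch.

```python
import math

# Sketch: weighted average of per-sub-period velocity vectors (vx, vy).
# The list is assumed to be ordered from the sub-period closest to the end
# of the slide (highest weight) to the earliest one.
def weighted_target_velocity(velocities, weights=(0.6, 0.3, 0.1)):
    vx = sum(w * v[0] for w, v in zip(weights, velocities))
    vy = sum(w * v[1] for w, v in zip(weights, velocities))
    speed = math.hypot(vx, vy)  # target speed (magnitude)
    direction = (vx / speed, vy / speed) if speed > 0 else (0.0, 0.0)  # target direction (unit vector)
    return direction, speed
```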
In another example embodiment of the present disclosure, the target direction and the target speed may be obtained by the server detecting a speed and a direction of a preset size before the sliding operation of the user on the graphical user interface is finished, where the preset size may be 1 mm, 2 mm, or the like, or 50 pixels, 100 pixels, or the like, or may be a percentage of a total size of the sliding operation, for example, 10% or 11% of the total size, or the like, and may be customized according to a requirement, which is not specifically limited in this example embodiment.
In still another example embodiment of the present disclosure, the server may directly obtain the displacement of the sliding operation within the preset duration and calculate the target speed and target direction from the displacement and the preset duration; specifically, the direction of the displacement may be taken as the target direction, and the magnitude of the displacement divided by the preset duration may be taken as the target speed.
In the present exemplary embodiment, if the target rate is less than or equal to the preset threshold, the virtual camera is controlled to stop rotating; if the target rate is greater than the preset threshold, the virtual camera is controlled to continue rotating, that is, step S220 is executed. The preset threshold may be 1 cm/s, 0.2 cm/s, 0.5 cm/s, or the like, or may be set as required by the user, which is not specifically limited in this exemplary embodiment.
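A minimal sketch of this displacement-based variant, including the threshold check that decides whether the camera keeps rotating, is given below; the vector form of the displacement, the units, and the threshold value are assumptions.

```python
import math

def target_from_displacement(displacement, preset_duration, threshold=0.5):
    """Derive the target direction and speed from the displacement accumulated over
    the preset duration, then decide whether to keep rotating.
    Returns (direction, speed) to continue, or None to stop. Units are illustrative."""
    dx, dy = displacement
    distance = math.hypot(dx, dy)
    speed = distance / preset_duration          # magnitude of displacement over duration
    if speed <= threshold:
        return None                             # below threshold: stop rotating
    direction = (dx / distance, dy / distance)  # direction of the displacement
    return direction, speed
```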
In step S220, after the sliding operation is finished, the virtual camera is controlled to continue to rotate in the target direction at a rotation rate corresponding to the target rate.
After the sliding operation of the user is finished, the server can control the virtual camera to perform deceleration rotation in the obtained target direction at the rotation speed corresponding to the target speed as the initial speed.
In this example embodiment, controlling the virtual camera to perform deceleration rotation in the obtained target direction, with the rotation rate corresponding to the target rate as the initial rate, may include controlling the virtual camera to perform uniform deceleration rotation in the target direction, that is, decelerating from the initial rate with a fixed negative acceleration. The value of the fixed acceleration can be chosen as required and is not limited in this exemplary embodiment.
In other words, the virtual camera may continue to rotate after the sliding operation ends, rotating as if under inertia, and stop after rotating for a certain time.
In another example embodiment of the present disclosure, controlling the virtual camera to perform deceleration rotation in the obtained target direction, with the rotation rate corresponding to the target rate as the initial rate, may include multiplying the current speed by a coefficient smaller than 1 every preset time period. The coefficient may be 0.8, 0.9, or the like, and may be customized as required, which is not specifically limited in this example embodiment.
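The two deceleration schemes described above could be applied once per update tick roughly as follows; the tick length dt, the fixed deceleration value, and the stopping cutoff are assumptions of this sketch.

```python
def decelerate_uniform(speed, deceleration, dt):
    """Uniform deceleration: subtract a fixed amount each tick, never going below zero."""
    return max(0.0, speed - deceleration * dt)

def decelerate_multiplicative(speed, coefficient=0.9, min_speed=0.01):
    """Multiplicative decay: scale the speed by a coefficient smaller than 1 every
    preset period; speeds below a small cutoff are treated as stopped."""
    speed *= coefficient
    return speed if speed > min_speed else 0.0
```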
The rotation rate corresponding to the target rate may be determined by the target rate and the sensitivity of the virtual camera, and specifically may be obtained by multiplying the target rate and the sensitivity of the virtual camera, where the sensitivity may be customized according to a requirement, and is not specifically limited in this exemplary embodiment.
In an exemplary embodiment of the disclosure, during the deceleration rotation of the virtual camera after the sliding operation ends, the negative acceleration may differ per direction, that is, the deceleration rotation stops at different times in different directions. For example, the rotation may be decomposed into an X direction and a Y direction with different negative accelerations, where the second coordinate direction is the X direction, the first coordinate direction is the Y direction, and the absolute value of the acceleration in the first coordinate direction is greater than that in the second coordinate direction. In a 3D game the range of movement needed in the X direction is generally large, while the rotation in the Y direction does not exceed 180 degrees; the negative acceleration in the Y direction may therefore be greater than that in the X direction, so that, for the same target speed, the camera rotates farther, i.e., through a greater angle, in the X direction.
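Per-axis deceleration, with a larger negative acceleration on the pitch (Y) axis than on the yaw (X) axis so that horizontal inertia lasts longer, might be sketched as below; the specific deceleration values are assumptions.

```python
def decelerate_per_axis(vx, vy, dt, decel_x=30.0, decel_y=90.0):
    """Decelerate each rotation axis separately toward zero.
    The smaller deceleration on X lets the yaw component coast farther,
    while the pitch (Y) component stops sooner."""
    def step(v, decel):
        if v > 0.0:
            return max(0.0, v - decel * dt)
        return min(0.0, v + decel * dt)
    return step(vx, decel_x), step(vy, decel_y)
```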
In step S150, when the touch parameter does not satisfy the touch parameter threshold, controlling to stop adjusting the game viewing angle or controlling the game viewing angle to return to an initial state.
In this example embodiment, if the direction of the last sub-period is the same as the direction of the second last sub-period, it is determined that the touch parameter satisfies the touch parameter threshold. And if the direction of the last sub-period is different from the direction of the penultimate sub-period, judging that the touch parameter does not meet the touch parameter threshold.
In an example embodiment of the present disclosure, if the direction of the last sub-period is different from the direction of the penultimate sub-period, the virtual camera is controlled to stop rotating, i.e., the adjustment of the game view angle is stopped.
In another exemplary embodiment of the present disclosure, if the direction of the last sub-period is different from the direction of the penultimate sub-period, the virtual camera may be controlled to return to its position before the touch operation, i.e., before the sliding operation, that is, the game view angle is restored to its initial state.
In an example embodiment of the present disclosure, the method for controlling a game viewing angle of the present disclosure further includes controlling the virtual camera to stop rotating in response to a touch operation of a user on the graphical user interface, that is, after the sliding operation is finished, the processor controls the virtual camera to continue rotating, and after the touch operation of the user on the graphical user interface is received, the virtual camera is controlled to stop rotating.
The following describes embodiments of the apparatus of the present disclosure, which can be used to implement the above-mentioned control method of the game perspective of the present disclosure. Further, in an exemplary embodiment of the present disclosure, a virtual camera rotating apparatus is also provided. Referring to fig. 3, the virtual camera rotating apparatus 300 includes a detecting module 310, an adjusting module 320, a responding module 330, a first control module 340, and a second control module 350.
The detection module 310 may be configured to detect a touch operation for adjusting the view angle and acquire a touch trajectory of the touch operation; the adjusting module 320 may be configured to adjust the game view angle according to the touch trajectory; the response module 330 may be configured to determine the touch parameters of the touch operation in response to the end of the touch operation; the first control module 340 may be configured to control to continue adjusting the game view angle when the touch parameters satisfy a touch parameter threshold; and the second control module 350 may be configured to control to stop adjusting the game view angle, or to control the game view angle to return to an initial state, when the touch parameters do not satisfy the touch parameter threshold.
As each functional module of the virtual camera rotating device in the exemplary embodiment of the present disclosure corresponds to the step of the exemplary embodiment of the control method of the game perspective, please refer to the embodiment of the control method of the game perspective of the present disclosure for details that are not disclosed in the embodiment of the device of the present disclosure.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above-described virtual camera rotation is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 400 according to such an embodiment of the disclosure is described below with reference to fig. 4. The electronic device 400 shown in fig. 4 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 4, electronic device 400 is embodied in the form of a general purpose computing device. The components of electronic device 400 may include, but are not limited to: the at least one processing unit 410, the at least one memory unit 420, a bus 430 connecting different system components (including the memory unit 420 and the processing unit 410), and a display unit 440.
Wherein the storage unit stores program code that is executable by the processing unit 410 to cause the processing unit 410 to perform steps according to various exemplary embodiments of the present disclosure as described in the above section "exemplary methods" of this specification. For example, the processing unit 410 may execute step S110 shown in fig. 1, detect a touch operation for adjusting a viewing angle, and acquire a touch trajectory of the touch operation; s120, adjusting the game visual angle according to the touch track; s130, responding to the end of the touch operation, and determining touch parameters of the touch operation; s140, when the touch parameter meets a touch parameter threshold value, controlling to continuously adjust the game visual angle; s150, when the touch parameter does not meet the touch parameter threshold, controlling to stop adjusting the game visual angle or controlling the game visual angle to return to an initial state.
As another example, the electronic device may implement the steps shown in fig. 1-2.
The storage unit 420 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 421 and/or a cache memory unit 422, and may further include a read only memory unit (ROM) 423.
The storage unit 420 may also include a program/utility 424 having a set (at least one) of program modules 425, such program modules 425 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 430 may be any bus representing one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 400 may also communicate with one or more external devices 470 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 400, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 400 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 450. Also, the electronic device 400 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 460. As shown, the network adapter 460 communicates with the other modules of the electronic device 400 over the bus 430. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 400, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the present disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to fig. 5, a program product 500 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (19)

1. A control method of game visual angles is characterized in that a terminal device displays a graphical user interface, a game picture comprising at least part of game scenes is displayed on the graphical user interface, and the game picture is determined by the game visual angles corresponding to virtual cameras in a game, and the control method comprises the following steps:
detecting touch operation for adjusting a visual angle, and acquiring a touch track of the touch operation;
adjusting the game visual angle according to the touch track;
responding to the end of the touch operation, and determining touch parameters of the touch operation;
when the touch parameter meets a touch parameter threshold value, controlling to continuously adjust the game visual angle;
and when the touch parameter does not meet the touch parameter threshold, controlling to stop adjusting the game visual angle or controlling the game visual angle to return to the initial state.
2. The method according to claim 1, wherein the touch parameter of the touch operation is a touch parameter of the touch operation for a preset duration.
3. The method according to claim 2, wherein the preset duration is a preset duration counted backward from the end of the touch operation.
4. The method of claim 1, wherein the touch operation is a sliding operation.
5. The method of claim 1, wherein the step of controlling the continued adjustment of the game perspective comprises:
determining the current adjustment direction and the current adjustment speed of the game visual angle at the end of touch operation;
and continuously adjusting the game visual angle according to the current adjusting direction and the current adjusting speed.
6. The method of claim 5, wherein continuing to adjust the game perspective based on the current adjustment direction and the current adjustment speed comprises:
and controlling the virtual camera to continue rotating according to the current adjusting direction and the current adjusting speed.
7. The method of claim 1, wherein the step of controlling the continued adjustment of the game perspective comprises:
and detecting speed data of preset duration before the touch operation is finished, and controlling the virtual camera to continue rotating according to the speed data after the sliding operation is finished.
8. The method according to claim 7, wherein the detecting speed data of a preset time length before the sliding operation is finished, and controlling the virtual camera to continue rotating according to the speed data after the sliding operation is finished comprises:
detecting the displacement of a preset time length before the sliding operation is finished, and calculating a target direction and a target speed according to the displacement and the preset time length;
and after the sliding operation is finished, controlling the virtual camera to continuously rotate towards the target direction at a rotation speed corresponding to the target speed.
9. The method of claim 8, wherein controlling the virtual camera to continue to rotate in the target direction at a rotation rate corresponding to the target rate comprises:
and controlling the virtual camera to rotate in a decelerating way towards the target direction at the rotating speed corresponding to the target speed as the initial speed.
10. The method of claim 9, wherein controlling the virtual camera to perform a deceleration rotation in the target direction at an initial speed corresponding to the target speed comprises:
and controlling the virtual camera to perform uniform deceleration rotation by taking the target speed as an initial speed.
11. The method of claim 10, wherein the absolute value of the acceleration in the first coordinate direction is greater than the absolute value of the acceleration in the second coordinate direction in the decelerating motion.
12. The method of claim 9, wherein controlling the virtual camera to perform a deceleration rotation in the target direction at an initial speed corresponding to the target speed comprises:
and controlling the virtual camera to perform deceleration rotation in a mode of multiplying the current speed by a coefficient smaller than 1 every other preset time period by taking the target speed as an initial speed.
13. The method according to claim 7, wherein detecting speed data of a preset time length before the end of the sliding operation and controlling the virtual camera to continue rotating according to the speed data after the end of the sliding operation comprises:
detecting the speeds of a plurality of sub-periods in a preset time length before the sliding operation is finished, and carrying out weighted average on the speeds to obtain a target speed;
and when the target speed is greater than a preset threshold value, controlling the virtual camera to continue rotating according to the speed data.
14. The method of claim 13, further comprising:
and when the target speed is less than or equal to the preset threshold value, controlling the virtual camera to stop rotating.
15. The method of claim 1, further comprising:
detecting the direction of the last sub-period and the direction of the penultimate sub-period of the touch operation,
and if the direction of the last sub-period is the same as the direction of the penultimate sub-period, judging that the touch parameter meets the touch parameter threshold.
16. The method of claim 1, wherein, after controlling to continue adjusting the game perspective, the method further comprises:
and responding to the touch operation of the user on the graphical user interface, and controlling to stop adjusting the game visual angle or controlling the game visual angle to return to the initial state.
17. A control device of game visual angle displays a graphical user interface through a terminal device, a game picture including at least part of game scenes is displayed on the graphical user interface, and the game picture is determined by the game visual angle corresponding to a virtual camera in a game, and the control device is characterized by comprising:
the detection module is used for detecting touch operation for adjusting a visual angle and acquiring a touch track of the touch operation;
the adjusting module is used for adjusting the game visual angle according to the touch track;
a response module for responding to the end of the touch operation and determining the touch parameters of the touch operation;
a first control module for controlling to continue adjusting the game visual angle when the touch parameters satisfy a touch parameter threshold;
And the second control module is used for controlling to stop adjusting the game visual angle or controlling the game visual angle to recover to an initial state when the touch parameter does not meet the touch parameter threshold value.
18. A computer-readable storage medium on which a computer program is stored, the program realizing a control method of a game perspective according to any one of claims 1 to 16 when executed by a processor.
19. An electronic device, comprising:
a processor; and
a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method of controlling a game perspective as claimed in any one of claims 1 to 16.
CN202010902824.3A 2020-09-01 2020-09-01 Game visual angle control method and device, storage medium and electronic equipment Pending CN111921196A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010902824.3A CN111921196A (en) 2020-09-01 2020-09-01 Game visual angle control method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010902824.3A CN111921196A (en) 2020-09-01 2020-09-01 Game visual angle control method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN111921196A true CN111921196A (en) 2020-11-13

Family

ID=73309476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010902824.3A Pending CN111921196A (en) 2020-09-01 2020-09-01 Game visual angle control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111921196A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016116830A (en) * 2014-12-18 2016-06-30 株式会社Cygames Program, method and electronic device for character operation in game
CN107754308A (en) * 2017-09-28 2018-03-06 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107930105A (en) * 2017-10-23 2018-04-20 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN109806589A (en) * 2019-02-19 2019-05-28 Oppo广东移动通信有限公司 Virtual object control method and device, electronic equipment and storage medium
CN110851056A (en) * 2019-11-14 2020-02-28 珠海金山网络游戏科技有限公司 Cursor control method and device, computing equipment and storage medium


Similar Documents

Publication Publication Date Title
CN111352565B (en) Apparatus and method for moving a current focus using a touch-sensitive remote control
CN113853570B (en) System and method for generating dynamic obstacle collision warning for head mounted display
US11809617B2 (en) Systems and methods for generating dynamic obstacle collision warnings based on detecting poses of users
CN110215685B (en) Method, device, equipment and storage medium for controlling virtual object in game
US10488918B2 (en) Analysis of user interface interactions within a virtual reality environment
CN107204044B (en) Picture display method based on virtual reality and related equipment
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
US20230229283A1 (en) Dynamic display method and apparatus based on operating body, storage medium and electronic device
CN108549487A (en) Virtual reality exchange method and device
CN112156467A (en) Control method and system of virtual camera, storage medium and terminal equipment
CN110170167B (en) Picture display method, device, equipment and medium
CN111921196A (en) Game visual angle control method and device, storage medium and electronic equipment
CN111068309B (en) Display control method, device, equipment, system and medium for virtual reality game
CN111833391A (en) Method and device for estimating image depth information
CN111346373A (en) Method and device for controlling display of virtual joystick in game and electronic equipment
US11635808B2 (en) Rendering information in a gaze tracking device on controllable devices in a field of view to remotely control
CN114849234A (en) Virtual lens control method and device, storage medium and electronic equipment
CN112473138B (en) Game display control method and device, readable storage medium and electronic equipment
CN111784809B (en) Virtual character skeleton animation control method and device, storage medium and electronic equipment
CN111429519B (en) Three-dimensional scene display method and device, readable storage medium and electronic equipment
CN113457144A (en) Method and device for selecting virtual units in game, storage medium and electronic equipment
CN113780045A (en) Method and apparatus for training distance prediction model
CN109753143B (en) method and device for optimizing cursor position
CN113457117A (en) Method and device for selecting virtual units in game, storage medium and electronic equipment
CN107977071B (en) Operation method and device suitable for space system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination