CN115970282A - Virtual lens control method and device, storage medium and computer equipment

Info

Publication number: CN115970282A
Application number: CN202211738216.9A
Authority: CN (China)
Prior art keywords: virtual, display frame, virtual lens, rotating, current
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 吴中奇
Current Assignee: Netease Hangzhou Network Co Ltd
Original Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd; priority to CN202211738216.9A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the present application discloses a virtual lens control method and apparatus, a storage medium, and a computer device. The method includes: displaying a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character; obtaining the current capture direction of the virtual lens each time the virtual character fires; assigning to the virtual lens a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction; and controlling the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, controlling the virtual lens to rotate toward the current capture direction until the capture direction of the virtual lens is the current capture direction.

Description

Virtual lens control method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of computers, and in particular, to a method and an apparatus for controlling a virtual lens, a computer-readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, an increasing number of first-person shooter (FPS) games have emerged.
In the prior art, shooting gameplay, as the core of the combat module, is key to attracting players. In addition to expressing a shot with animations of the virtual firearm and the virtual character, games often feed the recoil of the virtual firearm back into the virtual lens to enhance the experience.
In the research and practice of the prior art, the inventor of the present application found that the orientation of the virtual lens before the virtual firearm fires is inconsistent with its orientation after the recoil feedback has been expressed, so the player must manually re-adjust the attack orientation after every shot, which makes operation cumbersome.
Disclosure of Invention
The embodiment of the present application provides a virtual lens control method and apparatus, which can solve the problems that the attack orientation must be manually adjusted after each shot and that the operation is cumbersome.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
a control method of a virtual lens includes:
displaying a graphical user interface, wherein the graphical user interface comprises at least a part of virtual scenes and virtual roles positioned in the virtual scenes, and the graphical user interface is an interface displayed by the virtual scenes and the virtual roles acquired through virtual lenses associated with the virtual roles;
acquiring the current acquisition direction of the virtual lens when the virtual character shoots each time;
distributing a rotating direction, an initial rotating speed rotating to the rotating direction and a rotating acceleration rotating to the current collecting direction for the virtual lens;
and controlling the virtual lens to rotate towards the rotating direction based on the rotating direction, the initial rotating speed and the rotating acceleration, and controlling the virtual lens to rotate towards the current collecting direction when the virtual lens does not rotate towards the rotating direction any more until the collecting direction of the virtual lens is the current collecting direction.
A virtual lens control apparatus includes:
a display module, configured to display a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and the graphical user interface is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character;
an obtaining module, configured to obtain the current capture direction of the virtual lens each time the virtual character fires;
an allocating module, configured to assign to the virtual lens a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction;
and a control module, configured to control the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, to control the virtual lens to rotate toward the current capture direction until the capture direction of the virtual lens is the current capture direction.
A computer-readable storage medium stores a plurality of instructions suitable for being loaded by a processor to execute the steps of the above virtual lens control method.
A computer device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the above virtual lens control method when executing the program.
In the embodiment of the present application, a graphical user interface is displayed, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character; the current capture direction of the virtual lens is obtained each time the virtual character fires; a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction are assigned to the virtual lens; and, based on the rotation direction, the initial rotation speed, and the rotation acceleration, the virtual lens is controlled to rotate in the rotation direction and, when it no longer rotates in the rotation direction, to rotate toward the current capture direction until its capture direction is the current capture direction. Because a rotation acceleration toward the current capture direction is configured for the virtual lens, the virtual lens is pulled back to the original capture direction under the influence of that acceleration, which solves the problems that the attack orientation must be manually adjusted after each shot and that the operation is cumbersome.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1a is a system schematic diagram of a method for controlling a virtual lens according to an embodiment of the present application.
Fig. 1b is a schematic flowchart of a method for controlling a virtual lens according to an embodiment of the present application.
Fig. 1c is a schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a virtual lens control apparatus according to an embodiment of the present application.
fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the present application provides a virtual lens control method and apparatus, a storage medium, and a computer device. Specifically, the virtual lens control method of the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal may be a device such as a smartphone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and may further run a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN, and big data and artificial intelligence platforms.
For example, when the virtual lens control method runs on the terminal, the terminal device stores a game application and presents part of the game scene through a display component. The terminal device interacts with the user through a graphical user interface, for example by downloading, installing, and running the game application. The terminal device may provide the graphical user interface to the user in a variety of ways; for example, the graphical user interface may be rendered on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen, configured to present the graphical user interface, including a game screen, and to receive operation instructions generated by the user acting on the graphical user interface, and a processor, configured to run the game, generate the graphical user interface, respond to the operation instructions, and control display of the graphical user interface on the touch display screen.
For example, when the virtual lens control method runs on a server, the game may be a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the running mode of a cloud game, the body that runs the game application is separated from the body that presents the game screen: the storage and running of the virtual lens control method are completed on the cloud game server, while the game screen is presented by a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting the game screen; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the device that processes the game data is the cloud game server in the cloud. When playing, the user operates the cloud game client to send operation instructions to the cloud game server; the server runs the game according to the operation instructions, encodes and compresses data such as the game screen, returns the data to the cloud game client over the network, and the cloud game client finally decodes the data and outputs the game screen.
Referring to fig. 1a, fig. 1a is a system schematic diagram of the virtual lens control method according to an embodiment of the present application. The system may include at least one terminal 1000, at least one computer device 2000, at least one database 3000, and a network 4000. A terminal 1000 held by a user can be connected to the computer devices of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing the software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch-sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch-sensitive display screens. In addition, when the system includes multiple terminals 1000, multiple computer devices 2000, and multiple networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different computer devices 2000. The network 4000 may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, or a 5G network. In addition, different terminals 1000 can also connect to other terminals or to a computer device using their own Bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 and connected and synchronized with each other through a suitable network to support multiplayer gaming. In addition, the system can include multiple databases 3000 coupled to different computer devices 2000, and information related to the game environment can be continuously stored in the databases 3000 while different users play the multiplayer game online.
The embodiment of the present application provides a virtual lens control method, which may be executed by a terminal or a server. The embodiment of the present application is described with the method executed by the terminal as an example. The terminal includes a display component and a processor, where the display component is configured to present a graphical user interface and receive operation instructions generated by the user acting on the display component. When the user operates the graphical user interface through the display component, the graphical user interface can control local content of the terminal by responding to the received operation instructions, and can also control content of the peer computer device by responding to the received operation instructions. For example, the operation instructions generated by the user acting on the graphical user interface include an instruction to launch the game application, and the processor is configured to launch the game application after receiving the user's instruction to launch it. Further, the processor is configured to render and draw the game-related graphical user interface on a touch display screen. The touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at multiple points on the screen. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, different virtual objects in the game's graphical user interface are controlled to perform the actions corresponding to the touch operation.
For example, the game may be any of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, a first-person shooter (FPS) game, and the like. The game may include a virtual scene drawn on the graphical user interface. Further, the virtual scene of the game may include one or more virtual objects controlled by the user (or player), such as virtual characters. In addition, the virtual scene of the game may also include one or more obstacles, such as railings, ravines, and walls, to limit the movement of the virtual objects, for example to limit the movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, and energy, to provide assistance to the player, provide virtual services, increase points related to the player's performance, and so on. The graphical user interface may also present one or more indicators to provide instructional information to the player. For example, the game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, the one or more other virtual objects are controlled by other players of the game. For example, the one or more other virtual objects may be computer-controlled, such as robots using artificial intelligence (AI) algorithms, to implement a human-machine combat mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve goals; for example, a virtual object may possess one or more weapons, props, or tools that can be used to eliminate other objects from the game. Such skills or capabilities may be activated by the player of the game using one of a plurality of preset touch operations on the touch display screen of the terminal, and the processor may be configured to present the corresponding game screen in response to the operation instructions generated by the user's touch operations.
It should be noted that the system schematic diagram of the virtual lens control system shown in fig. 1a is only an example. The virtual lens control system and the scenario described in the embodiment of the present application are intended to illustrate the technical solution of the embodiment more clearly and do not constitute a limitation on it; as those of ordinary skill in the art know, the technical solution provided in the embodiment of the present application is equally applicable to similar technical problems as the virtual lens control system evolves and new service scenarios emerge.
In this embodiment, the description is given from the perspective of a virtual lens control apparatus, which may be specifically integrated in a computer device that has a storage unit, is equipped with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a schematic flowchart of a virtual lens control method according to an embodiment of the present application. The virtual lens control method includes the following steps:
in step 101, a graphical user interface is displayed, where the graphical user interface includes at least a part of a virtual scene and a virtual character located in the virtual scene, and the graphical user interface is an interface displayed by capturing the virtual scene and the virtual character through a virtual lens associated with the virtual character.
The graphical user interface displays a three-dimensional virtual scene of the game. The three-dimensional virtual scene is provided by the application when it runs on the terminal, and may be a simulation of the real world, a semi-simulated semi-fictional scene, or a purely fictional scene. The scene picture displayed on the graphical user interface is the picture presented when observing the three-dimensional virtual scene through the virtual lens. The user controls a virtual object in the game scene through the terminal, and the virtual object observes the three-dimensional virtual scene through the virtual lens. For example, in an FPS game, when the virtual lens is at the first-person perspective, it is located at the head or neck of the target virtual object, and only the arms of the virtual character are displayed in the graphical user interface; when the virtual lens is at the third-person perspective, it is located behind the target virtual object, and only the upper body of the virtual character is displayed in the graphical user interface. The graphical user interface is the scene picture presented by observing the three-dimensional virtual scene through the virtual lens at a certain viewing angle.
Specifically, please refer to fig. 1c, which is a schematic diagram of a graphical user interface provided in an embodiment of the present application. The graphical user interface is presented on the screen of the computer device 2000 and includes: a virtual character 10 manipulated by the user, where the virtual character 10 is configured with a virtual firearm that fires; a firing aiming identifier 20 for assisting the virtual firearm in aiming; a direction control 30 for prompting the user with the current direction information of the virtual character 10; a movement control 40 for controlling the virtual character 10 to move in the three-dimensional virtual environment; an aiming control 50 that the virtual character 10 can use in an attack; a map control 60 for prompting the user with the position of the virtual character 10 in the three-dimensional virtual environment; and a firing control 70 for controlling the virtual character 10 to fire the firearm in the three-dimensional virtual environment. An indication control 31 is further disposed in the direction control 30 to indicate, within the direction control 30, the direction in which the virtual character 10 is facing. It is understood that the graphical user interface may include not only the above identifiers and controls but also other functional controls or identifiers, determined by the specific game content, which is not limited here.
In step 102, the current capture direction of the virtual lens is obtained each time the virtual character fires.
When the user clicks the firing control, or controls the virtual character through physical keys such as a mouse and keyboard to attack with the configured virtual firearm, the current capture direction of the virtual lens is obtained. The current capture direction is the aiming direction of the virtual firearm before the virtual bullet is fired.
In step 103, a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction are assigned to the virtual lens.
The recoil effect of the virtual lens refers to the lens-shake rule produced when the virtual firearm fires, which changes the orientation of the lens. The positions of the virtual firearm and the firing aiming identifier are not affected by this lens effect and remain unchanged at the center of the screen. The basic principle is as follows: when triggered, the virtual lens is given a rotation direction and starts to move at an initial rotation speed toward that direction, while a rotation acceleration opposite to the rotation direction is applied to pull the virtual lens back toward its original orientation. This produces the effect that the virtual lens first rotates in the rotation direction and then rotates back toward the current capture direction.
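As a minimal illustrative sketch only (not part of the patent claims), the per-shot state described above can be written as follows; the class name, field names, and numeric values are all assumptions:

```python
import random

class RecoilKick:
    """Per-shot recoil state assigned to the virtual lens (illustrative)."""

    def __init__(self, capture_direction_deg: float):
        # Aiming direction of the lens before the bullet is fired; the recoil
        # must eventually return the lens to this direction.
        self.capture_direction_deg = capture_direction_deg
        # Rotation direction assigned for this shot (+1 = one side, -1 = other).
        self.rotation_direction = random.choice((-1.0, 1.0))
        # Initial rotation speed toward the rotation direction (deg/s, assumed).
        self.initial_speed = 40.0
        # Rotation acceleration pulling the lens back toward the current capture
        # direction, i.e. opposite to the rotation direction (deg/s^2, assumed).
        self.acceleration = 160.0
```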
In step 104, the virtual lens is controlled to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, it is controlled to rotate toward the current capture direction until its capture direction is the current capture direction.
To prevent the capture direction of the virtual lens from returning to the initial capture direction too quickly during continuous fire, when the next shot is fired while the virtual lens is rotating toward the current capture direction, the product of the initial rotation speed and a specified value is calculated, and the result is used as the actual initial rotation speed of the next shot.
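A sketch of this continuous-fire adjustment; the scale factor is a hypothetical stand-in for the "specified value", which the text does not quantify:

```python
CONTINUOUS_FIRE_SCALE = 1.2  # hypothetical "specified value"

def next_initial_speed(base_speed: float, lens_is_returning: bool) -> float:
    # If the next shot lands while the lens is still rotating back toward the
    # current capture direction, scale the assigned initial rotation speed so
    # the capture direction does not snap back too quickly during a burst.
    return base_speed * CONTINUOUS_FIRE_SCALE if lens_is_returning else base_speed
```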
Specifically, in order to design the trajectory formed by the virtual firearm's continuous fire, the rotation direction configured for the virtual lens for each shot in a continuous burst may be preset, for example first rotating to the left and then rotating to the right, so as to form a customized trajectory, as sketched below.
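A sketch of such a preset per-shot direction sequence; the pattern values are purely illustrative:

```python
# Hypothetical per-shot rotation directions: the burst first kicks left twice,
# then right twice, forming a customized recoil trajectory.
SPRAY_PATTERN = (+1.0, +1.0, -1.0, -1.0, +1.0)

def rotation_direction_for_shot(shot_index: int) -> float:
    return SPRAY_PATTERN[shot_index % len(SPRAY_PATTERN)]
```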
Because the player can autonomously control the capture direction of the virtual lens while firing, the screen-slide value produced while the player autonomously controls the capture direction can be recorded, converted into a rotation angle of the virtual lens, and superimposed, so that the capture direction after firing is not misaligned with the current capture direction.
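A sketch of this superposition, assuming a hypothetical pixels-to-degrees sensitivity:

```python
SLIDE_TO_DEGREES = 0.1  # hypothetical sensitivity: degrees per slid pixel

def apply_player_slide(return_target_deg: float, slide_pixels: float) -> float:
    # Convert the recorded screen-slide value into a lens rotation angle and
    # superimpose it on the return target, so the lens settles on the direction
    # the player has steered to rather than the stale pre-shot direction.
    return return_target_deg + slide_pixels * SLIDE_TO_DEGREES
```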
After the rotation direction, the initial rotation speed, and the rotation acceleration are assigned to the virtual lens, the virtual lens is controlled to rotate in the rotation direction at the initial rotation speed. During this rotation the initial rotation speed is affected by the reverse rotation acceleration, so the rotation speed gradually decreases; when the rotation speed in the rotation direction reaches 0, the lens rotates toward the current capture direction under the influence of the rotation acceleration. In this way, as the virtual firearm fires, the virtual lens first moves away from the current capture direction and then approaches it again.
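The dynamics just described can be integrated per tick as in the following sketch (it reuses the hypothetical RecoilKick fields from the earlier sketch):

```python
def step_recoil(kick, offset_deg: float, speed_deg_s: float, dt: float):
    """Advance the recoil one tick; returns (offset, speed, finished)."""
    # The reverse acceleration constantly pulls toward the capture direction,
    # so the speed in the kick direction decays to 0 and then reverses.
    speed_deg_s -= kick.rotation_direction * kick.acceleration * dt
    offset_deg += speed_deg_s * dt
    # Finished once the lens is moving back and reaches the capture direction.
    returning = kick.rotation_direction * speed_deg_s < 0.0
    if returning and kick.rotation_direction * offset_deg <= 0.0:
        return 0.0, 0.0, True
    return offset_deg, speed_deg_s, False

# Usage (assumed values): integrate from the moment of the shot.
# kick = RecoilKick(capture_direction_deg=0.0)
# offset, speed, done = 0.0, kick.rotation_direction * kick.initial_speed, False
# while not done:
#     offset, speed, done = step_recoil(kick, offset, speed, dt=1 / 60)
```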
In some embodiments, the step of controlling the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and controlling the virtual lens to rotate toward the current capture direction when it no longer rotates in the rotation direction, until the capture direction of the virtual lens is the current capture direction, includes:
(1) calculating the first-frame rotation speed of the first shooting-effect frame in a shooting-effect calculation period based on the initial rotation speed and the rotation acceleration;
(2) determining the total rotation angle of the shooting-effect calculation period according to the first-frame rotation speed;
(3) determining, based on the frame rate of the client, the rotation angle corresponding to each display frame of the virtual lens within the duration of a shooting-effect frame;
(4) controlling the virtual lens to rotate with the display of each display frame according to the rotation angle corresponding to that display frame, until the capture direction of the virtual lens is the current capture direction.
The shooting-effect calculation period may be set to 1/30 second. Every 1/30 second, the first-frame rotation speed of the first shooting-effect frame in the period is calculated based on the initial rotation speed and the rotation acceleration. Based on the first-frame rotation speed, the total rotation angle by which the virtual lens can rotate from the current shooting-effect calculation period to the next one is obtained.
Specifically, since the rotation angle of the virtual lens during actual display is determined by the frame rate of the player's client, and the frame rate of the client is greater than 30 frames per second, there are effectively multiple display frames within one shooting-effect calculation period. Therefore, the rotation angle of the virtual lens corresponding to each display frame within the duration of a shooting-effect frame must be determined according to the frame rate of the client. After the rotation angle corresponding to each display frame is calculated, the virtual lens is controlled to rotate with the display of each display frame according to that angle, until the capture direction of the virtual lens is the current capture direction.
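A sketch of steps (1) and (2); the 1/30 s period is from the text, while the linear speed-decay model and the helper names are assumptions:

```python
EFFECT_PERIOD = 1.0 / 30.0  # shooting-effect calculation period, in seconds

def first_frame_speed(initial_speed: float, acceleration: float,
                      period_index: int) -> float:
    # Speed at the first shooting-effect frame of the given period, assuming
    # the reverse rotation acceleration reduces the speed linearly over time.
    return initial_speed - acceleration * period_index * EFFECT_PERIOD

def period_total_angle(frame_speed: float) -> float:
    # Total angle the lens can rotate from this shooting-effect calculation
    # period to the next one when moving at the first-frame speed.
    return frame_speed * EFFECT_PERIOD
```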
In some embodiments, the step of determining, based on the frame rate of the client, the rotation angle corresponding to each display frame of the virtual lens within the duration of a shooting-effect frame includes:
(1) determining the display time difference between the current display frame and the previous display frame;
(2) comparing the display time difference with the shooting-effect calculation period to obtain a comparison result;
(3) determining, based on the comparison result, the rotation angle corresponding to each display frame of the virtual lens within the duration of a shooting-effect frame.
The rotation angle corresponding to each display frame is calculated as follows. Taking the current display frame as an example, the display time difference between the current display frame and the previous display frame is determined and compared with the shooting-effect calculation period. The purpose of the comparison is this: if the display time difference is smaller than the shooting-effect calculation period, the frame rate of the client is greater than 30 frames per second, meaning that multiple display frames exist within one shooting-effect calculation period; if the display time difference equals the shooting-effect calculation period, the frame rate of the client equals 30 frames per second, meaning that only the first shooting-effect frame exists within the period; and if the display time difference is greater than the shooting-effect calculation period, the client is currently dropping frames. Different comparison results lead to different ways of determining the rotation angle corresponding to the current display frame.
In some embodiments, the step of determining, based on the comparison result, the rotation angle corresponding to each display frame of the virtual lens within the duration of a shooting-effect frame includes:
(1.1) if the comparison result is that the display time difference is smaller than the shooting-effect calculation period, calculating the ratio of the display time difference to the shooting-effect calculation period to obtain an allocation ratio;
(1.2) determining, according to the allocation ratio, the rotation angle corresponding to the virtual lens in the current display frame;
(1.3) judging whether the current display frame is the last display frame in the shooting-effect calculation period in which it is located;
(1.4) if not, determining the display frame after the current display frame as the new current display frame and the former current display frame as the previous display frame, then returning to the step of determining the display time difference between the current display frame and the previous display frame, until the current display frame is the last display frame in its shooting-effect calculation period, so as to obtain the rotation angle of the virtual lens corresponding to each frame.
If the comparison result is that the display time difference is smaller than the shooting-effect calculation period, the frame rate of the client is greater than 30 frames per second and multiple display frames exist in each shooting-effect calculation period, so the ratio of the display time difference to the period is calculated to obtain the allocation ratio, and the rotation angle corresponding to the virtual lens in the current display frame is determined according to it. After the rotation angle corresponding to the current display frame is obtained, it is judged whether the current display frame is the last display frame in the period. If not, display frames remain in the current period, so the next display frame becomes the new current display frame, the former current display frame becomes the previous display frame, and the step of determining the display time difference is executed in a loop until the current display frame is the last one in the period, yielding the rotation angle corresponding to each frame of the virtual lens. If the current display frame is the last display frame in the shooting-effect calculation period, the rotation angle of every display frame in the period has been determined, the current period ends, and the process of calculating the rotation angle corresponding to each display frame continues in the next shooting-effect calculation period.
In some embodiments, the step of determining the rotation angle of the virtual lens in the current display frame according to the allocation ratio includes:
(1.1) obtaining the target total rotation angle corresponding to the shooting-effect calculation period in which the current display frame is located;
(1.2) calculating the product of the allocation ratio and the target total rotation angle to obtain the rotation angle of the virtual lens corresponding to the current display frame.
Because the current shooting-effect calculation period contains multiple display frames, the rotation angle corresponding to the current display frame is obtained by multiplying the total rotation angle corresponding to the current period by the allocation ratio of the current display frame within that period.
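A sketch of this case, reusing the hypothetical EFFECT_PERIOD constant from the earlier sketch:

```python
EFFECT_PERIOD = 1.0 / 30.0  # shooting-effect calculation period (from the text)

def frame_angle_within_period(display_dt: float, total_angle: float) -> float:
    # Display time difference smaller than the period: several display frames
    # share one period, and each rotates its time-proportional share.
    allocation_ratio = display_dt / EFFECT_PERIOD
    return allocation_ratio * total_angle

# E.g. at 60 fps the display time difference is 1/60 s, the allocation ratio
# is 0.5, and each display frame rotates half of the period's total angle.
```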
In some embodiments, the method further includes:
(1.1) if the comparison result is that the display time difference equals the shooting-effect calculation period, obtaining the target total rotation angle corresponding to the shooting-effect calculation period in which the current display frame is located;
(1.2) calculating the product of a preset value and the target total rotation angle to obtain the rotation angle corresponding to the virtual lens in the current display frame.
If the display time difference equals the shooting-effect calculation period, the frame rate of the client equals 30 frames per second, meaning that only the first shooting-effect frame exists within the period and there are no other display frames: with each display frame shown, the client passes through exactly one shooting-effect calculation period. The rotation angle corresponding to the current display frame is therefore the product of the preset value (1) and the target total rotation angle.
In some embodiments, the method further includes:
(1) if the comparison result is that the display time difference is greater than the shooting-effect calculation period, obtaining the current display frame number and the previous display frame number of the current display frame and the previous display frame within the shooting-effect calculation period in which they are located;
(2) determining the number of lost display frames based on the previous display frame number and the current display frame number;
(3) obtaining the rotation angle corresponding to the previous display frame, and calculating the product of that rotation angle and the number of lost display frames to obtain a first rotation angle;
(4) determining, based on the current display frame number, the lost display frame one frame before the current display frame;
(5) determining, based on that lost display frame, the rotation angle corresponding to the virtual lens in the current display frame.
If the comparison result is that the display time difference is greater than the shooting-effect calculation period, the client is currently dropping frames. The frame numbers of the current display frame and the previous display frame within their shooting-effect calculation period are obtained. For example, the current display frame is the 3rd frame of its period, while the previous frame actually displayed is the 1st frame. The number of lost display frames, here 1 frame (the 2nd frame), is determined from the previous and current frame numbers. Because the rotation angle corresponding to the previous display frame was already calculated when it was displayed, and the rotation angles of the display frames within one shooting-effect calculation period are the same, the rotation angle of the previous display frame is obtained and multiplied by the number of lost display frames to obtain the first rotation angle by which the lost frames should have rotated; for example, if the 2nd frame is lost and the 1st frame rotated 2°, the 2nd frame should also rotate 2°.
Specifically, after the first rotation angle corresponding to the lost frames is calculated, the rotation angle by which the current display frame should rotate must still be calculated: the lost display frame one frame before the current display frame is determined according to the current display frame number, and the rotation angle corresponding to the virtual lens in the current display frame is determined based on that lost display frame.
In some embodiments, the step of determining, based on the lost display frame, the rotation angle corresponding to the virtual lens in the current display frame includes:
(1.1) determining the target initial rotation speed corresponding to the lost display frame based on the initial rotation speed and the rotation acceleration;
(1.2) determining, according to the target initial rotation speed, the total rotation angle from the time point corresponding to the lost display frame to the next shooting-effect calculation period;
(1.3) calculating the ratio of the display time difference between the current display frame and the lost display frame to the shooting-effect calculation period to obtain a target allocation ratio;
(1.4) calculating the product of the target allocation ratio and the total rotation angle to obtain a second rotation angle;
(1.5) calculating the sum of the first rotation angle and the second rotation angle to obtain the rotation angle corresponding to the virtual lens in the current display frame.
Since the frame number of the lost display frame is known, the target initial rotation speed corresponding to the lost display frame is determined based on the initial rotation speed and the rotation acceleration; the total rotation angle from the time point corresponding to the lost display frame to the next shooting-effect calculation period is determined according to the target initial rotation speed; the ratio of the display time difference between the current display frame and the lost display frame to the shooting-effect calculation period is calculated to obtain the target allocation ratio; the product of the target allocation ratio and the total rotation angle is calculated to obtain the second rotation angle; and the sum of the first rotation angle and the second rotation angle gives the rotation angle corresponding to the virtual lens in the current display frame.
For example, if the frame number of the lost display frame is 3, the target initial rotation speed corresponding to the lost display frame is determined from the initial rotation speed and the rotation acceleration to be 5, and from this the total rotation angle from the lost display frame to the next shooting-effect calculation period is calculated to be 30°. If the target allocation ratio is one third, the second rotation angle is 30° × 1/3 = 10°; with a first rotation angle of 5°, the rotation angle corresponding to the current display frame is 10° + 5° = 15°.
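A sketch of the frame-loss case that reproduces the worked example above; the function and parameter names are assumptions:

```python
import math

EFFECT_PERIOD = 1.0 / 30.0  # shooting-effect calculation period (from the text)

def frame_angle_after_loss(prev_frame_angle: float, lost_frame_count: int,
                           lost_frame_total_angle: float,
                           dt_since_lost_frame: float) -> float:
    # First rotation angle: each lost frame rotates by the same angle as the
    # previous display frame within the same shooting-effect period.
    first_angle = prev_frame_angle * lost_frame_count
    # Second rotation angle: the current frame's time-proportional share of
    # the total angle from the lost frame to the next period.
    target_ratio = dt_since_lost_frame / EFFECT_PERIOD
    second_angle = target_ratio * lost_frame_total_angle
    return first_angle + second_angle

# Worked example from the text: total angle 30 deg, target ratio 1/3, and a
# first rotation angle of 5 deg give 10 deg + 5 deg = 15 deg.
assert math.isclose(
    frame_angle_after_loss(5.0, 1, 30.0, EFFECT_PERIOD / 3), 15.0)
```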
As can be seen from the above, in the embodiment of the present application, a graphical user interface is displayed, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character; the current capture direction of the virtual lens is obtained each time the virtual character fires; a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction are assigned to the virtual lens; and, based on the rotation direction, the initial rotation speed, and the rotation acceleration, the virtual lens is controlled to rotate in the rotation direction and, when it no longer rotates in the rotation direction, to rotate toward the current capture direction until its capture direction is the current capture direction. Because a rotation acceleration toward the current capture direction is configured for the virtual lens, the virtual lens is pulled back to the original capture direction under the influence of that acceleration, which solves the problems that the attack orientation must be manually adjusted after each shot and that the operation is cumbersome.
In order to better implement the virtual lens control method provided in the embodiments of the present application, an embodiment of the present application further provides an apparatus based on the above virtual lens control method. The terms have the same meanings as in the above control method, and for specific implementation details, reference may be made to the description in the method embodiments.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a virtual lens control apparatus according to an embodiment of the present application. The apparatus may include a display module 301, an obtaining module 302, an allocating module 303, a control module 304, and so on.
The display module 301 is configured to display a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and the graphical user interface is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character;
the obtaining module 302 is configured to obtain the current capture direction of the virtual lens each time the virtual character fires;
the allocating module 303 is configured to assign to the virtual lens a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction;
the control module 304 is configured to control the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, to control the virtual lens to rotate toward the current capture direction until the capture direction of the virtual lens is the current capture direction.
In some embodiments, the control module 304 includes:
a first calculation submodule, configured to calculate the first-frame rotation speed of the first shooting-effect frame in a shooting-effect calculation period based on the initial rotation speed and the rotation acceleration;
a first determining submodule, configured to determine the total rotation angle of the shooting-effect calculation period according to the first-frame rotation speed;
a second determining submodule, configured to determine, based on the frame rate of the client, the rotation angle corresponding to each display frame of the virtual lens within the duration of a shooting-effect frame;
a rotation submodule, configured to control the virtual lens to rotate with the display of each display frame according to the rotation angle corresponding to that display frame, until the capture direction of the virtual lens is the current capture direction.
In some embodiments, the second determining submodule includes:
a first determining unit, configured to determine the display time difference between the current display frame and the previous display frame;
a comparison unit, configured to compare the display time difference with the shooting-effect calculation period to obtain a comparison result;
a second determining unit, configured to determine, based on the comparison result, the rotation angle corresponding to each display frame of the virtual lens within the duration of a shooting-effect frame.
In some embodiments, the second determining unit includes:
a first calculating subunit, configured to, if the comparison result is that the display time difference is smaller than the shooting-effect calculation period, calculate the ratio of the display time difference to the shooting-effect calculation period to obtain an allocation ratio;
a first determining subunit, configured to determine, according to the allocation ratio, the rotation angle corresponding to the virtual lens in the current display frame;
a judging subunit, configured to judge whether the current display frame is the last display frame in the shooting-effect calculation period in which it is located;
an execution subunit, configured to, if not, determine the display frame after the current display frame as the new current display frame and the former current display frame as the previous display frame, and return to the step of determining the display time difference between the current display frame and the previous display frame, until the current display frame is the last display frame in its shooting-effect calculation period, so as to obtain the rotation angle of the virtual lens corresponding to each frame.
In some embodiments, the first determining subunit is configured to:
obtain the target total rotation angle corresponding to the shooting-effect calculation period in which the current display frame is located;
calculate the product of the allocation ratio and the target total rotation angle to obtain the rotation angle of the virtual lens corresponding to the current display frame.
In some embodiments, the second determining unit further includes:
a first obtaining subunit, configured to, if the comparison result is that the display time difference equals the shooting-effect calculation period, obtain the target total rotation angle corresponding to the shooting-effect calculation period in which the current display frame is located;
a second calculating subunit, configured to calculate the product of a preset value and the target total rotation angle to obtain the rotation angle of the virtual lens corresponding to the current display frame.
In some embodiments, the second determining unit further includes:
a second obtaining subunit, configured to, if the comparison result is that the display time difference is greater than the shooting-effect calculation period, obtain the current display frame number and the previous display frame number of the current display frame and the previous display frame within the shooting-effect calculation period in which they are located;
a second determining subunit, configured to determine the number of lost display frames based on the previous display frame number and the current display frame number;
a third obtaining subunit, configured to obtain the rotation angle corresponding to the previous display frame and calculate the product of that rotation angle and the number of lost display frames to obtain a first rotation angle;
a third determining subunit, configured to determine, based on the current display frame number, the lost display frame one frame before the current display frame;
a fourth determining subunit, configured to determine, based on that lost display frame, the rotation angle corresponding to the virtual lens in the current display frame.
In some embodiments, the fourth determining subunit is configured to:
determine the target initial rotation speed corresponding to the lost display frame based on the initial rotation speed and the rotation acceleration;
determine, according to the target initial rotation speed, the total rotation angle from the time point corresponding to the lost display frame to the next shooting-effect calculation period;
calculate the ratio of the display time difference between the current display frame and the lost display frame to the shooting-effect calculation period to obtain a target allocation ratio;
calculate the product of the target allocation ratio and the total rotation angle to obtain a second rotation angle;
calculate the sum of the first rotation angle and the second rotation angle to obtain the rotation angle corresponding to the virtual lens in the current display frame.
As can be seen from the above, in the embodiment of the present application, the display module 301 displays a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character; the obtaining module 302 obtains the current capture direction of the virtual lens each time the virtual character fires; the allocating module 303 assigns to the virtual lens a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction; and the control module 304 controls the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, controls it to rotate toward the current capture direction until its capture direction is the current capture direction. Because a rotation acceleration toward the current capture direction is configured for the virtual lens, the virtual lens is pulled back to the original capture direction under the influence of that acceleration, which solves the problems that the attack orientation must be manually adjusted after each shot and that the operation is cumbersome.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Correspondingly, the embodiment of the present application further provides a computer device, which may be a terminal or a server; the terminal may be a device such as a smartphone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA). As shown in fig. 3, fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 2000 includes a processor 401 with one or more processing cores, a memory 402 with one or more computer-readable storage media, and a computer program stored in the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device structure shown in the figure does not constitute a limitation on the computer device; it may include more or fewer components than shown, combine certain components, or arrange the components differently.
The processor 401 is the control center of the computer device 2000. It connects the various parts of the entire computer device 2000 using various interfaces and lines, and performs the various functions of the computer device 2000 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the computer device 2000 as a whole.
In this embodiment, the processor 401 in the computer device 2000 loads instructions corresponding to the processes of one or more applications into the memory 402 according to the following steps, and the processor 401 runs the applications stored in the memory 402, thereby implementing various functions:
displaying a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and the graphical user interface is displayed from the virtual scene and the virtual character captured through a virtual lens associated with the virtual character; obtaining the current capture direction of the virtual lens each time the virtual character fires; assigning to the virtual lens a rotation direction, an initial rotation speed toward the rotation direction, and a rotation acceleration toward the current capture direction; and controlling the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, controlling the virtual lens to rotate toward the current capture direction until the capture direction of the virtual lens is the current capture direction.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Optionally, as shown in fig. 3, the computer device 2000 further includes: a touch display screen 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to each of the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407. Those skilled in the art will appreciate that the structure shown in fig. 3 does not limit the computer device, which may include more or fewer components than illustrated, combine certain components, or use a different arrangement of components.
The touch display screen 403 may be used to display a graphical user interface and to receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and to generate the corresponding operation instructions, according to which the corresponding programs are executed. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401, and it can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to realize the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions; in that case, the touch display screen 403 may also serve as part of the input unit 406 to implement the input function.
In the embodiment of the present application, the processor 401 executes a game application to generate a graphical user interface on the touch display screen 403, where the virtual scene in the graphical user interface includes at least one function control or wheel control. The touch display screen 403 is used to present the graphical user interface and to receive operation instructions generated by the user acting on the graphical user interface.
The radio frequency circuit 404 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another computer device and to exchange signals with that network device or computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 405 receives and converts into audio data. The audio data is then output to the processor 401 for processing and sent through the radio frequency circuit 404 to, for example, another computer device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (for example, fingerprint, iris, or facial information), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to supply power to the various components of the computer device 2000. Optionally, the power supply 407 may be logically connected to the processor 401 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 407 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, or any other such component.
Although not shown in fig. 3, the computer device 2000 may further include a camera, a sensor, a wireless fidelity module, a Bluetooth module, and the like, which are not described in detail here.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in a given embodiment, reference may be made to the related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment displays a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and is an interface displayed by acquiring the virtual scene and the virtual character through a virtual lens associated with the virtual character; acquires the current acquisition direction of the virtual lens each time the virtual character shoots; allocates to the virtual lens a rotation direction, an initial rotation speed for rotating in the rotation direction, and a rotation acceleration for rotating toward the current acquisition direction; and controls the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, controls the virtual lens to rotate toward the current acquisition direction until the acquisition direction of the virtual lens is the current acquisition direction. In this way, because the virtual lens is configured with a rotation acceleration that points back toward the current acquisition direction, the acceleration pulls the acquisition direction back to where it was before the shot, which avoids the cumbersome operation of manually re-adjusting the attack orientation after every shot.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, where the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium storing a plurality of computer programs that can be loaded by a processor to perform the steps of any method for controlling a virtual lens provided in the embodiments of the present application. For example, the computer programs may perform the following steps:
displaying a graphical user interface, where the graphical user interface includes at least part of a virtual scene and a virtual character located in the virtual scene, and is an interface displayed by acquiring the virtual scene and the virtual character through a virtual lens associated with the virtual character; acquiring the current acquisition direction of the virtual lens each time the virtual character shoots; allocating to the virtual lens a rotation direction, an initial rotation speed for rotating in the rotation direction, and a rotation acceleration for rotating toward the current acquisition direction; and controlling the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and, when the virtual lens no longer rotates in the rotation direction, controlling the virtual lens to rotate toward the current acquisition direction until the acquisition direction of the virtual lens is the current acquisition direction.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
The storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
Since the computer programs stored in the storage medium can perform the steps of any method for controlling a virtual lens provided in the embodiments of the present application, they can achieve the beneficial effects achievable by any such method, which are detailed in the foregoing embodiments and are not repeated here.
The method, apparatus, storage medium, and computer device for controlling a virtual lens provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (11)

1. A method for controlling a virtual lens, comprising:
displaying a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene and a virtual character located in the virtual scene, and the graphical user interface is an interface displayed by acquiring the virtual scene and the virtual character through a virtual lens associated with the virtual character;
acquiring a current acquisition direction of the virtual lens each time the virtual character shoots;
allocating, to the virtual lens, a rotation direction, an initial rotation speed for rotating in the rotation direction, and a rotation acceleration for rotating toward the current acquisition direction;
and controlling the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and controlling, when the virtual lens no longer rotates in the rotation direction, the virtual lens to rotate toward the current acquisition direction until the acquisition direction of the virtual lens is the current acquisition direction.
2. The method for controlling a virtual lens according to claim 1, wherein the step of controlling the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and controlling, when the virtual lens no longer rotates in the rotation direction, the virtual lens to rotate toward the current acquisition direction until the acquisition direction of the virtual lens is the current acquisition direction comprises:
calculating a first frame rotation speed of the first shooting effect frame in a shooting effect calculation period based on the initial rotation speed and the rotation acceleration;
determining a total rotation angle of the shooting effect calculation period according to the first frame rotation speed;
determining, based on a frame rate of a client, a rotation angle corresponding to the virtual lens in each display frame within the duration of one shooting effect frame;
and controlling the virtual lens to rotate, as each display frame is displayed, by the rotation angle corresponding to that display frame, until the acquisition direction of the virtual lens is the current acquisition direction.
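As a rough illustration of the calculation in this claim, the sketch below separates the fixed-rate shooting effect calculation from the client's variable display frames. The period length, the sign convention for the acceleration, and the function names are assumptions introduced here, not the claimed formulas.

```python
# Sketch of claim 2; all constants and names are illustrative assumptions.
EFFECT_PERIOD = 1.0 / 30.0  # shooting effect calculation period, in seconds

def first_frame_speed(initial_speed: float, accel: float) -> float:
    # Rotation speed of the first shooting effect frame in the period;
    # accel is taken negative here when it opposes the kick direction.
    return initial_speed + accel * EFFECT_PERIOD

def period_total_angle(frame_speed: float) -> float:
    # Total rotation angle covered during one shooting effect calculation
    # period, assuming the speed holds for the whole period.
    return frame_speed * EFFECT_PERIOD

def per_display_frame_angle(total_angle: float, client_fps: float) -> float:
    # Spread the period's total angle over the display frames the client
    # renders within one shooting effect frame duration.
    frames_per_period = max(1, round(client_fps * EFFECT_PERIOD))
    return total_angle / frames_per_period
```

Decoupling the effect calculation from the display frame rate in this way is what lets the same recoil curve play back consistently on clients running at different frame rates.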
3. The method for controlling a virtual lens according to claim 2, wherein the step of determining, based on the frame rate of the client, the rotation angle corresponding to the virtual lens in each display frame within the duration of one shooting effect frame comprises:
determining a display time difference between a current display frame and a previous display frame;
comparing the display time difference with the shooting effect calculation period to obtain a comparison result;
and determining, based on the comparison result, the rotation angle corresponding to the virtual lens in each display frame within the duration of one shooting effect frame.
4. The method for controlling a virtual lens according to claim 3, wherein the step of determining, based on the comparison result, the rotation angle corresponding to each display frame within the duration of one shooting effect frame comprises:
if the comparison result is that the display time difference is smaller than the shooting effect calculation period, calculating a ratio of the display time difference to the shooting effect calculation period to obtain a distribution ratio value;
determining, according to the distribution ratio value, the rotation angle corresponding to the virtual lens in the current display frame;
judging whether the current display frame is the last display frame in the shooting effect calculation period in which it is located;
and if not, taking the next display frame of the current display frame as the new current display frame and the current display frame as the new previous display frame, and returning to the step of determining the display time difference between the current display frame and the previous display frame, until the current display frame is the last display frame in the shooting effect calculation period in which it is located, thereby obtaining the rotation angle corresponding to the virtual lens in each display frame.
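One possible reading of the loop in this claim is sketched below: each display frame's share of the period's angle is its display time difference divided by the calculation period, and the roles of "current" and "previous" frame advance until the period ends. The function name and input format are assumptions.

```python
# Sketch of the frame-advance loop of claim 4 (names are assumptions).
def distribute_over_period(display_times, effect_period, total_angle):
    """display_times: timestamps of the display frames in one period."""
    angles = []
    previous_t = display_times[0]
    for current_t in display_times[1:]:
        dt = current_t - previous_t          # display time difference
        ratio = dt / effect_period           # distribution ratio value
        angles.append(ratio * total_angle)   # share of the period's angle
        previous_t = current_t               # current frame becomes previous
    return angles
```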
5. The method for controlling a virtual lens according to claim 4, wherein the step of determining, according to the distribution ratio value, the rotation angle corresponding to the virtual lens in the current display frame comprises:
acquiring a target total rotation angle corresponding to the shooting effect calculation period in which the current display frame is located;
and calculating a product of the distribution ratio value and the target total rotation angle to obtain the rotation angle corresponding to the virtual lens in the current display frame.
6. The method for controlling a virtual lens according to claim 4, further comprising:
if the comparison result is that the display time difference is equal to the shooting effect calculation period, acquiring the target total rotation angle corresponding to the shooting effect calculation period in which the current display frame is located;
and calculating a product of a preset value and the target total rotation angle to obtain the rotation angle corresponding to the virtual lens in the current display frame.
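Claims 5 and 6 can be read as two branches of the same per-frame computation, sketched below under the assumption that the "preset value" is 1 (i.e. the whole period angle is applied when exactly one display frame spans the period). The names and the exact-equality branch are illustrative only.

```python
# Sketch of claims 5 and 6 (the preset value of 1.0 is an assumption).
def frame_rotation(dt_display, effect_period, target_total_angle):
    if dt_display < effect_period:
        # Claim 5: distribution ratio value x target total rotation angle.
        return (dt_display / effect_period) * target_total_angle
    if dt_display == effect_period:
        # Claim 6: preset value x target total rotation angle.
        return 1.0 * target_total_angle
    # dt_display > effect_period means display frames were lost; that case
    # is handled by claims 7 and 8, see the sketch after claim 8.
    raise ValueError("display frames were lost; see the lost-frame sketch")
```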
7. The method for controlling a virtual lens according to claim 4, further comprising:
if the comparison result is that the display time difference is greater than the shooting effect calculation period, acquiring a current display frame sequence number and a previous display frame sequence number of the current display frame and the previous display frame in the shooting effect calculation period;
determining a number of lost display frames based on the previous display frame sequence number and the current display frame sequence number;
acquiring the rotation angle corresponding to the previous display frame, and calculating a product of the rotation angle corresponding to the previous display frame and the number of lost display frames to obtain a first rotation angle;
determining, based on the current display frame sequence number, the lost display frame one frame before the current display frame;
and determining, based on the lost display frame, the rotation angle corresponding to the virtual lens in the current display frame.
8. The method for controlling a virtual lens according to claim 7, wherein the step of determining, based on the lost display frame, the rotation angle corresponding to the virtual lens in the current display frame comprises:
determining a target initial rotation speed corresponding to the lost display frame based on the initial rotation speed and the rotation acceleration;
determining, according to the target initial rotation speed, a total rotation angle from a time point corresponding to the lost display frame to the next shooting effect calculation period;
calculating a ratio of the display time difference between the current display frame and the lost display frame to the shooting effect calculation period to obtain a target distribution ratio value;
calculating a product of the target distribution ratio value and the total rotation angle to obtain a second rotation angle;
and calculating a sum of the first rotation angle and the second rotation angle to obtain the rotation angle corresponding to the virtual lens in the current display frame.
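To round out the picture, the sketch below combines the catch-up angle of claim 7 with the partial angle of claim 8. The kinematic formula used for the lost frame's speed, the parameter names, and the function signature are all assumptions, not the claimed computation itself.

```python
# Sketch of claims 7 and 8 (all names and formulas are assumptions).
def lost_frame_rotation(prev_seq, cur_seq, prev_frame_angle,
                        initial_speed, accel, effect_period,
                        dt_current_to_lost):
    # Claim 7: number of lost display frames from the sequence numbers,
    # and a first rotation angle that catches up on what they missed.
    lost_count = cur_seq - prev_seq - 1
    first_angle = prev_frame_angle * lost_count

    # Claim 8: target initial rotation speed at the lost frame's time point
    # (simple kinematics assumed), the total angle from that point to the
    # next shooting effect calculation period, and the share of that angle
    # that belongs to the current display frame.
    lost_frame_speed = initial_speed + accel * effect_period
    total_angle = lost_frame_speed * effect_period
    target_ratio = dt_current_to_lost / effect_period
    second_angle = target_ratio * total_angle

    return first_angle + second_angle
```

The effect of this compensation is that a client that drops frames still ends each shooting effect calculation period with the same accumulated rotation as a client that rendered every frame.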
9. An apparatus for controlling a virtual lens, comprising:
a display module, configured to display a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene and a virtual character located in the virtual scene, and the graphical user interface is an interface displayed by acquiring the virtual scene and the virtual character through a virtual lens associated with the virtual character;
an obtaining module, configured to obtain a current acquisition direction of the virtual lens each time the virtual character shoots;
an allocating module, configured to allocate, to the virtual lens, a rotation direction, an initial rotation speed for rotating in the rotation direction, and a rotation acceleration for rotating toward the current acquisition direction;
and a control module, configured to control the virtual lens to rotate in the rotation direction based on the rotation direction, the initial rotation speed, and the rotation acceleration, and to control, when the virtual lens no longer rotates in the rotation direction, the virtual lens to rotate toward the current acquisition direction until the acquisition direction of the virtual lens is the current acquisition direction.
10. A computer-readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the method for controlling a virtual lens according to any one of claims 1 to 8.
11. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method for controlling a virtual lens according to any one of claims 1 to 8.
CN202211738216.9A 2022-12-30 2022-12-30 Virtual lens control method and device, storage medium and computer equipment Pending CN115970282A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211738216.9A CN115970282A (en) 2022-12-30 2022-12-30 Virtual lens control method and device, storage medium and computer equipment


Publications (1)

Publication Number Publication Date
CN115970282A 2023-04-18

Family

ID=85962287


Country Status (1)

Country Link
CN (1) CN115970282A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination