CN110732135B - Virtual scene display method and device, electronic equipment and storage medium - Google Patents

Virtual scene display method and device, electronic equipment and storage medium

Info

Publication number
CN110732135B
Authority
CN
China
Prior art keywords
visual angle
force
virtual object
target
adsorption
Prior art date
Legal status (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Active
Application number
CN201910992462.9A
Other languages
Chinese (zh)
Other versions
CN110732135A (en)
Inventor
刘智洪
Current Assignee (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910992462.9A
Publication of CN110732135A
Application granted
Publication of CN110732135B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user

Abstract

The application discloses a virtual scene display method and apparatus, an electronic device, and a storage medium, belonging to the field of computer technology. The method comprises the following steps: when a viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of a target virtual object, acquiring, according to the motion state of the target virtual object, the adsorption force borne by the viewing angle and the viewing angle adjustment force corresponding to the operation; acquiring the target rotation speed of the viewing angle of the virtual scene according to the adsorption force and the adjustment force; and, while controlling the viewing angle to rotate at the target rotation speed, displaying the virtual scene as it changes with the rotation. Because the motion state of the target virtual object is taken into account, the forces borne by the viewing angle can be acquired from that state whenever the conditions for providing the aim-assist service are met, so good aim assistance is available even while the virtual object is moving. The displayed virtual scene therefore matches user expectations, meets user needs, and yields a good display effect.

Description

Virtual scene display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for displaying a virtual scene, an electronic device, and a storage medium.
Background
With the development of computer technology and the diversification of terminal functions, more and more games can be played on terminals. Shooting games are a popular category; in such a game, an aiming point is generally displayed at the center of the terminal screen, and the user can change the area the aiming point targets by adjusting the viewing angle of the virtual scene, thereby adjusting the currently displayed virtual scene.
Currently, a typical virtual scene display method works as follows: when a target virtual object is detected and the aiming point is located in the adsorption area of that object, the sensitivity of the user's viewing angle adjustment operation is changed, so as to assist the user in moving the aiming point onto the target virtual object.
This method merely rescales the user's operation, so the assistance is weak: if the target virtual object moves, the user still has difficulty moving the aiming point onto the target's body and cannot strike it accurately. The displayed virtual scene therefore fails to match user expectations, user needs go unmet, and the display effect is poor.
Disclosure of Invention
The embodiments of the present application provide a virtual scene display method and apparatus, an electronic device, and a storage medium, which can solve the problems in the related art that user needs are not met and the display effect is poor. The technical solution is as follows:
in one aspect, a method for displaying a virtual scene is provided, where the method includes:
when visual angle adjusting operation is detected and an aiming point is located in an adsorption area of a target virtual object, acquiring adsorption force borne by the visual angle and visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object, wherein the adsorption force is used for moving the aiming point to the target virtual object;
acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force;
and in the process of controlling the visual angle to rotate according to the target rotating speed, displaying the virtual scene which changes along with the rotation of the visual angle.
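The three claimed steps above can be sketched in a few lines. This is an illustrative assumption, not the patent's actual implementation: the forces are modeled as signed scalars along one screen axis, their resultant is mapped linearly to a rotation speed, and all names (`target_rotation_speed`, `speed_per_unit_force`) are hypothetical.

```python
# Hedged sketch of the claimed flow: combine the adsorption force pulling
# the aiming point toward the target with the user's viewing angle
# adjustment force, then derive the viewing angle's target rotation speed.
# The linear force-to-speed mapping is an assumption for illustration.

def target_rotation_speed(adsorption_force: float,
                          adjustment_force: float,
                          speed_per_unit_force: float = 0.5) -> float:
    """Signed forces along one screen axis -> rotation speed (deg/s)."""
    resultant = adsorption_force + adjustment_force
    return resultant * speed_per_unit_force


# Forces in the same direction reinforce; opposed forces cancel.
assisted = target_rotation_speed(2.0, 4.0)   # adsorption helps the turn
opposed = target_rotation_speed(-1.5, 1.5)   # adsorption resists the turn
```

When the two forces oppose each other exactly, the resultant and hence the rotation speed are zero, which matches the intuition that assistance can slow the aiming point as it crosses the target.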
In one aspect, a virtual scene display apparatus is provided, the apparatus including:
the acquisition module is used for acquiring the adsorption force borne by a visual angle and the visual angle adjustment force corresponding to the visual angle adjustment operation according to the motion state of a target virtual object when the visual angle adjustment operation is detected and an aiming point is positioned in the adsorption area of the target virtual object, wherein the adsorption force is used for moving the aiming point to the target virtual object;
the acquisition module is further used for acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force;
and the display module is used for displaying the virtual scene which changes along with the rotation of the visual angle in the process of controlling the visual angle to rotate according to the target rotation speed.
In one possible implementation, the obtaining module is configured to:
when the visual angle adjusting operation is detected, acquiring an adsorption area of the target virtual object;
emitting rays from the position of the aiming point along the current visual angle;
and when the ray passes through the adsorption area, acquiring the adsorption force borne by the aiming point and the visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object.
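The ray check described above can be sketched as a ray-versus-region intersection test. For simplicity the adsorption region is modeled here as a sphere (the description also mentions cylindrical and ellipsoidal regions); the function name and the geometry are illustrative assumptions, not the patent's implementation.

```python
# Hedged sketch: cast a ray from the aiming point along the current viewing
# direction and report whether it passes through the target's adsorption
# region, here approximated by a sphere of the given radius.

def ray_hits_adsorption_region(origin, direction, center, radius):
    """Return True if origin + t*direction (t >= 0) enters the sphere."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = center
    # Vector from the ray origin to the region's center.
    vx, vy, vz = cx - ox, cy - oy, cz - oz
    d_len_sq = dx * dx + dy * dy + dz * dz
    # Ray parameter of the point on the ray closest to the center.
    t = (vx * dx + vy * dy + vz * dz) / d_len_sq
    if t < 0:
        return False  # region lies behind the aiming direction
    # Distance from that closest point to the center decides the hit.
    px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
    dist_sq = (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2
    return dist_sq <= radius * radius
```

A game engine would normally delegate this to its physics raycast against a collider placed around the target; the explicit math above only illustrates the pass-through test the step describes.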
In a possible implementation manner, the obtaining module is configured to take, as the adsorption force borne by the viewing angle, whichever of the first adsorption force and the second adsorption force has the larger value.
In a possible implementation manner, the obtaining module is configured to obtain, when the target virtual object is stationary, a second adsorption force applied to the viewing angle as an adsorption force applied to the viewing angle, where the second adsorption force is used to assist the aiming point to move to the target virtual object.
In one possible implementation, the obtaining module is configured to perform any one of:
when the damping force borne by the visual angle is larger than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene;
and when the resultant force of the damping force borne by the visual angle and the adsorption force is greater than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene.
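The two branches above can be sketched together: when the resisting force (damping alone, or damping plus adsorption) exceeds the viewing angle adjustment force, the target rotation speed is zero; otherwise the surplus force drives the rotation. The names and the surplus-to-speed factor are assumptions for illustration.

```python
# Hedged sketch of the zero-speed branches described above: resistance
# greater than the user's adjustment force pins the viewing angle in place.

def damped_rotation_speed(adjust_force: float,
                          damping_force: float,
                          adsorption_force: float = 0.0,
                          speed_per_unit_force: float = 0.5) -> float:
    resisting = damping_force + adsorption_force
    if resisting > adjust_force:
        return 0.0  # resistance wins: the viewing angle does not rotate
    # Otherwise only the surplus of the adjustment force produces rotation.
    return (adjust_force - resisting) * speed_per_unit_force
```

Passing `adsorption_force=0.0` gives the first branch (damping alone); a positive value gives the second (resultant of damping and adsorption).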
In one possible implementation manner, the obtaining module is further configured to:
when the end of the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the target virtual object moves, the adsorption force borne by the visual angle is acquired;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force borne by the visual angle.
In a possible implementation manner, the obtaining module is configured to: when the viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in a first state, execute the step of obtaining the adsorption force borne by the viewing angle and the viewing angle adjustment force corresponding to the operation according to the motion state of the target virtual object.
In one possible implementation manner, the obtaining module is further configured to:
when the viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in a second state, obtain the viewing angle adjustment force corresponding to the operation;
and acquiring the target rotation speed of the visual angle of the virtual scene according to the visual angle adjusting force.
In one aspect, an electronic device is provided and includes one or more processors and one or more memories having at least one program code stored therein, the program code being loaded and executed by the one or more processors to implement operations performed by the virtual scene display method.
In one aspect, a computer-readable storage medium having at least one program code stored therein is provided, the program code being loaded and executed by a processor to implement the operations performed by the virtual scene display method.
In the embodiments of the present application, the motion state of the target virtual object is taken into account. When the conditions for providing the aim-assist service are met, the forces borne by the viewing angle, comprising the adsorption force and the viewing angle adjustment force, can be acquired according to that motion state; the target rotation speed of the viewing angle, and hence how the virtual scene is displayed, is then determined from those forces. Good aim assistance can thus be provided even while the virtual object is moving, so the displayed virtual scene matches user expectations, meets user needs, and yields a good display effect.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described here show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a virtual scene display method provided in an embodiment of the present application;
fig. 2 is a flowchart of a virtual scene display method provided in an embodiment of the present application;
FIG. 3 is a schematic view of an adsorption zone provided in an embodiment of the present application;
FIG. 4 is a schematic view of an adsorption zone provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a radiation detection process provided by an embodiment of the present application;
FIG. 6 is a schematic view of the adsorption force provided by an embodiment of the present application;
FIG. 7 is a schematic illustration of a damping force provided by an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating a relationship between a damping force and a viewing angle adjustment force according to an embodiment of the present disclosure;
FIG. 9 is a schematic view of a magnetic adsorption process provided in an embodiment of the present application;
FIG. 10 is a schematic view of a magnetic adsorption process provided in an embodiment of the present application;
FIG. 11 is a schematic view of a rotation angle of a viewing angle provided by an embodiment of the present application;
fig. 12 is a schematic structural diagram of a virtual scene display apparatus according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
The terms "first," "second," and the like in this application are used for distinguishing between similar items and items that have substantially the same function or similar functionality, and it should be understood that "first," "second," and "nth" do not have any logical or temporal dependency or limitation on the number or order of execution.
The term "at least one" in this application means one or more, and "a plurality" means two or more; for example, a plurality of first locations means two or more first locations.
Hereinafter, terms related to the present application are explained.
Virtual scene: is a virtual scene that is displayed (or provided) by an application program when the application program runs on a terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene, which is not limited in this application. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
Virtual object: refers to a movable object in a virtual environment. The movable object can be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil drums, walls, stones, etc. displayed in the virtual environment. The virtual object may be an avatar in the virtual scene that is virtual to represent the user. The virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene and occupying a portion of the space in the virtual scene. Alternatively, the virtual Character may be a Character controlled by an operation on the client, an Artificial Intelligence (AI) set in the virtual environment battle by training, or a Non-Player Character (NPC) set in the virtual environment battle. Optionally, the virtual character is a virtual character that competes in a virtual environment. Optionally, the number of virtual characters in the virtual environment battle may be preset, or may be dynamically determined according to the number of clients joining the battle.
Taking a shooting game as an example, the user may control the virtual object to fall freely, glide, or open a parachute while descending through the sky of the virtual scene; to run, jump, crawl, or move forward in a crouch on land; or to swim, float, or dive in the sea. The user may also control the virtual object to move through the virtual scene by riding a virtual vehicle; these are merely examples, and the embodiments of the present application are not limited thereto. The user can also control the virtual object to fight other virtual objects with a virtual prop, which may simulate either a cold weapon or a hot weapon; this application does not specifically limit the prop.
The viewing angle adjustment operation referred to in the present application may include various operation modes. In one possible implementation manner, the viewing angle adjusting operation may be a sliding operation, the terminal detects the sliding operation, and the rotation direction, the rotation angle, and the rotation speed of the viewing angle corresponding to the sliding operation may be determined based on the sliding direction, the sliding distance, and the sliding speed of the sliding operation. For example, the sliding direction of the sliding operation may correspond to the rotation direction of the viewing angle, and the sliding distance of the sliding operation may be positively correlated with the rotation angle of the viewing angle, and of course, the sliding speed of the sliding operation may also be positively correlated with the rotation speed of the viewing angle.
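The sliding-operation mapping just described (slide direction sets rotation direction, slide distance positively correlated with rotation angle) can be sketched as a linear map. The sensitivity constant and function name are illustrative assumptions; a real game would expose the factor as a user setting.

```python
# Hedged sketch: map a slide gesture, measured in screen pixels, to a
# (yaw, pitch) rotation of the viewing angle in degrees. Direction follows
# the slide direction; magnitude grows with slide distance.

def slide_to_rotation(slide_dx: float, slide_dy: float,
                      sensitivity: float = 0.1):
    """Pixels of slide -> degrees of viewing-angle rotation."""
    return slide_dx * sensitivity, slide_dy * sensitivity
```

Slide speed could be mapped to rotation speed the same way, with a second positive factor, as the paragraph above notes.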
In another possible implementation manner, the view angle adjusting operation may also be a pressing operation, specifically, a control area may be preset on the terminal, the user may perform the pressing operation in the control area, and when the terminal detects the pressing operation in the control area, the terminal may determine a rotation direction, a rotation speed, and a rotation angle of the view angle corresponding to the pressing operation based on a specific position of the pressing operation relative to the control area, a pressing force degree of the pressing operation, and a pressing time. For example, a direction of the pressing operation with respect to the center of the control region may correspond to a rotation direction of the viewing angle, a pressing force of the pressing operation may be positively correlated with a rotation speed of the viewing angle, and a pressing time of the pressing operation may be positively correlated with a rotation angle of the viewing angle.
In another possible implementation manner, the angle-of-view adjustment operation may also be a rotation operation on the terminal, and when an angular velocity sensor (e.g., a gyroscope) in the terminal detects the rotation operation, the rotation direction, the rotation angle, and the rotation speed of the angle of view may be determined according to the rotation direction, the rotation angle, and the rotation speed of the rotation operation. For example, the rotation direction of the rotation operation may be a rotation direction of a viewing angle, the rotation angle of the rotation operation may be positively correlated with the rotation angle of the viewing angle, and the rotation speed of the rotation operation may be positively correlated with the rotation speed of the viewing angle. In practical application, the angle-of-view adjustment operation may also be a key operation or a toggle operation on a real joystick device, and the like, which is not specifically limited in this application.
Of course, when controlling the virtual object, the user may also achieve different control effects by combining the above viewing angle adjustment operations. For example, when the viewing angle adjustment operation is a sliding operation, the terminal may detect the pressing force applied during the slide and decide whether to shoot based on whether that force exceeds a preset threshold. The above is merely an exemplary illustration; this application does not limit how these operations are combined or which control effects are achieved in a specific implementation.
Hereinafter, a system architecture according to the present application will be described.
Referring to fig. 1, fig. 1 is a schematic diagram of an implementation environment of a virtual scene display method according to an embodiment of the present disclosure. The implementation environment includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 is installed and operated with an application program supporting a virtual scene. The application program may be any one of a virtual reality application program, a three-dimensional map program, a First-person shooter game (FPS), a Multiplayer Online Battle Arena game (MOBA), and a Multiplayer gunfight type survival game. The first terminal 120 is a terminal used by a first user, and the first user uses the first terminal 120 to operate a first virtual object located in a virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 140 is used to provide background services for applications that support virtual scenarios. Alternatively, the server 140 undertakes primary computational work and the first and second terminals 120, 160 undertake secondary computational work; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 is installed and operated with an application program supporting a virtual scene. The application program can be any one of a virtual reality application program, a three-dimensional map program, an FPS, an MOBA and a multi-player gun battle type survival game. The second terminal 160 is a terminal used by a second user, and the second user uses the second terminal 160 to operate a second virtual object located in the virtual scene for activities, including but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated persona or an animated persona.
The second terminal 160 is connected to the server 140 through a wireless network or a wired network.
Optionally, the first virtual object controlled by the first terminal 120 and the second virtual object controlled by the second terminal 160 are in the same virtual scene. In some embodiments, the first virtual object and the second virtual object may be in an adversary relationship, for example, the first virtual object and the second virtual object may belong to different teams and organizations, and the first terminal 120 may control the first virtual object to attack the second virtual object. In other embodiments, the first virtual object and the second virtual object may be in a teammate relationship, for example, the first virtual character and the second virtual character may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to another; this embodiment is illustrated only with the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include at least one of: a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, and a desktop computer. For example, the first terminal 120 and the second terminal 160 may be smartphones or other handheld portable gaming devices. The following embodiments are illustrated with a terminal that is a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of a virtual scene display method provided in an embodiment of the present application, and referring to fig. 2, this embodiment illustrates that the method is applied to a terminal, which may be the first terminal 120 shown in fig. 1, and the method includes:
201. when detecting the operation of adjusting the view angle, the terminal detects whether a virtual scene includes a target virtual object, and when the virtual scene includes the target virtual object, the step 202 is executed; when the target virtual object is not included in the virtual scene, step 205 is performed.
The user can perform a viewing angle adjustment operation on the terminal to adjust the viewing angle of the virtual scene. Because the aiming point is usually located at the center of the screen, adjusting the viewing angle also adjusts the position in the virtual scene that the aiming point corresponds to. When aiming and shooting at a virtual object in the virtual scene, the user can therefore adjust the aiming position and attack placement of the currently controlled virtual object through this operation, achieving an accurate strike on the target.
In the embodiment of the application, when the user performs the above-mentioned visual angle adjustment operation, an auxiliary aiming service may be provided to assist the user to quickly move an aiming point to a virtual object to be aimed at, and if the virtual object is moving, corresponding auxiliary adjustment may be performed according to the moving state of the virtual object to reduce the operation difficulty of the user. Furthermore, when the view angle adjusting operation is detected, the terminal may detect whether the target virtual object is included in the virtual scene, so as to determine whether the auxiliary aiming service needs to be provided.
It is understood that if the virtual scene does not include a target virtual object, that is, no other virtual object exists within the field of view of the currently controlled virtual object, then there is no target to aim at or shoot. The viewing angle adjustment operation is then simply the user adjusting the view, not aiming, so no aim-assist service needs to be provided; the following steps 205 and 206 may be performed to adjust the viewing angle directly based on the operation.
If the virtual scene includes a target virtual object, that is, another virtual object is within the field of view of the currently controlled virtual object and the user may want to aim at it, a determination can be made as to how the aim-assist service should be provided; specifically, the following step 202 may be executed.
The target virtual object may be any virtual object other than the currently controlled virtual object. The virtual scene may display one or more virtual objects, and the terminal may take the virtual object closest to the aiming point as the target virtual object. Of course, the terminal may also use a virtual object selected by the user as the target virtual object.
In one possible implementation, the currently controlled virtual object may also be grouped with other virtual objects as virtual objects in the same team, and generally the currently controlled virtual object does not need to aim or shoot at a virtual object in the same team, and thus the target virtual object may also be any virtual object different from the team to which the currently controlled virtual object belongs. The embodiment of the present application does not limit the specific determination method of the target virtual object.
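The two selection rules described above, closest to the aiming point and not on the controlled object's team, can be sketched as a filter plus a minimum. The dict layout and field names are assumptions, not the patent's data model.

```python
# Hedged sketch: among displayed virtual objects on a different team from
# the currently controlled object, pick the one whose screen position is
# closest to the aiming point; return None when no candidate exists.

def pick_target_virtual_object(objects, aiming_point, own_team):
    ax, ay = aiming_point
    enemies = [o for o in objects if o["team"] != own_team]
    if not enemies:
        return None  # no target: plain viewing-angle adjustment applies
    return min(
        enemies,
        key=lambda o: (o["screen_pos"][0] - ax) ** 2
                      + (o["screen_pos"][1] - ay) ** 2,
    )
```

The user-selected variant mentioned above would simply bypass this function and take the tapped object directly.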
202. The terminal acquires the adsorption area of the target virtual object, and executes step 203 when the aiming point is located in the adsorption area of the target virtual object, and executes step 205 when the aiming point is located outside the adsorption area of the target virtual object.
After determining that the target virtual object exists in the field of view of the currently controlled virtual object, the terminal may further determine whether the target virtual object meets the conditions for auxiliary aiming. In deciding whether to provide the auxiliary aiming service, the terminal may determine the distance between the aiming point and the target virtual object.
It can be understood that if auxiliary aiming were provided even when the aiming point is far away from the target virtual object, adsorbing the aiming point to the vicinity of the target virtual object, the fairness of the electronic game would be compromised and the user's operation would lose its significance. Therefore, the target virtual object may be provided with an adsorption area, and auxiliary aiming is provided only when the aiming point is located within that area, reducing the complexity of the user's operation while preserving the fairness of the electronic game.
The terminal may first perform step 202 to obtain the adsorption area of the target virtual object and then determine whether the aiming point is located within it. If so, the following step 203 may be performed to provide auxiliary aiming; if not, the following step 205 may be performed to adjust the visual angle directly according to the visual angle adjustment operation.
The adsorption region may be a region around the target virtual object, for example a region of a target size centered on the target virtual object. The target size may be preset by a person skilled in the art. As shown in fig. 3 and fig. 4, the adsorption region may be a cylindrical region around the target virtual object, or an ellipsoidal region; the shape of the adsorption region is not limited in the embodiments of the present application.
In one possible implementation, the target size may also be determined based on the distance between the currently controlled virtual object and the target virtual object. Accordingly, this step 202 may be: the terminal acquires the adsorption area of the target virtual object according to the distance between the currently controlled virtual object and the target virtual object, wherein the size of the adsorption area is positively correlated with the distance — the larger the distance, the larger the adsorption area, and the smaller the distance, the smaller the adsorption area. When the currently controlled virtual object is far from the target virtual object, the display size of the target virtual object is small, but the display size of its adsorption area is not correspondingly small, so the user can still easily move the aiming point into the adsorption area through the visual angle adjustment operation, achieving the auxiliary aiming effect.
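A minimal sketch of this positively correlated sizing might look as follows; the linear growth, the `base_radius`/`scale` coefficients, and the `max_radius` cap are illustrative assumptions, not values from the embodiment:

```python
def adsorption_radius(distance, base_radius=0.5, scale=0.02, max_radius=3.0):
    """Radius of the adsorption area around the target virtual object.

    The radius grows with the distance between the currently controlled
    virtual object and the target, so a far-away target, though drawn small
    on screen, still offers a reasonably large area for the aiming point to
    enter; a cap keeps the area from growing without bound.
    """
    return min(base_radius + scale * distance, max_radius)
```

Under these assumed coefficients, a target 100 units away would get a radius of 2.5, while a very distant target is capped at 3.0.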
The terminal may determine the positional relationship between the aiming point and the adsorption area in a plurality of ways. In one possible implementation manner, the terminal may employ ray detection: the terminal emits a ray from the position of the aiming point along the current visual angle. When the ray passes through the adsorption area, the terminal determines that the aiming point is located within the adsorption area, performs the following step 203, and acquires the adsorption force borne by the visual angle and the visual angle adjustment force corresponding to the visual angle adjustment operation according to the motion state of the target virtual object. When the ray does not pass through the adsorption area, the terminal determines that the aiming point is located outside the adsorption area, and step 205 may be performed. For example, as shown in fig. 5, the adsorption area of the target virtual object may be a collision box: a ray is emitted from the position of the aiming point, and when the ray collides with the collision box of the target virtual object, the aiming point is located within the adsorption area.
In another possible implementation manner, the terminal may determine whether an intersection exists according to the position of the aiming point and the position of the adsorption area in the virtual scene. For example, the aiming point may correspond to a straight line in the virtual scene, and the adsorption area to an ellipsoid; if the coordinate ranges of the two intersect, the aiming point is determined to be located within the adsorption area, and otherwise outside it.
The foregoing provides only two determination manners; the terminal may also determine the positional relationship between the aiming point and the adsorption area in other manners, which is not limited in this embodiment of the application.
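As a sketch of the ray-detection manner, a standard slab test against an axis-aligned collision box can decide whether the ray fired from the aiming point along the current visual angle passes through the adsorption area. The box representation is an assumption here, since the embodiment also allows cylindrical or ellipsoidal adsorption areas:

```python
def ray_hits_box(origin, direction, box_min, box_max):
    """Return True if a ray from `origin` along `direction` passes through
    the axis-aligned box [box_min, box_max] (the target's collision box)."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:
            # Ray parallel to this slab: it must already lie between the faces.
            if o < lo or o > hi:
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:
                return False
    return True
```

Game engines typically expose this as a built-in ray cast against a collider, so in practice the engine's physics query would be used rather than hand-rolled math.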
203. The terminal acquires, according to the motion state of the target virtual object, the adsorption force borne by the visual angle and the visual angle adjustment force corresponding to the visual angle adjustment operation, wherein the adsorption force is used to move the aiming point toward the target virtual object.
In the embodiment of the application, the terminal can provide different auxiliary aiming services for different motion states of the target virtual object. In step 203, the terminal may first acquire the motion state of the target virtual object and then analyze the forces applied to the visual angle according to that state, so as to determine how to adjust the visual angle.
The motion state of the target virtual object may include two types: moving and stationary. Accordingly, the process by which the terminal acquires the adsorption force borne by the visual angle in step 203 may include the following two cases:
Case one: when the target virtual object is moving, a first adsorption force and a second adsorption force borne by the visual angle are acquired, wherein the first adsorption force is used for controlling the aiming point to follow the target virtual object, and the second adsorption force is used for assisting the aiming point in moving toward the target virtual object; the terminal may take either the first adsorption force or the second adsorption force as the adsorption force borne by the visual angle.
In case one, since the aiming point is located within the adsorption area, the terminal can provide the second adsorption force to assist the user in aiming at the target virtual object; and since the target virtual object is moving, the terminal can provide the first adsorption force to help the user aim at it more quickly. If the two forces were superposed, the auxiliary force would be too large: the aiming point would easily move onto the target virtual object and just as quickly move past it, making accurate aiming impossible. Therefore, the terminal may take only one of the two forces as the adsorption force. For example, as shown in fig. 6, the adsorption force is directed from the aiming point toward the position of the target virtual object.
In one possible implementation, when selecting one of the two forces, the terminal may take the force with the larger value as the adsorption force, thereby better assisting the user's operation. Specifically, the terminal may take the larger of the first adsorption force and the second adsorption force as the adsorption force borne by the visual angle.
Case two: when the target virtual object is stationary, the terminal acquires the second adsorption force borne by the visual angle as the adsorption force borne by the visual angle, the second adsorption force being used to assist the aiming point in moving toward the target virtual object.
In case two, since the target virtual object is stationary, the terminal does not need to provide the first adsorption force for following it, and naturally no selection between the two forces is needed; the second adsorption force used for auxiliary aiming may be taken directly as the adsorption force borne by the visual angle.
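Cases one and two can be condensed into a single selection rule. The sketch below assumes the "larger value" variant of case one and treats the forces as scalar magnitudes:

```python
def select_adsorption_force(target_moving, first_force, second_force):
    """Adsorption force borne by the visual angle.

    Moving target: take the larger of the first (follow) and second (assist)
    adsorption forces rather than their sum, so the combined assist does not
    overshoot the target. Stationary target: only the second force applies.
    """
    if target_moving:
        return max(first_force, second_force)
    return second_force
```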
The above steps 201 to 203 constitute the process of acquiring, when the visual angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, the adsorption force borne by the visual angle and the visual angle adjustment force corresponding to the visual angle adjustment operation according to the motion state of the target virtual object. Taking the motion state into account allows a more appropriate auxiliary aiming service to be provided for the user.
In a possible implementation manner, the auxiliary aiming service may not be provided even when the aiming point is located within the adsorption area, depending on the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located.
Specifically, the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, may be divided into two states: a first state and a second state.
When the visual angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, if the currently controlled virtual object is in the first state, the terminal can provide auxiliary aiming, and step 203 may be executed to acquire the adsorption force borne by the visual angle and the visual angle adjustment force corresponding to the visual angle adjustment operation according to the motion state of the target virtual object.
For example, when the currently controlled virtual object is standing or lying on the ground or on a building in the virtual scene, is in an environment in which aiming and shooting are possible, or has not been eliminated and can still compete in the virtual scene, the terminal can provide the auxiliary aiming service.
When the visual angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, if the currently controlled virtual object is in the second state, the terminal may not provide auxiliary aiming; the terminal may execute step 205, acquire the visual angle adjustment force corresponding to the visual angle adjustment operation, and obtain the target rotation speed of the visual angle of the virtual scene according to that force.
For example, when the currently controlled virtual object is falling or in flight in the virtual scene, is enveloped in the smoke of a smoke bomb, or has been eliminated and can no longer compete in the virtual scene, the terminal may not provide the auxiliary aiming service.
204. The terminal acquires the target rotation speed of the visual angle of the virtual scene according to the adsorption force and the visual angle adjustment force.
After acquiring the adsorption force and the visual angle adjustment force, the terminal can determine the target rotation speed of the visual angle from the two forces.
In a possible implementation manner, the terminal may first obtain the resultant of the adsorption force and the visual angle adjustment force, and determine the rotation direction and rotation speed of the visual angle according to the resultant force. In another possible implementation manner, the terminal may obtain a rotation speed corresponding to each of the two forces, and take the combination of the two rotation speeds as the target rotation speed. The rotation speeds and the target rotation speed are all vectors, having both direction and magnitude.
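The first implementation — form the resultant force, then map it to a rotation speed — might be sketched like this, treating forces and speeds as 2-D (yaw, pitch) vectors; the linear gain `k` is an assumed force-to-speed mapping that the embodiment does not specify:

```python
def target_rotation_speed(adsorption_force, adjustment_force, k=1.0):
    """Resultant of the two force vectors, scaled into an angular-velocity
    vector (yaw, pitch); its direction gives the rotation direction of the
    visual angle and its magnitude the rotation speed."""
    fx = adsorption_force[0] + adjustment_force[0]
    fy = adsorption_force[1] + adjustment_force[1]
    return (k * fx, k * fy)
```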
In a possible implementation manner, the terminal may also provide different auxiliary aiming services depending on the direction of the visual angle adjustment operation. For example, when the operation moves the aiming point toward the target virtual object, the terminal may perform steps 203 and 204 above. When the operation moves the aiming point away from the target virtual object, the terminal may, in addition to step 203, acquire a force opposite to the direction of the operation as a damping force borne by the visual angle. Step 204 is then performed correspondingly as: the terminal acquires the target rotation speed of the visual angle of the virtual scene according to the adsorption force and damping force borne by the visual angle and the visual angle adjustment force.
When the aiming point is close to the target virtual object, an excessively high moving speed of the aiming point makes misoperation likely, causing the aiming point to move quickly away from the target. A damping effect can therefore be provided, supplying a reaction force that reduces the rotation speed of the visual angle. Providing assisting or reaction forces in this way lets the user master the operation more easily and adjust the visual angle better for aiming.
A further setting may be provided for the damping force, the adsorption force, and the visual angle adjustment force: when the damping force borne by the visual angle is greater than the visual angle adjustment force, the terminal takes zero as the target rotation speed of the visual angle of the virtual scene. Thus, when the user moves the aiming point away from the target virtual object, the damping force cannot cause the aiming point to move in the opposite direction, which better respects the user's operation and meets the user's needs.
For example, as shown in fig. 7, the direction of the damping force may be opposite to the direction of the user's visual angle adjustment operation. As shown in fig. 8, when the damping force (the generated reaction force) is greater than the visual angle adjustment force (for example, the force generated by dragging the mouse), the aiming point cannot be moved away from the target virtual object; when the damping force is smaller than the visual angle adjustment force, the aiming point can be moved away.
In a possible implementation manner, it may also be set that when the resultant of the damping force and the adsorption force borne by the visual angle is greater than the visual angle adjustment force, the terminal takes zero as the target rotation speed of the visual angle of the virtual scene.
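Both zero-speed settings — damping force alone, or the resultant of damping and adsorption forces, exceeding the visual angle adjustment force — reduce to the same scalar clamp along the operation direction. The linear residual mapping via `k` is an assumption:

```python
def away_rotation_speed(adjust_force, damping_force, adsorption_force=0.0, k=1.0):
    """Rotation speed while the operation moves the aiming point away from
    the target: zero whenever the opposing forces are at least as large as
    the adjustment force, so the damping can stall the drag but never pulls
    the aiming point backwards against the user's operation."""
    opposing = damping_force + adsorption_force
    if opposing >= adjust_force:
        return 0.0
    return k * (adjust_force - opposing)
```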
In a specific possible embodiment, if the user's visual angle adjustment operation ends while the target virtual object is moving, the aiming point may continue to follow the target virtual object until it moves onto it. Specifically, when the end of the visual angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, if the target virtual object is moving, the terminal may acquire the adsorption force borne by the visual angle and obtain the target rotation speed of the visual angle of the virtual scene accordingly.
205. The terminal acquires the visual angle adjustment force corresponding to the visual angle adjustment operation, and acquires the target rotation speed of the visual angle of the virtual scene according to that force.
Unlike step 203, when auxiliary aiming is not needed the terminal may determine the target rotation speed of the visual angle directly from the visual angle adjustment force corresponding to the visual angle adjustment operation. That is, when the user performs the visual angle adjustment operation, the visual angle rotates normally according to the operation.
206. In the process of controlling the visual angle to rotate according to the target rotation speed, the terminal displays a virtual scene that changes as the visual angle rotates.
Through the above steps, after acquiring the target rotation speed of the visual angle, the terminal can control the visual angle to rotate at that speed. As the visual angle rotates, the virtual scene observed through it changes, and during this process the terminal can display the changing virtual scene in the graphical user interface.
Two specific examples are provided below. The first explains the adsorption force described above, which, as shown in fig. 9, may be referred to as magnetic adsorption; the magnetic adsorption process may include the following steps:
Step one: detect whether the player (the user of the current terminal) has a screen input operation. Magnetic adsorption takes effect only while the player is operating, so the next detection is performed after the player touches the screen.
Step two: judge whether the current state of the player (the currently controlled virtual object) allows adsorption. The player cannot generate magnetic adsorption while falling, dying, or under a smoke bomb; in other states, adsorption can be generated.
Step three: when the current state is judged to allow magnetic adsorption, judge whether the current aiming point is aimed at the target's collision box. A ray may be emitted from the muzzle (aiming point) position to detect the target's collision box, and magnetic adsorption can be generated when the collision box of a target is detected.
Step four: acquire the player's current position and the muzzle position, and calculate the angle through which the current direction must rotate to reach the target central axis. The target central axis is the center of the adsorption area, that is, the target center position; magnetic adsorption is complete when the aiming point is adsorbed onto the central axis. Of course, if the target is moving, rotating the visual angle by the calculated angle alone will not land the aiming point on the central axis, so the angle is recalculated as the target moves until the aiming point is adsorbed onto it. This calculation is only an example; the rotation direction may also be determined from the positions without calculating the angle, which is not limited in the embodiment of the present application.
As shown in fig. 10, let O be the muzzle position (aiming point position), OA the direction the gun points, that is, the visual angle direction, and OB the direction from the muzzle toward the target central axis. When magnetic adsorption is generated, if the target is not moving, the visual angle rotates from OA to OB through the angle C, and the magnetic adsorption stops once OB is reached.
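The angle C of fig. 10 is simply the angle between the view direction OA and the muzzle-to-axis direction OB, recoverable from a dot product (the helper and its argument names are illustrative):

```python
import math

def rotation_angle(muzzle, view_dir, target_axis_point):
    """Angle C, in radians, between the current view direction OA
    (`view_dir`) and the direction OB from the muzzle O toward a point on
    the target's central axis."""
    ob = tuple(t - o for o, t in zip(muzzle, target_axis_point))
    dot = sum(a * b for a, b in zip(view_dir, ob))
    norm_a = math.sqrt(sum(a * a for a in view_dir))
    norm_b = math.sqrt(sum(b * b for b in ob))
    # Clamp before acos to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))
```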
Step five: when magnetic adsorption is generated, the current auxiliary adsorption force is also calculated; the magnitudes of the two forces are compared and the larger force is taken.
As for the damping force, shown in fig. 11, only the damping force and the visual angle adjustment force are described here by way of example; the adsorption force may be superimposed on the calculation as in the example of fig. 10, which is not repeated. The damping force may be referred to as damping adsorption, and the damping adsorption process shown in fig. 11 is as follows:
Step one: detect whether the player has a screen input operation. Damping adsorption takes effect only while the player is operating, so the next detection is performed after the player touches the screen.
Step two: judge whether the current player state allows adsorption. The player cannot generate damping adsorption while falling, dying, or under a smoke bomb; damping adsorption may be generated when the player is not in these states.
Step three: when the player drags the mouse in a direction away from the target, a reaction force is generated.
The mouse sliding offset corresponds to the visual angle adjustment force, and the reaction force is the damping force; the difference between the two is calculated. If the reaction force is greater than the mouse offset, that is, the damping force is greater than the visual angle adjustment force, the aiming point cannot be moved away from the target, and the mouse drag has no effect.
Step four: damping adsorption ends when the player's finger leaves the screen or the player can no longer drag.
In the embodiment of the application, the motion state of the target virtual object is taken into account. When the conditions for providing the auxiliary aiming service are met, the forces borne by the visual angle, including the adsorption force and the visual angle adjustment force, can be acquired according to that motion state; the target rotation speed of the visual angle can then be determined from these forces, and the display of the virtual scene determined accordingly. Good auxiliary aiming can thus be provided even when the virtual object is moving, so that the display of the virtual scene meets the user's expectations and needs, with a good display effect.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 12 is a schematic structural diagram of a virtual scene display apparatus according to an embodiment of the present application, and referring to fig. 12, the apparatus includes:
an obtaining module 1201, configured to, when a viewing angle adjustment operation is detected and an aiming point is located in an adsorption area of a target virtual object, obtain, according to a motion state of the target virtual object, an adsorption force borne by the viewing angle and a viewing angle adjustment force corresponding to the viewing angle adjustment operation, where the adsorption force is used to move the aiming point toward the target virtual object;
the obtaining module 1201 is further configured to obtain a target rotation speed of a visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force;
and a display module 1202, configured to display a virtual scene that changes with the rotation of the angle of view in the process of controlling the angle of view to rotate according to the target rotation speed.
In one possible implementation, the obtaining module 1201 is configured to:
when the visual angle adjusting operation is detected, acquiring an adsorption area of the target virtual object;
emitting rays from the position of the aiming point along the current visual angle;
when the ray passes through the adsorption area, acquiring, according to the motion state of the target virtual object, the adsorption force borne by the visual angle and the visual angle adjusting force corresponding to the visual angle adjusting operation.
In one possible implementation, the obtaining module 1201 is configured to:
when the target virtual object moves, acquiring a first adsorption force and a second adsorption force borne by the visual angle, wherein the first adsorption force is used for controlling the aiming point to follow the target virtual object, and the second adsorption force is used for assisting the aiming point to move towards the target virtual object;
and acquiring the first adsorption force or the second adsorption force as the adsorption force borne by the visual angle.
In a possible implementation manner, the obtaining module 1201 is configured to obtain, as the adsorption force applied to the viewing angle, an adsorption force with a largest value of the first adsorption force and the second adsorption force.
In a possible implementation manner, the obtaining module 1201 is configured to obtain, when the target virtual object is stationary, a second absorption force applied to the viewing angle as an absorption force applied to the viewing angle, where the second absorption force is used to assist the aiming point to move towards the target virtual object.
In a possible implementation manner, the obtaining module 1201 is further configured to:
when the visual angle adjusting operation is used for keeping the aiming point away from the target virtual object, acquiring a force opposite to the visual angle adjusting operation direction as a damping force borne by the visual angle;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the damping force of the visual angle and the visual angle adjusting force.
In one possible implementation, the obtaining module 1201 is configured to perform any one of the following:
when the damping force borne by the visual angle is larger than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene;
and when the resultant force of the damping force borne by the visual angle and the adsorption force is greater than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene.
In a possible implementation manner, the obtaining module 1201 is further configured to:
when the end of the visual angle adjusting operation is detected and the aiming point is positioned in the adsorption area of the target virtual object, if the target virtual object moves, the adsorption force borne by the visual angle is acquired;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force borne by the visual angle.
In a possible implementation manner, the obtaining module 1201 is configured to execute, when the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, the step of acquiring the adsorption force borne by the visual angle and the visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in a first state.
In a possible implementation manner, the obtaining module 1201 is further configured to:
when the visual angle adjusting operation is detected and the aiming point is positioned in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in a second state, acquire the visual angle adjusting force corresponding to the visual angle adjusting operation;
and acquiring the target rotation speed of the visual angle of the virtual scene according to the visual angle adjusting force.
The apparatus provided in the embodiment of the application takes the motion state of the target virtual object into account. When the conditions for providing the auxiliary aiming service are met, the forces borne by the visual angle, including the adsorption force and the visual angle adjustment force, can be acquired according to that motion state; the target rotation speed of the visual angle can then be determined from these forces, and the display of the virtual scene determined accordingly. Good auxiliary aiming can thus be provided even when the virtual object is moving, so that the display of the virtual scene meets the user's expectations and needs, with a good display effect.
It should be noted that: in the virtual scene display apparatus provided in the foregoing embodiment, when displaying a virtual scene, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the electronic device may be divided into different functional modules to complete all or part of the functions described above. In addition, the virtual scene display apparatus provided in the above embodiments and the virtual scene display method embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
Fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application, where the electronic device 1300 may generate a relatively large difference due to a difference in configuration or performance, and may include one or more processors (CPUs) 1301 and one or more memories 1302, where at least one program code is stored in the one or more memories 1302, and the at least one program code is loaded and executed by the one or more processors 1301 to implement the virtual scene display method provided by each of the method embodiments. Certainly, the electronic device 1300 may further include components such as a wired or wireless network interface, a keyboard, and an input/output interface, so as to perform input and output, and the electronic device 1300 may further include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, there is also provided a computer-readable storage medium, such as a memory, including program code executable by a processor to perform the virtual scene display method in the above embodiments. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (8)

1. A method for displaying a virtual scene, the method comprising:
when a viewing angle adjustment operation is detected and an aiming point is located in an adsorption area of a target virtual object, acquiring, according to a motion state of the target virtual object, an adsorption force applied to a viewing angle and a viewing angle adjustment force corresponding to the viewing angle adjustment operation, wherein when the target virtual object is moving, the adsorption force is the larger of a first adsorption force and a second adsorption force; when the target virtual object is stationary, the adsorption force is the second adsorption force; the first adsorption force is used to control the aiming point to follow the target virtual object, and the second adsorption force is used to assist the aiming point in moving toward the target virtual object;
when the viewing angle adjustment operation moves the aiming point toward the target virtual object, acquiring a target rotation speed of the viewing angle of a virtual scene according to the adsorption force and the viewing angle adjustment force;
when the viewing angle adjustment operation moves the aiming point away from the target virtual object, acquiring the target rotation speed of the viewing angle of the virtual scene according to the adsorption force, a damping force, and the viewing angle adjustment force applied to the viewing angle, wherein the damping force is a force opposite in direction to the viewing angle adjustment operation; and
displaying the virtual scene as it changes with the rotation of the viewing angle while the viewing angle is controlled to rotate at the target rotation speed;
wherein the acquiring the target rotation speed of the viewing angle of the virtual scene according to the adsorption force, the damping force, and the viewing angle adjustment force applied to the viewing angle comprises any one of the following:
when the damping force applied to the viewing angle is greater than the viewing angle adjustment force, acquiring zero as the target rotation speed of the viewing angle of the virtual scene; and
when the resultant of the damping force and the adsorption force applied to the viewing angle is greater than the viewing angle adjustment force, acquiring zero as the target rotation speed of the viewing angle of the virtual scene.
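The force model of claim 1 can be illustrated with a minimal scalar sketch. All function and parameter names here are assumptions for illustration, as is the treatment of forces as scalars and of "according to" as simple addition and subtraction with a gain; the patent does not disclose a concrete formula.

```python
def adsorption_force(target_moving: bool, follow_force: float, assist_force: float) -> float:
    """Adsorption force per claim 1: the larger of the first (follow) and
    second (assist) adsorption forces while the target moves; otherwise
    only the second (assist) force."""
    if target_moving:
        return max(follow_force, assist_force)
    return assist_force

def target_rotation_speed(adjust_force: float, adsorb: float, damping: float,
                          toward_target: bool, gain: float = 1.0) -> float:
    """Target rotation speed of the viewing angle (scalar sketch).

    Toward the target: adsorption and the player's adjustment act together.
    Away from the target: the damping force (opposing the operation) and the
    adsorption force resist the adjustment; per the claim's two alternative
    conditions, the speed is clamped to zero if damping alone, or damping
    plus adsorption, outweighs the adjustment force.
    """
    if toward_target:
        return gain * (adjust_force + adsorb)
    resisting = damping + adsorb
    if damping > adjust_force or resisting > adjust_force:
        return 0.0
    return gain * (adjust_force - resisting)

# A moving target: follow force 3.0 vs assist force 2.0 -> adsorption is 3.0.
f = adsorption_force(True, 3.0, 2.0)
speed_toward = target_rotation_speed(adjust_force=5.0, adsorb=f, damping=0.0, toward_target=True)
speed_away = target_rotation_speed(adjust_force=5.0, adsorb=f, damping=4.0, toward_target=False)
```

With these numbers, dragging away from the target yields zero rotation speed (damping 4.0 plus adsorption 3.0 exceeds the adjustment force 5.0), which is the "sticky aim" behavior the claim describes.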
2. The method according to claim 1, wherein the acquiring the adsorption force applied to the viewing angle and the viewing angle adjustment force corresponding to the viewing angle adjustment operation comprises:
acquiring the adsorption area of the target virtual object;
emitting a ray from the position of the aiming point along the current viewing angle; and
when the ray passes through the adsorption area, acquiring the adsorption force applied to the aiming point and the viewing angle adjustment force corresponding to the viewing angle adjustment operation.
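The ray test of claim 2 can be sketched as a standard ray-sphere intersection; modeling the adsorption area as a sphere is an assumption for illustration (the patent does not fix its shape), and all names are hypothetical.

```python
import math

def ray_hits_adsorption_area(origin, direction, center, radius):
    """Does a ray emitted from the aiming point along the current viewing
    angle pass through a (spherical, by assumption) adsorption area?

    Projects the area's center onto the ray and compares the perpendicular
    distance with the radius.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    # Vector from the ray origin to the area's center.
    cx, cy, cz = center[0] - ox, center[1] - oy, center[2] - oz
    t = cx * dx + cy * dy + cz * dz          # projection length along the ray
    if t < 0:                                # area lies behind the aiming point
        return False
    # Squared perpendicular distance from the center to the ray.
    dist_sq = (cx * cx + cy * cy + cz * cz) - t * t
    return dist_sq <= radius * radius

# Aiming straight down +x; the target's adsorption area is 10 units ahead,
# offset 0.5 to the side, radius 1.0 -> the ray passes through it.
hit = ray_hits_adsorption_area((0, 0, 0), (1, 0, 0), (10, 0.5, 0), 1.0)
```

Only when this test reports a hit would the method go on to compute the adsorption and adjustment forces.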
3. The method of claim 1, further comprising:
when the end of the viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of the target virtual object, acquiring the adsorption force applied to the viewing angle if the target virtual object is moving; and
acquiring the target rotation speed of the viewing angle of the virtual scene according to the adsorption force applied to the viewing angle.
4. The method according to claim 1, wherein the acquiring, when the viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of the target virtual object, the adsorption force applied to the viewing angle and the viewing angle adjustment force corresponding to the viewing angle adjustment operation according to the motion state of the target virtual object comprises:
when the viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which the currently controlled virtual object is located, is in a first state, performing the step of acquiring the adsorption force applied to the viewing angle and the viewing angle adjustment force corresponding to the viewing angle adjustment operation according to the motion state of the target virtual object.
5. The method of claim 1, further comprising:
when the viewing angle adjustment operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which the currently controlled virtual object is located, is in a second state, acquiring the viewing angle adjustment force corresponding to the viewing angle adjustment operation; and
acquiring the target rotation speed of the viewing angle of the virtual scene according to the viewing angle adjustment force.
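Claims 4 and 5 gate the aim-assist path on the controlled object's state: one state enables the adsorption force, the other bypasses it. A minimal sketch, assuming the two states are modeled as an enum and reusing the additive scalar treatment from before (both assumptions, not the patent's implementation):

```python
from enum import Enum

class AssistGate(Enum):
    FIRST_STATE = 1   # aim assist active: adsorption force participates
    SECOND_STATE = 2  # aim assist bypassed: only the adjustment force is used

def rotation_speed_for_state(state: AssistGate, adjust_force: float,
                             adsorb: float, gain: float = 1.0) -> float:
    """Per claims 4-5: in the first state the target rotation speed is
    acquired from both the adsorption force and the viewing angle
    adjustment force; in the second state, from the adjustment force alone."""
    if state is AssistGate.FIRST_STATE:
        return gain * (adjust_force + adsorb)
    return gain * adjust_force
```

This kind of gating lets a game disable magnetism when, for example, the controlled object is in a state (or a scene) where assisted aiming would be unwanted.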
6. An apparatus for displaying a virtual scene, the apparatus comprising:
an acquisition module, configured to acquire, when a viewing angle adjustment operation is detected and an aiming point is located in an adsorption area of a target virtual object, an adsorption force applied to a viewing angle and a viewing angle adjustment force corresponding to the viewing angle adjustment operation according to a motion state of the target virtual object, wherein when the target virtual object is moving, the adsorption force is the larger of a first adsorption force and a second adsorption force; when the target virtual object is stationary, the adsorption force is the second adsorption force; the first adsorption force is used to control the aiming point to follow the target virtual object, and the second adsorption force is used to assist the aiming point in moving toward the target virtual object;
the acquisition module being further configured to acquire a target rotation speed of the viewing angle of a virtual scene according to the adsorption force and the viewing angle adjustment force when the viewing angle adjustment operation moves the aiming point toward the target virtual object;
the acquisition module being further configured to acquire the target rotation speed of the viewing angle of the virtual scene according to the adsorption force, a damping force, and the viewing angle adjustment force applied to the viewing angle when the viewing angle adjustment operation moves the aiming point away from the target virtual object, wherein the damping force is a force opposite in direction to the viewing angle adjustment operation; and
a display module, configured to display the virtual scene as it changes with the rotation of the viewing angle while the viewing angle is controlled to rotate at the target rotation speed;
wherein the acquiring the target rotation speed of the viewing angle of the virtual scene according to the adsorption force, the damping force, and the viewing angle adjustment force applied to the viewing angle comprises any one of the following:
when the damping force applied to the viewing angle is greater than the viewing angle adjustment force, acquiring zero as the target rotation speed of the viewing angle of the virtual scene; and
when the resultant of the damping force and the adsorption force applied to the viewing angle is greater than the viewing angle adjustment force, acquiring zero as the target rotation speed of the viewing angle of the virtual scene.
7. An electronic device, comprising one or more processors and one or more memories storing at least one piece of program code, the program code being loaded and executed by the one or more processors to perform the operations of the virtual scene display method according to any one of claims 1 to 5.
8. A computer-readable storage medium storing at least one piece of program code, the program code being loaded and executed by a processor to perform the operations of the virtual scene display method according to any one of claims 1 to 5.
CN201910992462.9A 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium Active CN110732135B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910992462.9A CN110732135B (en) 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110732135A (en) 2020-01-31
CN110732135B (en) 2022-03-08

Family

ID=69269255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910992462.9A Active CN110732135B (en) 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110732135B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111408132B (en) * 2020-02-17 2023-04-07 网易(杭州)网络有限公司 Game picture display method, device, equipment and storage medium
CN111589132A (en) * 2020-04-26 2020-08-28 腾讯科技(深圳)有限公司 Virtual item display method, computer equipment and storage medium
CN111784844B (en) * 2020-06-09 2024-01-05 北京五一视界数字孪生科技股份有限公司 Method and device for observing virtual object, storage medium and electronic equipment
CN111888762A (en) * 2020-08-13 2020-11-06 网易(杭州)网络有限公司 Method for adjusting visual angle of lens in game and electronic equipment
CN113144593A (en) * 2021-03-19 2021-07-23 网易(杭州)网络有限公司 Target aiming method and device in game, electronic equipment and storage medium
CN113633976B (en) * 2021-08-16 2023-06-20 腾讯科技(深圳)有限公司 Operation control method, device, equipment and computer readable storage medium
CN117170504B (en) * 2023-11-01 2024-01-19 南京维赛客网络科技有限公司 Method, system and storage medium for viewing with person in virtual character interaction scene

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1086729A2 (en) * 1999-09-24 2001-03-28 Konami Corporation Shooting video game system and image displaying method in shooting video game
CN107913515A (en) * 2017-10-25 2018-04-17 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN108415639A (en) * 2018-02-09 2018-08-17 腾讯科技(深圳)有限公司 Visual angle regulating method, device, electronic device and computer readable storage medium
CN109847336A (en) * 2019-02-26 2019-06-07 腾讯科技(深圳)有限公司 Virtual scene display method, apparatus, electronic equipment and storage medium
CN110147159A (en) * 2017-09-21 2019-08-20 腾讯科技(深圳)有限公司 Object localization method, device and electronic equipment in virtual interacting scene


Also Published As

Publication number Publication date
CN110732135A (en) 2020-01-31

Similar Documents

Publication Publication Date Title
CN110732135B (en) Virtual scene display method and device, electronic equipment and storage medium
CN113181650B (en) Control method, device, equipment and storage medium for calling object in virtual scene
CN111714886B (en) Virtual object control method, device, equipment and storage medium
WO2022105474A1 (en) State switching method and apparatus in virtual scene, device, medium, and program product
CN110507990B (en) Interaction method, device, terminal and storage medium based on virtual aircraft
CN111399639B (en) Method, device and equipment for controlling motion state in virtual environment and readable medium
US20230013014A1 (en) Method and apparatus for using virtual throwing prop, terminal, and storage medium
WO2022242400A1 (en) Method and apparatus for releasing skills of virtual object, device, medium, and program product
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
CN111921198B (en) Control method, device and equipment of virtual prop and computer readable storage medium
US20230072503A1 (en) Display method and apparatus for virtual vehicle, device, and storage medium
US20230405466A1 (en) Method and apparatus for operating virtual prop in virtual environment, device and readable medium
CN110882545A (en) Virtual object control method and device, electronic equipment and storage medium
US20230033530A1 (en) Method and apparatus for acquiring position in virtual scene, device, medium and program product
US20230364502A1 (en) Method and apparatus for controlling front sight in virtual scenario, electronic device, and storage medium
CN113633964A (en) Virtual skill control method, device, equipment and computer readable storage medium
CN110585706A (en) Interactive property control method, device, terminal and storage medium
TWI803147B (en) Virtual object control method, device, apparatus, storage medium, and program product thereof
WO2022007567A1 (en) Virtual resource display method and related device
CN114432701A (en) Ray display method, device and equipment based on virtual scene and storage medium
CN112138385B (en) Virtual shooting prop aiming method and device, electronic equipment and storage medium
CN111265876B (en) Method, device, equipment and storage medium for using props in virtual environment
WO2023071808A1 (en) Virtual scene-based graphic display method and apparatus, device, and medium
CN112121433B (en) Virtual prop processing method, device, equipment and computer readable storage medium
CN112138392B (en) Virtual object control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (ref country code: HK; ref legal event code: DE; ref document number: 40021454)
GR01 Patent grant