CN118512766A - Virtual prop display method, device, equipment, medium and product - Google Patents
- Publication number
- CN118512766A (application number CN202310178330.9A)
- Authority
- CN
- China
- Prior art keywords
- prop
- emission
- filling
- virtual
- props
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
- A63F13/5375—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
- A63F2300/646—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car for calculating the trajectory of an object
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8076—Shooting
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present application discloses a virtual prop display method, apparatus, device, medium and product, relating to the technical field of virtual worlds. The method includes: displaying a virtual scene picture; in response to receiving a continuous launch operation, displaying prop launch animations corresponding to n filling props launched in sequence, where n is a positive integer; determining a candidate launch trajectory corresponding to the (n+1)-th filling prop based on the continuous launch operation; when the angles between the launch angles of the prop launch trajectories respectively corresponding to the n filling props and the launch angle of the candidate launch trajectory satisfy a launch angle condition, determining the candidate launch trajectory as the target launch trajectory corresponding to the (n+1)-th filling prop; and displaying a prop launch animation of the (n+1)-th filling prop launched along the target launch trajectory. That is, the target launch trajectory of the (n+1)-th filling prop is determined from the prop launch trajectories respectively corresponding to the first n filling props, which improves prop launch accuracy.
Description
Technical Field
The application relates to the technical field of virtual worlds, in particular to a method, a device, equipment, a medium and a product for displaying virtual props.
Background
With the rapid development of computer technology and the increasing diversity of terminal functions, electronic games that run on terminals have become widespread, and shooting games are among the most popular. The terminal displays a virtual scene in which virtual objects are shown, and the player controls a virtual object to fight against other virtual objects.
In the related art, when a virtual object is controlled to launch filling props through a virtual launching prop during combat, the prop launch trajectory of each filling prop is randomly generated from the current launch angle of the virtual launching prop, so that the filling prop flies along that trajectory after being launched.
However, because the virtual launching prop produces recoil when launching a filling prop, the launch trajectories drift during continuous launching of multiple filling props, so the final landing point of a filling prop may end up far from the landing points of the others. As a result, the launch accuracy of the virtual prop is low and the effect is unrealistic.
Disclosure of Invention
The embodiments of the present application provide a virtual prop display method, apparatus, device, medium and product, which can improve the realism of virtual prop launching. The technical solution is as follows.
According to an aspect of the present application, there is provided a method for displaying a virtual prop, the method comprising:
displaying a virtual scene picture, where the virtual scene corresponding to the virtual scene picture includes a master virtual object, the master virtual object holds a virtual launching prop, and a plurality of filling props are loaded in the virtual launching prop;
in response to receiving a continuous launch operation, displaying prop launch animations respectively corresponding to n filling props launched in sequence, where the continuous launch operation instructs the virtual launching prop to launch the filling props continuously, the n filling props respectively correspond to prop launch trajectories, and n is a positive integer;
determining a candidate launch trajectory corresponding to the (n+1)-th filling prop based on the continuous launch operation;
when the angles between the launch angles of the prop launch trajectories respectively corresponding to the n filling props and the launch angle of the candidate launch trajectory satisfy a launch angle condition, determining the candidate launch trajectory as the target launch trajectory corresponding to the (n+1)-th filling prop; and
displaying a prop launch animation of the (n+1)-th filling prop launched along the target launch trajectory.
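The angle condition in the steps above can be illustrated with a small sketch. This is not code from the patent: it assumes each launch trajectory is reduced to its initial direction vector and that the "launch angle condition" is a simple threshold on the pairwise angle; the helper names and the 5-degree default are illustrative only.

```python
import math

def angle_deg(u, v):
    # Angle in degrees between two 3D direction vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def meets_angle_condition(prev_dirs, candidate_dir, max_angle_deg=5.0):
    """Accept the candidate trajectory for the (n+1)-th filling prop only if
    its launch direction deviates from each of the previous n launch
    directions by no more than the threshold."""
    return all(angle_deg(d, candidate_dir) <= max_angle_deg for d in prev_dirs)
```

Under this check, a candidate drawn near the earlier trajectories is kept as the target launch trajectory, while an outlier would be rejected so a new candidate can be determined.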
According to another aspect of the present application, there is provided a virtual prop display apparatus, the apparatus including:
a display module, configured to display a virtual scene picture, where the virtual scene corresponding to the virtual scene picture includes a master virtual object, the master virtual object holds a virtual launching prop, and a plurality of filling props are loaded in the virtual launching prop;
the display module being further configured to display, in response to receiving a continuous launch operation, prop launch animations respectively corresponding to n filling props launched in sequence, where the continuous launch operation instructs the virtual launching prop to launch the filling props continuously, the n filling props respectively correspond to prop launch trajectories, and n is a positive integer;
a determining module, configured to determine a candidate launch trajectory corresponding to the (n+1)-th filling prop based on the continuous launch operation;
the determining module being further configured to determine, when the angles between the launch angles of the prop launch trajectories respectively corresponding to the n filling props and the launch angle of the candidate launch trajectory satisfy a launch angle condition, the candidate launch trajectory as the target launch trajectory corresponding to the (n+1)-th filling prop; and
the display module being further configured to display a prop launch animation of the (n+1)-th filling prop launched along the target launch trajectory.
According to another aspect of the present application, there is provided a computer device including a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement a method for displaying a virtual prop according to any one of the embodiments of the present application.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, loaded and executed by a processor to implement a method for displaying a virtual prop according to any of the embodiments of the present application described above.
According to another aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method for displaying a virtual prop according to any of the above embodiments.
The technical scheme provided by the embodiment of the application has the beneficial effects that at least:
While a master virtual object holding a virtual launching prop is displayed in the virtual scene picture, after a continuous launch operation is received, prop launch animations respectively corresponding to the first n filling props in the virtual launching prop are displayed. A candidate launch trajectory for the (n+1)-th filling prop is then determined from the continuous launch operation, and when the angles between the launch angles of the prop launch trajectories of the n filling props and the launch angle of the candidate launch trajectory satisfy the launch angle condition, the candidate launch trajectory is determined as the target launch trajectory of the (n+1)-th filling prop, and a prop launch animation of the (n+1)-th filling prop launched along that trajectory is displayed. Because the target launch trajectory of the (n+1)-th filling prop is constrained by the launch angles of the first n filling props, no filling prop from the (n+1)-th onward can stray far from the trajectories of the props launched before it, which improves both the accuracy and the realism of prop launching.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 is a block diagram of an electronic device provided in an exemplary embodiment of the application;
FIG. 3 is a flow chart of a method for displaying virtual props provided by an exemplary embodiment of the present application;
FIG. 4 is a flow chart of a method for displaying virtual props provided by another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a method for determining a target emission trajectory according to an exemplary embodiment of the present application;
FIG. 6 is a flow chart of a method for displaying virtual props provided by an exemplary embodiment of the present application;
FIG. 7 is a schematic view of scattering angle parameters provided by another exemplary embodiment of the present application;
FIG. 8 is a schematic view of a reference landing point region provided by another exemplary embodiment of the present application;
FIG. 9 is a flow chart of a method for displaying virtual props provided by an exemplary embodiment of the present application;
- FIG. 10 is a flowchart of a recoil force accumulation phase provided by an exemplary embodiment of the application;
FIG. 11 is a flowchart of a recoil recovery phase provided by an exemplary embodiment of the present application;
FIG. 12 is a flow chart of a human body posture update provided by another exemplary embodiment of the present application;
FIG. 13 is a schematic illustration of firing results provided by an exemplary embodiment of the present application;
FIG. 14 is a schematic illustration of firing results provided by an exemplary embodiment of the present application;
FIG. 15 is a schematic diagram of launch trajectory and prop launch animation consistency provided by an exemplary embodiment of the present application;
FIG. 16 is a block diagram of a display device for a virtual prop provided by an exemplary embodiment of the present application;
FIG. 17 is a block diagram of a display device for a virtual prop provided in accordance with another exemplary embodiment of the present application;
fig. 18 is a block diagram of a terminal structure according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in this disclosure to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first parameter may also be referred to as a second parameter, and similarly, a second parameter may also be referred to as a first parameter, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
First, the implementation environment of the present application will be described. FIG. 1 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application. The implementation environment includes: a terminal 110, a server 120, and a communication network 130, where the terminal 110 and the server 120 are connected through the communication network 130.
A target application 111 is installed and runs in the terminal 110, where the target application 111 is an application supporting a two-dimensional or three-dimensional virtual environment. The target application 111 may be any one of a virtual reality application, a three-dimensional map application, an auto-chess game, a strategy game, a third-person shooting game (TPS), a first-person shooting game (FPS), a multiplayer online battle arena (MOBA) game, and a multiplayer gunfight survival game. In one implementation, the target application 111 may be a stand-alone application, such as a stand-alone three-dimensional game, or a network online application.
Optionally, when the target application is implemented as a stand-alone application, a master virtual object in the virtual scene is displayed in the running interface of the target application. The master virtual object is a virtual object controlled by the terminal and is equipped with a virtual launching prop in which a plurality of filling props are loaded.
When the terminal receives a continuous launch operation, it displays prop launch animations respectively corresponding to n filling props launched in sequence, where n is a positive integer and the n filling props respectively correspond to prop launch trajectories. A candidate launch trajectory for the (n+1)-th filling prop is determined from the continuous launch operation; when the angles between the prop launch trajectories respectively corresponding to the n filling props and the candidate launch trajectory satisfy the launch angle condition, the candidate launch trajectory is determined as the target launch trajectory of the (n+1)-th filling prop, and a prop launch animation of the (n+1)-th filling prop launched along the target launch trajectory is displayed.
Alternatively, when the target application is implemented as a network online application, as shown in FIG. 1, the target application 111 is implemented as a battle game (e.g., a TPS or FPS game). While the target application 111 runs, the terminal 110 displays a master virtual object in the virtual scene, where the master virtual object is controlled by the terminal and equipped with a virtual launching prop loaded with a plurality of filling props. When the terminal 110 receives a continuous launch operation, it generates a prop launch request and sends it to the server 120, where the request asks for a plurality of filling props to be launched continuously.
After receiving the continuous launch request, the server 120 determines the prop launch animations respectively corresponding to the n filling props and feeds them back to the terminal 110 in sequence as prop launch results; the terminal 110 then displays the prop launch animations of the n filling props according to the prop launch results, where the n filling props respectively correspond to prop launch trajectories.
The server 120 also determines a candidate launch trajectory for the (n+1)-th filling prop from the continuous launch operation. When the angles between the launch angles of the prop launch trajectories respectively corresponding to the n filling props and the launch angle of the candidate launch trajectory satisfy the launch angle condition, the candidate launch trajectory is determined as the target launch trajectory of the (n+1)-th filling prop. A prop launch animation of the (n+1)-th filling prop launched along the target launch trajectory is generated and fed back to the terminal 110 for display as a prop launch result.
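The server-side flow above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the spread model (a random yaw around the aim direction), the fallback to the aim direction when the angle condition fails, and all names and thresholds are assumptions.

```python
import math
import random

def angle_deg(u, v):
    # Angle in degrees between two 3D direction vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def handle_burst_request(aim_dir, rounds, spread_deg=8.0, max_angle_deg=5.0):
    """For each round in the burst, draw a random candidate direction around
    the aim; keep it only if it stays within the angle threshold of every
    previously accepted trajectory, otherwise fall back to the aim direction."""
    trajectories = []
    for _ in range(rounds):
        yaw = math.radians(random.uniform(-spread_deg, spread_deg))
        candidate = (math.sin(yaw), 0.0, math.cos(yaw))
        if all(angle_deg(candidate, d) <= max_angle_deg for d in trajectories):
            trajectories.append(candidate)
        else:
            trajectories.append(aim_dir)
    return trajectories
```

Each returned direction would then drive one prop launch animation fed back to the terminal for display.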
Optionally, the terminal 110 may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a smart television, a smart vehicle, or a terminal device in another form, which is not limited in the embodiments of the present application.
The server 120 includes at least one of a single server, a cluster of servers, a cloud computing platform, and a virtualization center. Optionally, the server 120 undertakes the primary computing work and the terminal 110 the secondary computing work; or the server 120 undertakes the secondary computing work and the terminal 110 the primary computing work; or the server 120 and the terminal 110 perform collaborative computing using a distributed computing architecture.
It should be noted that the server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (Content Delivery Network, CDN), and basic cloud computing services such as big data and an artificial intelligence platform.
Cloud technology refers to a hosting technology that unifies a series of resources, such as hardware, software, and networks, in a wide area network or a local area network to realize the computation, storage, processing, and sharing of data.
In some embodiments, the servers described above may also be implemented as nodes in a blockchain system.
It should be noted that, the information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals related to the present application are all authorized by the user or are fully authorized by the parties, and the collection, use, and processing of the related data is required to comply with the relevant laws and regulations and standards of the relevant countries and regions. For example, prop burst operations involved in the present application are acquired with sufficient authorization.
Fig. 2 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 200 includes: an operating system 220 and application programs 222.
Operating system 220 is the underlying software that provides applications 222 with secure access to computer hardware.
The application 222 is an application supporting a virtual environment. Optionally, the application 222 supports a three-dimensional virtual environment. The application 222 may be any one of a virtual reality application, a three-dimensional map application, a third-person shooting game (TPS), a first-person shooting game (FPS), a multiplayer online battle arena (MOBA) game, a multiplayer gunfight survival game, an educational game, and a strategy game. The application 222 may be a stand-alone application, such as a stand-alone game, or a network online application.
With reference to the above description, the virtual prop display method provided by the present application is described below. The method may be executed by a server or a terminal, or jointly by the server and the terminal.
Step 310, displaying the virtual scene picture.
The virtual scene corresponding to the virtual scene picture comprises a main control virtual object, wherein the main control virtual object holds a virtual transmitting prop, and a plurality of filling props are assembled in the virtual transmitting prop.
Illustratively, the terminal runs a target application, which may be any one of a virtual reality application, a three-dimensional map program, an auto-chess game, a strategy game, a third-person shooting game (TPS), a first-person shooting game (FPS), a multiplayer online battle arena (MOBA) game, and a multiplayer gunfight survival game.
In some embodiments, the virtual scene is a display scene in a running interface corresponding to a target application running in the terminal.
Illustratively, the master virtual object refers to the virtual object controlled by the target account logged in on the current terminal.
Optionally, the virtual scene picture is a picture obtained by observing the virtual scene from the first-person perspective of the master virtual object; or the virtual scene picture is a picture obtained by observing the virtual scene from a third-person perspective of the master virtual object.
In some embodiments, the virtual launching prop is a prop having a prop launch function, such as at least one of a virtual firearm, a virtual bow, a virtual catapult, a virtual artillery piece, a virtual tank, and the like. A filling prop is a prop that is launched by the virtual launching prop and affects other virtual objects in the virtual scene. For example, when the virtual launching prop is implemented as a submachine gun, the filling props are implemented as submachine-gun bullets; as another example, when the virtual launching prop is implemented as a virtual cannon, the filling prop is implemented as a projectile.
The prop emission function includes single emission and continuous emission. Single emission means that, each time a prop emission operation is received, the virtual emission prop emits only one filling prop; for example, when the virtual emission prop is a sniper rifle, the sniper rifle fires one bullet after receiving a prop emission operation. Continuous emission means that, when a prop emission operation is received, the virtual emission prop emits a specified number of filling props in succession, and does not stop until an emission cancellation operation is received or the specified number of filling props has been emitted. For example, when the virtual emission prop is implemented as a burst-fire rifle that fires three bullets per burst, and five bullets are loaded in it, then after a prop emission operation is received and no emission cancellation operation follows, the rifle ends the emission process after firing three bullets in succession. For another example, with the same burst-fire rifle but only two bullets loaded, the rifle ends the emission process immediately after firing those two bullets in succession, again without an emission cancellation operation being received.
In another possible case, continuous emission of the virtual emission prop means that emission stops only after all of the loaded filling props have been emitted, provided no emission cancellation operation is received.
The emission cancellation operation includes at least one of: the user actively triggering a cancel-emission control; the user triggering another state control that interrupts the emission process (such as a run control or a prone control); and an attack operation by another virtual object in the virtual scene interrupting the emission process.
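The burst rule described above can be sketched as follows. This is a minimal illustration with hypothetical names (`fire_burst`, `cancelled`) that are assumptions for the sketch, not identifiers from the application:

```python
# Minimal sketch of the continuous-emission rule: a burst emits up to
# `burst_size` filling props, ending early when the magazine runs out or a
# cancellation operation (run/prone control, incoming attack) arrives.
# All names are illustrative.

def fire_burst(loaded: int, burst_size: int, cancelled=lambda shot: False) -> int:
    """Return how many filling props are actually emitted in one burst."""
    fired = 0
    while fired < burst_size and fired < loaded:
        if cancelled(fired):      # emission cancellation interrupts the burst
            break
        fired += 1                # emit one filling prop
    return fired

# Five bullets loaded, three-round burst: all three are fired.
print(fire_burst(5, 3))
# Only two bullets loaded: the burst ends after two.
print(fire_burst(2, 3))
```

The same function also covers single emission as the special case `burst_size=1`.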
Optionally, the virtual launching prop is a launching prop currently held by the master virtual object; or the virtual launch prop is a launch prop in a backpack owned by the master virtual object.
Optionally, the plurality of filling props assembled in the virtual launch prop belong to the same type of prop; or the plurality of filling props assembled in the virtual launch prop belong to different types of props.
Step 320, in response to receiving the prop continuous transmitting operation, displaying n prop transmitting animations corresponding to the n filling props after being sequentially transmitted.
The prop continuous emission operation is used to instruct the virtual emission prop to continuously emit filling props; the n filling props respectively correspond to prop emission tracks, and n is a positive integer.
Illustratively, the prop continuous emission operation refers to an emission operation, among prop emission operations, that instructs the virtual emission prop to continuously emit a plurality of filling props.
In some embodiments, after receiving a prop continuous transmission operation on the virtual transmission props, the n filling props are sequentially transmitted according to the arrangement sequence in the virtual transmission props, so that prop transmitting animations respectively corresponding to the n filling props are displayed.
Illustratively, the prop launching trajectory refers to a flight trajectory between when a filling prop is launched from a virtual launching prop and when the filling prop reaches a landing point.
Optionally, the prop emission track is a track randomly generated by the development engine according to prop configuration parameters of the virtual emission prop; or the prop launching track is a fixed track which is directly set by a development engine in the development process.
The prop configuration parameters of the virtual emission prop include at least one of parameters such as the orientation of the emission port during prop emission, the crosshair position of the virtual emission prop, the emission range of the virtual emission prop, the emission speed of the virtual emission prop, the recoil of the virtual emission prop, and the scattering value of the virtual emission prop.
Optionally, the prop launching track is a straight track, i.e., the filling prop flies along a straight line after being launched from the virtual launching prop; or the prop launching track is a curved track, i.e. the filled prop flies along a parabola after being launched from the virtual launching prop.
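The two trajectory shapes above can be sketched as follows. `straight_point`, `parabolic_point`, and the gravity constant are illustrative assumptions, not taken from the application:

```python
# Sketch of the two emission track shapes: a straight track moves linearly
# along the launch direction; a curved track additionally drops under
# gravity, tracing a parabola. The y axis is taken as "up".

def straight_point(origin, direction, speed, t):
    """Position at time t along a straight emission track."""
    return tuple(o + d * speed * t for o, d in zip(origin, direction))

def parabolic_point(origin, direction, speed, t, g=9.8):
    """Position at time t along a parabolic emission track."""
    x, y, z = straight_point(origin, direction, speed, t)
    return (x, y - 0.5 * g * t * t, z)
```

Sampling either function over increasing `t` until the point reaches an obstacle or the ground yields the flight trajectory from emission to drop point.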
Illustratively, prop launching animation refers to animation of a filling prop from being launched by a virtual launching prop to reaching a landing point after flying along a prop launching trajectory.
The drop point includes at least one of drop point types such as: the destination reached by the filling prop that the user, controlling the master virtual object, emits with the virtual emission prop; and an obstacle that the filling prop collides with during its flight.
Step 330, determining a candidate emission track corresponding to the (n+1) th filling prop based on prop continuous emission operation.
In some embodiments, the (n+1) th filling prop is a filling prop that fits in the virtual launch prop.
Illustratively, candidate emission trajectories are randomly generated according to a scattering law prior to the emission of the n+1th filling prop.
The scattering law is also called ballistic deviation or crosshair spread. When the virtual emission prop fires, its crosshair spreads outward; when firing stops, the crosshair contracts back to its initial position, and the scattering range is randomly distributed within the hollow circle corresponding to the crosshair. The larger the crosshair spread, the larger the random scattering range and the less controllable the drop point of the filling prop; when the crosshair spread contracts to its limit, the crosshair is a single intersection point and there is no scattering.
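The spread-and-contract behaviour above can be sketched as a per-tick update. The function name and the growth/shrink rates are illustrative assumptions:

```python
# Sketch of crosshair spread under the scattering law: the spread value
# grows while the prop is firing and contracts back toward zero (the
# no-scattering limit) once firing stops. Rates are illustrative.

def update_spread(spread, firing, grow=0.08, shrink=0.15, max_spread=1.0):
    """One tick of crosshair spread: expand while firing, contract otherwise."""
    spread += grow if firing else -shrink
    return max(0.0, min(max_spread, spread))
```

A larger `spread` value would then correspond to a larger random scattering radius for the next filling prop.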
Optionally, after receiving the prop continuous launching operation, randomly generating prop launching tracks corresponding to the plurality of filling props respectively, and after the nth filling prop is launched, taking the prop launching track randomly generated by the (n+1) th filling prop as a candidate launching track corresponding to the (n+1) th filling prop.
Optionally, after receiving the prop continuous transmitting operation, the prop transmitting track corresponding to each filling prop is randomly generated before each filling prop is transmitted.
It should be noted that when a single continuous emission of the virtual emission prop requires a specified number of filling props, the n+1th filling prop should belong to the filling props within that specified number.
Step 340, determining the candidate emission track as the target emission track corresponding to the n+1th filling prop under the condition that the degrees of the included angles between the emission angles corresponding to the n filling prop emission tracks and the emission angles corresponding to the candidate emission tracks meet the emission angle condition.
In some embodiments, the included angle degrees refer to the included angle between the emission angle corresponding to a prop emission track and the emission angle corresponding to the candidate emission track, in the case that the two tracks are emitted from the same emission point. In one example, when filling props are continuously emitted from the first-person perspective, the different filling props are all emitted from the fixed position (i.e., the emission port) of the virtual emission prop; that is, the emission points of the different filling props are the same, but because the emission directions differ, the emission angles corresponding to their emission tracks differ, so the angle between emission directions sharing the same emission point is taken as the included angle degrees.
Schematically, when the included angle degree is larger, the candidate emission track corresponding to the n+1th filling prop deviates from the prop emission track corresponding to the n filling props more.
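For two tracks sharing one emission point, the included angle degrees can be computed from their direction vectors; a minimal sketch with an assumed function name:

```python
import math

def included_angle_deg(d1, d2):
    """Included angle (degrees) between two emission directions that share
    the same emission point, via the dot-product formula."""
    dot = sum(a * b for a, b in zip(d1, d2))
    n1 = math.hypot(*d1)
    n2 = math.hypot(*d2)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
```

The larger this value, the more the candidate emission track deviates from the given prop emission track, matching the observation above.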
Illustratively, the target launch trajectory refers to the trajectory of the final launch of the n+1th filling prop. The target emission track comprises at least one of parameters such as an emission direction, an emission angle, an emission initial speed, a flight track and the like.
Optionally, the method for determining the target emission trajectory includes at least one of the following:
1. Generating a preset track range according to the emission angle corresponding to the candidate emission track of the n+1th filling prop and the designated track position, and taking the candidate emission track of the n+1th filling prop as a target emission track when the number of prop emission tracks existing in the preset track range reaches a quantity threshold;
2. Calculating the included angle degrees between the emission angles corresponding to the prop emission tracks of the n filling props and the emission angle corresponding to the candidate emission track of the n+1th filling prop, and taking the candidate emission track as the target emission track when the included angle degrees between the prop emission tracks and the emission angle of the candidate emission track do not reach a preset angle threshold;
3. Obtaining track overlap results between the prop emission tracks of the n filling props and the candidate emission track of the n+1th filling prop, and taking the candidate emission track as the target emission track when a specified number of prop emission tracks overlap with the candidate emission track, where a track overlap result refers to the existence of an intersection point between a prop emission track and the candidate emission track.
It should be noted that the above methods for determining the target emission track are merely illustrative examples, which are not limited in the embodiments of the present application.
Step 350, displaying the prop launching animation of the n+1st filling prop launching along the target launching track.
Schematically, after the target emission track corresponding to the n+1th filling prop is determined, displaying the animation reaching the landing point after the n+1th filling prop is ejected from the virtual emission prop and flies along the target emission track, and taking the animation as prop emission animation.
In summary, in the virtual prop display method provided by the embodiments of the present application, in the process of displaying a master virtual object holding a virtual emission prop in a virtual scene, after a prop continuous emission operation is received, the prop emission animations corresponding to n filling props in the virtual emission prop are displayed first, and a candidate emission track corresponding to the n+1th filling prop is determined according to the prop continuous emission operation. When the included angle degrees between the emission angles of the prop emission tracks corresponding to the n filling props and the emission angle of the candidate emission track satisfy the emission angle condition, the candidate emission track is determined as the target emission track corresponding to the n+1th filling prop, and the prop emission animation of the n+1th filling prop emitted along the target emission track is displayed. That is, the target emission track of the n+1th filling prop is determined according to the emission angles of the prop emission tracks of the first n filling props, so that, starting from the n+1th filling prop, the target emission track of each filling prop never deviates far from the prop emission tracks of the preceding n filling props, which improves both prop emission accuracy and prop emission realism.
In some embodiments, two ways of determining the target emission track are included. Referring to fig. 4, which schematically shows a virtual prop display method provided by an exemplary embodiment of the present application: step 340 further includes step 341 and step 342, or step 340 includes steps 340a to 340c, and step 350 is preceded by step 350a, as shown in fig. 4. The method includes the following steps.
In this embodiment, the determination method of the target emission track includes the following two methods.
First, a target emission track is determined according to a preset track range.
Step 341, determining a preset track range based on the emission angle corresponding to the candidate emission track.
In this embodiment, an emission track (including a candidate emission track and a prop emission track) is taken as an example and described as a straight line track.
Illustratively, the preset trajectory range is a specified range framed based on the emission angle corresponding to the candidate emission trajectory.
In some embodiments, a preset angular range is obtained; taking a designated track position in the candidate emission track as a starting position, and determining a target radius based on a preset angle range; a preset track range is determined based on the specified track position and the target radius.
In the present embodiment, since the candidate emission track is a straight-line track, the middle position of the straight-line track is regarded as a track point and taken as the designated track position of the candidate emission track.
In another possible case, when the candidate transmission trajectory is a parabolic trajectory, the specified trajectory position may be implemented as a middle position of the parabolic trajectory.
In this embodiment, an angle range (for example, 30 degrees) is preset, the designated track position (the middle position of the track) corresponding to the candidate emission track is taken as the center of the range, and a target radius is generated according to the angle range, so that the circular area generated with the designated track position as the center is taken as the preset track range (an area of another shape, such as a square area, may also be implemented as the preset track range).
Referring to fig. 5, a schematic diagram of determining a target emission track provided by an exemplary embodiment of the present application is shown. As shown in fig. 5, the currently displayed emission track schematic diagram 500 first includes four filling props, which respectively correspond to prop emission tracks (reference numerals 1, 2, 3 and 4). The schematic diagram 500 also includes a candidate emission track 510 corresponding to the fifth filling prop, where the candidate emission track 510 corresponds to the preset track range 511.
Step 342, in response to the prop emission tracks corresponding to at least k of the n filling props falling within the preset track range, determining the candidate emission track as the target emission track corresponding to the n+1th filling prop.
Wherein k is more than 0 and less than or equal to n, and k is an integer.
In this embodiment, since each emission track is represented by the track point corresponding to its midpoint, when the track points corresponding to at least k of the n filling props fall within the preset track range, the candidate emission track is determined as the target emission track corresponding to the n+1th filling prop.
Schematically, as shown in fig. 5, the currently displayed emission track schematic diagram 500 includes a candidate emission track 520 (reference numeral 6) corresponding to the sixth filling prop, where the candidate emission track 520 corresponds to a preset track range 521 and k=2. As can be seen from fig. 5, the prop emission tracks corresponding to two filling props (reference numeral 3 and reference numeral 4) lie within the preset track range 521, so the candidate emission track 520 corresponding to the sixth filling prop is taken as the target emission track.
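The first determination method (steps 341 and 342) can be sketched as a midpoint count inside the circular preset track range. `accept_by_range` and the 2D test points are illustrative assumptions:

```python
import math

def accept_by_range(prev_midpoints, cand_midpoint, radius, k):
    """First method: accept the candidate emission track when at least k of
    the previous tracks' midpoints (their designated track positions) fall
    inside the circular preset track range centred on the candidate track's
    midpoint with the given target radius."""
    inside = sum(
        1 for p in prev_midpoints
        if math.dist(p, cand_midpoint) <= radius
    )
    return inside >= k
```

With the fig. 5 example in mind: if two of the previous midpoints land inside the range and k=2, the candidate track is accepted as the target emission track.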
Second, the target emission track is determined according to the degrees of the included angles.
Step 340a, obtaining the included angle degrees between the emission angles corresponding to the prop emission tracks of the n filling props and the emission angle corresponding to the candidate emission track.
In this embodiment, an emission track (including a candidate emission track and a prop emission track) is taken as an example and described as a straight line track.
Illustratively, since the candidate emission track is a straight-line track, the middle position of the track is regarded as a track point and taken as the designated track position of the candidate emission track; that is, in this embodiment, the designated track position of the candidate emission track is a track point.
Illustratively, since each prop emission track is a straight-line track, its middle position is likewise regarded as a track point and taken as the designated track position of the prop emission track; that is, in this embodiment, the designated track position of each prop emission track is a track point.
In this embodiment, the included angles between the emission angles corresponding to the emission tracks of the n filling props and the emission angle corresponding to the candidate emission track are obtained.
Step 340b, obtain the emission angle threshold.
In this embodiment, an angle threshold (e.g., 40 degrees) is obtained in advance according to the candidate transmission trajectory.
Step 340c, determining the candidate emission track as the target emission track corresponding to the n+1th filling prop in response to the included angle degrees between the emission angles of the emission tracks of at least j of the n filling props and the emission angle corresponding to the candidate emission track not reaching the preset angle threshold.
Wherein j is more than 0 and less than or equal to n, and j is an integer.
In this embodiment, j=2 is taken as an example for explanation, and when the included angle between the emission angle of the prop emission track and the emission angle of the candidate emission track corresponding to at least two filling props in the n filling props respectively does not reach the preset angle threshold, the n+1th candidate emission track is determined as the target emission track.
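The second determination method (steps 340a to 340c) reduces to counting how many included angles stay below the threshold. `accept_by_angle` is an assumed name for this sketch:

```python
def accept_by_angle(angles_to_candidate, angle_threshold, j):
    """Second method: accept the candidate emission track when the included
    angles between at least j previous prop emission tracks and the
    candidate track do not reach the preset angle threshold."""
    close = sum(1 for a in angles_to_candidate if a < angle_threshold)
    return close >= j
```

For example, with a 40-degree threshold and j=2, a candidate whose included angles to the previous tracks are 10, 50 and 20 degrees is accepted, since two of the three angles stay below the threshold.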
Step 350a, eliminating the candidate emission track corresponding to the n+1th filling prop when the included angle degrees between the emission angles corresponding to the prop emission tracks of the n filling props and the emission angle corresponding to the candidate emission track do not satisfy the emission angle condition.
Illustratively, the emission angle condition not being satisfied means that the number of prop emission tracks whose included angle degrees with the emission angle of the candidate emission track do not reach the preset angle threshold fails to reach the preset number threshold.
In this embodiment, when the number of prop emission tracks whose included angle degrees do not reach the preset angle threshold fails to reach the preset number threshold, the candidate emission track corresponding to the n+1th filling prop is removed, a new candidate emission track is randomly generated, and the determination is performed again according to the above steps.
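The eliminate-and-regenerate loop of step 350a can be sketched as a bounded rejection-sampling loop. The function name, the spread range, and the use of scalar angles are illustrative assumptions:

```python
import random

def pick_target_angle(prev_angles, threshold, j, max_tries=100, rng=random):
    """Regenerate candidate emission angles until one satisfies the emission
    angle condition (at least j previous tracks within `threshold` degrees),
    discarding candidates that fail, as in step 350a. Purely illustrative:
    angles are scalars and the +/-30 degree spread range is assumed."""
    for _ in range(max_tries):
        cand = rng.uniform(-30.0, 30.0)
        close = sum(1 for a in prev_angles if abs(a - cand) < threshold)
        if close >= j:
            return cand          # candidate accepted as target emission angle
    return None                  # give up after max_tries rejections
```

Bounding the number of retries keeps the regeneration loop from stalling a frame when the previous tracks are unusually dispersed.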
In summary, in the virtual prop display method provided by the embodiments of the present application, in the process of displaying a master virtual object holding a virtual emission prop in a virtual scene, after a prop continuous emission operation is received, the prop emission animations corresponding to n filling props in the virtual emission prop are displayed first, and a candidate emission track corresponding to the n+1th filling prop is determined according to the prop continuous emission operation. When the included angle degrees between the emission angles of the prop emission tracks corresponding to the n filling props and the emission angle of the candidate emission track satisfy the emission angle condition, the candidate emission track is determined as the target emission track corresponding to the n+1th filling prop, and the prop emission animation of the n+1th filling prop emitted along the target emission track is displayed. That is, the target emission track of the n+1th filling prop is determined according to the emission angles of the prop emission tracks of the first n filling props, so that, starting from the n+1th filling prop, the target emission track of each filling prop never deviates far from the prop emission tracks of the preceding n filling props, which improves both prop emission accuracy and prop emission realism.
In this embodiment, the preset track range is obtained according to the candidate emission track, so that it is determined that the number of prop emission tracks in the preset track range reaches the specified number, and the candidate emission track is judged to be the target emission track, so that the situation that the prop emission track filled with props deviates from other prop emission tracks from the n+1th emission track can be avoided, and prop emission accuracy is improved.
In this embodiment, by calculating the included angle between the prop emission track of the first n filled props and the candidate emission track of the n+1th filled props, when the degree of the included angle between the prop emission track with the specified number and the candidate emission track is within the preset angle threshold, the candidate emission track is used as the target emission track, so that the situation that the prop emission track with the filled props deviates from the other prop emission tracks from the n+1th can be avoided, and the prop emission accuracy is improved.
In this embodiment, the accuracy of track determination can be improved by a preset track range generated with the specified track position in the candidate transmission track as the center of the range.
In this embodiment, the candidate emission tracks which do not meet the emission angle condition are removed, so that the candidate emission tracks are regenerated, and the accuracy of the prop emission tracks can be improved.
In an alternative embodiment, the plurality of filling props emitted by the virtual emission prop all eventually fall within a target landing area. Schematically, referring to fig. 6, which shows a flowchart of a virtual prop display method provided by an exemplary embodiment of the present application, as shown in fig. 6, the method further includes the following steps before step 320.
Before explaining this embodiment, two prop parameters related to the present application will be explained.
Recoil parameter: when the virtual emission prop continuously emits a plurality of filling props, the emission force produces a reaction force on the virtual emission prop, so that the virtual emission prop cannot fire steadily from one position, causing the emission port to rise (or move downward).
Scattering angle parameter: when the virtual emission prop continuously emits a plurality of filling props, accuracy decreases, so an angular offset occurs between the filling props and the crosshair of the virtual emission prop; this offset process may also be called scattering.
Step 610, obtain a scattering angle parameter corresponding to the virtual emission prop.
In some embodiments, the scattering angle parameter refers to an angle parameter used to control the orientation of the virtual camera corresponding to the virtual emission prop when a filling prop is emitted. For example, taking a virtual firearm: if the reference direction of the muzzle before firing is 0 degrees, the virtual firearm has no scattering angle at that moment; when the virtual firearm has a scattering angle of 30 degrees, the virtual camera corresponding to the virtual firearm is raised by 30 degrees, thereby realizing the scattering law in the subsequent firing process, that is, the actual emission position of the filling prop is offset from the crosshair of the virtual firearm.
In some embodiments, a movement state of a master virtual object is acquired based on prop transmitting operation, and a transmitting state corresponding to a virtual transmitting prop is acquired; determining a scattering parameter based on the movement state and the emission state; establishing a target coordinate system based on the scattering parameters; randomly generating a target angle and a target radius in a target coordinate system; the scattering angle parameter is determined based on the region constituted by the target angle and the target radius in the target coordinate system.
In this embodiment, taking a virtual firearm as an example, after a prop continuous emission operation on the virtual firearm is received, the moving state (such as running, standing, lying prone, squatting, etc.) of the master virtual object in the virtual scene at the current moment is determined, as well as the firing state of the virtual firearm (such as the muzzle orientation, the muzzle's angle relative to the horizontal plane, the crosshair position of the virtual firearm, etc.). The scattering parameter is determined based on the movement state and the emission state, where the scattering parameter may be implemented as a fixed value (e.g., 0.3) or as a parameter range (e.g., 0.3 to 0.5).
In this embodiment, after the scattering value is calculated, a circular coordinate system is established as the target coordinate system according to the scattering value. The origin of the circular coordinate system is the crosshair of the virtual emission prop and also represents the minimum scattering angle; the distance from a point on the circle to the origin is called the reference radius, whose value is determined by the scattering parameter.
In this embodiment, the target angle is randomly generated in the circular coordinate system (an angle is randomly generated between-180 degrees and 180 degrees), and the target radius is randomly generated in the circular coordinate system (a radius is randomly generated between 0 and a reference radius value), so that the scattering angle parameter is determined according to the target angle and the region in which the target radius is constructed in the circular coordinate system.
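The random sampling step above can be sketched directly; `sample_scatter` is an assumed name and the caller supplies the reference radius derived from the scattering parameter:

```python
import random

def sample_scatter(reference_radius, rng=random):
    """Randomly sample the target angle in [-180, 180] degrees and the
    target radius in [0, reference_radius] inside the circular target
    coordinate system, as described above."""
    angle = rng.uniform(-180.0, 180.0)
    radius = rng.uniform(0.0, reference_radius)
    return angle, radius
```

The pair `(angle, radius)` identifies one point in the scattering circle, from which the scattering angle parameter is then derived.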
In this embodiment, the scattering angle parameter includes a scattering direction parameter and an emission direction parameter of the virtual emission prop. Taking a virtual firearm as an example, the scattering direction parameter refers to the firing direction of the muzzle, and the emission direction parameter refers to rotating the bone model of the hand when the master virtual object holds the virtual firearm to fire, thereby rotating the overall firing direction of the virtual firearm. That is, in the coordinate system corresponding to the master virtual object's hand, with a Roll axis (roll angle, for view rotation), a Pitch axis (pitch angle, for up-and-down rotation) and a Yaw axis (yaw angle, for left-and-right rotation), the firing direction is rotated by a specified angle (Yaw angle) about the Yaw axis, and the firing scatter is realized by a specified angle (Pitch angle) about the Pitch axis.
The scattering angle parameter may refer to the following formula one.
Equation one:

Scattering Angle = (Pitch, Yaw, Roll), where Pitch = Radius · sin(Angle), Yaw = Radius · cos(Angle), and Roll = 0.

The scattering angle parameter is a class encapsulating the three Euler angles Pitch, Yaw and Roll, where Radius represents the target radius and Angle represents the target angle, and the Roll angle is 0.
Thus, as can be seen from equation one, rotating by the Pitch angle implements the scattering process, and rotating by the Yaw angle implements the firing direction change.
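One plausible reading of equation one maps the sampled polar point to Euler offsets. The sin/cos assignment is an assumption for this sketch, not confirmed by the application text:

```python
import math

def scatter_to_euler(radius, angle_deg):
    """Map the sampled (radius, angle) pair to (pitch, yaw, roll) Euler
    offsets per equation one. The sin/cos assignment is one plausible
    polar-to-Euler mapping and is an assumption of this sketch."""
    a = math.radians(angle_deg)
    pitch = radius * math.sin(a)   # scattering (up/down rotation)
    yaw = radius * math.cos(a)     # firing-direction change (left/right)
    return pitch, yaw, 0.0         # roll is always 0 per equation one
```

Chaining this with the sampling step yields the full scattering angle parameter for one emitted filling prop.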
Referring to fig. 7, a schematic view of scattering angle parameters according to an exemplary embodiment of the present application is shown, and as shown in fig. 7, a circular coordinate system 700 is currently displayed, and when a target angle 711 and a target radius 712 are obtained, a scattering angle parameter 713 is calculated.
Step 620, determining a corresponding reference drop point region after the filling prop is emitted based on the scattering angle parameter.
In this embodiment, according to the scattering direction parameter and the emission direction parameter in the scattering angle parameter, a scattering circle area formed by taking the crosshair before the virtual emission prop emits the first filling prop as the origin is determined as the reference drop point area.
Referring to fig. 8, a schematic diagram of a reference drop point area provided by an exemplary embodiment of the present application is shown. A reference drop point area 800 (gray area) is currently displayed, where the reference drop point area 800 is implemented as the overlapping area of two concentric circles; the origin of the two concentric circles is the crosshair of the virtual emission prop, and the radii of the two concentric circles are obtained according to the scattering percentage.
Step 630, obtaining recoil parameters corresponding to the virtual launching prop and a preset deflection angle.
Schematically, the recoil parameter corresponding to the virtual transmission prop belongs to the attribute parameter of the virtual transmission prop, and can be obtained from the attribute file of the virtual transmission prop in the server.
Illustratively, a deflection angle is preset as a preset deflection angle by taking the origin of the concentric circles as an angle starting point.
In some embodiments, obtaining reference camera parameters corresponding to a virtual camera of a virtual launch prop; determining a camera offset parameter corresponding to the virtual camera based on the recoil parameter; and acquiring a target emission parameter corresponding to the virtual camera based on the reference camera parameter and the camera offset parameter, wherein the target emission parameter is used for indicating a lens offset distance and a lens rotation angle corresponding to the virtual camera when the virtual emission prop emits the filling prop.
In this embodiment, after the recoil parameter corresponding to the virtual launching prop is obtained, the process by which the prop launching animation corresponding to the virtual launching prop changes with the recoil parameter is determined.
In this embodiment, the prop launching animation is generally obtained by shooting a prop launching process of a virtual launching prop by a virtual camera installed on a hand of a master virtual object, so that a camera parameter corresponding to the virtual launching prop at the moment is obtained as a reference camera parameter before the virtual launching prop launches a filling prop. The camera parameters include the lens position of the virtual camera in the virtual scene, and the virtual lens rotation (the left and right deflection of the lens before the prop is launched is displayed in the picture).
In this embodiment, the reference camera parameters corresponding to the virtual camera are superimposed with the rotation generated by the recoil parameter, so as to achieve the muzzle-lift effect, where the lift angle of the muzzle equals the upward offset angle of the virtual camera lens, that is, the camera offset parameter.
In this embodiment, the target emission parameters corresponding to the current filling prop are determined according to the camera offset parameters and the reference camera parameters, where the target emission parameters include the lens offset distance (the upward lens offset, i.e., the Pitch angle) and the lens rotation angle (the Yaw angle).
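A minimal sketch of superimposing the recoil-derived camera offset onto the reference camera parameters to obtain the target emission parameters; the `CameraParams` structure and the simple additive composition are assumptions, since the patent does not specify the combination rule.

```python
from dataclasses import dataclass

@dataclass
class CameraParams:
    pitch: float  # lens offset distance (upward lift along Pitch)
    yaw: float    # lens rotation angle along Yaw

def apply_recoil_offset(ref: CameraParams, offset: CameraParams) -> CameraParams:
    """Superimpose the camera offset parameter (derived from recoil)
    onto the reference camera parameters, yielding the target emission
    parameters used when the prop is launched."""
    return CameraParams(ref.pitch + offset.pitch, ref.yaw + offset.yaw)
```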
Step 640, determining a target landing point area after the filling prop is launched based on the preset deflection angle and the reference landing point area by taking the recoil parameter as a vector direction.
In this embodiment, the direction difference between the emission vector of the previous filling prop and that of the filling prop currently to be emitted is taken as the vector direction, and the fan-shaped (sector) area formed on the reference landing point area according to the preset deflection angle is taken as the target landing point area.
Referring to fig. 8, a target landing area 810 (hatched sector) in the reference landing area 800 is determined after determining the vector direction and the preset deflection angle.
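The target landing area — the sector of the reference ring spanned by the preset deflection angle around the recoil vector direction — can be sketched as follows. The angle conventions (degrees, sector centred on the recoil direction) are assumptions.

```python
import math

def in_target_area(point, origin, r_inner, r_outer,
                   recoil_dir_deg, deflection_deg):
    """Return True if `point` lies in the target landing area: the
    sector of the reference ring centred on the recoil vector
    direction and spanning the preset deflection angle."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    d = math.hypot(dx, dy)
    if not (r_inner <= d <= r_outer):
        return False  # outside the reference ring entirely
    ang = math.degrees(math.atan2(dy, dx))
    # signed angular difference in (-180, 180]
    diff = (ang - recoil_dir_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= deflection_deg / 2.0
```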
In some embodiments, an animation is displayed that reaches the target landing area after the n+1th filling prop is launched along the target launch trajectory.
In this embodiment, when the degrees of the included angles between the emission angles corresponding to the prop emission tracks of the n filling props and the emission angle of the candidate emission track corresponding to the (n+1)-th filling prop meet the emission angle condition, the candidate emission track is taken as the target emission track of the (n+1)-th filling prop, where the landing point corresponding to the target emission track is located in the target landing area. Therefore, provided no obstacle blocks the (n+1)-th filling prop in flight, an animation is displayed in which the (n+1)-th filling prop flies along the target emission track and reaches the target landing area.
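The emission angle condition described above can be sketched as counting how many of the previous emission angles fall within a preset threshold of the candidate's emission angle. The wrap-around handling and the parameter names are assumptions.

```python
def angle_condition_met(prev_angles_deg, candidate_angle_deg,
                        threshold_deg, j):
    """Accept the candidate emission track for the (n+1)-th filling
    prop when at least j of the n previous emission angles differ from
    the candidate's emission angle by less than the preset threshold."""
    within = sum(
        1 for a in prev_angles_deg
        # signed angular difference folded into (-180, 180]
        if abs((a - candidate_angle_deg + 180.0) % 360.0 - 180.0) < threshold_deg
    )
    return within >= j
```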
In summary, in the virtual prop display method provided by the embodiment of the present application, while a master virtual object holding a virtual launching prop is displayed in a virtual scene, after a prop continuous transmission operation is received, the prop launching animations corresponding to the n filling props in the virtual launching prop are displayed first, and the candidate emission track corresponding to the (n+1)-th filling prop is determined according to the prop continuous transmission operation. When the degrees of the included angles between the emission angles of the prop emission tracks corresponding to the n filling props and the emission angle of the candidate emission track meet the emission angle condition, the candidate emission track is determined as the target emission track corresponding to the (n+1)-th filling prop, and the prop launching animation of the (n+1)-th filling prop launched along the target emission track is displayed. That is, the target emission track of the (n+1)-th filling prop is determined according to the emission angles of the prop emission tracks corresponding to the first n filling props, so that, starting from the (n+1)-th filling prop, no filling prop's target emission track can lie far away from the prop emission tracks of the first n filling props, which improves both the accuracy and the realism of prop launching.
In this embodiment, after the scattering angle parameter is obtained, the target landing point areas of the final plurality of filling props are generated according to the reference landing point areas and the recoil parameter, so that the landing point areas of the filling props meet the recoil condition, and the reality of prop emission is improved.
In this embodiment, the emission parameters of the virtual camera, including the lens offset distance and the lens rotation angle, are adjusted by the recoil parameters, so that the display fidelity of the prop emission animation can be improved.
In this embodiment, a circular coordinate system is established according to the scattering parameters, and a target angle and a target radius are randomly generated therefrom to finally obtain the scattering angle parameters, so as to improve the realism of the prop emission along with scattering distribution.
As an example of applying the virtual prop display method provided by the present application to a first-person game scenario, refer to fig. 9, which shows a flowchart of a virtual prop display method provided by an exemplary embodiment of the present application; as shown in fig. 9, the method includes the following steps.
Step 901, a user triggers a firing operation.
A game scene picture under the first-person view of the first-person game is currently displayed, where the game scene picture is obtained by observing the game scene from the first-person view of the master virtual object controlled by the user.
In this embodiment, the master control virtual object holds a virtual firearm, so a first virtual camera is set on the hand of the master control virtual object to shoot a game scene, thereby displaying a picture of the master control virtual object holding the virtual firearm under a first person viewing angle.
In this embodiment, when the user triggers the firing control corresponding to the virtual firearm, this triggering is taken as the firing operation. The firing operation is implemented as a continuous firing operation, i.e., continuously firing a plurality of rounds with the virtual firearm.
Step 902, a firing operation is passed to a master virtual object.
The triggering operation on the firing control is taken as a firing request, and the server sends the corresponding firing instruction to the client.
In step 903, the virtual launch prop updates its state.
The client and the server synchronously update the state of the virtual firearm from an un-fired state to a firing-ready state.
At step 904, the virtual launch prop enters a firing state.
The client and the server synchronously update the virtual firearm to enter a firing state, and at the moment, a second virtual camera is installed on the virtual firearm and used for shooting a virtual scene, so that a scene picture of the current virtual firearm in the firing state is displayed.
Step 905 triggers virtual launch prop firing related logic.
First, the scattering angle parameter corresponding to the virtual firearm is acquired, and a reference landing point area is generated according to the scattering angle parameter; then, with the recoil parameter as the vector direction, the target landing point area of the bullet is determined according to the preset deflection angle and the reference landing point area, and the prop firing tracks corresponding to the first n bullets are randomly generated in the target landing point area according to the scattering rule.
At step 906, the virtual launch prop counts its firings, and recoil- and scatter-related data are refreshed.
After the virtual firearm fires each round, the firing operation is counted, thereby refreshing the recoil- and scatter-related data.
The client and the server synchronously trigger the firing-related logic of the virtual firearm, including the logic of adjusting firearm firing with the recoil parameter and the logic of adjusting firearm firing with the firearm scattering parameter.
Next, the logic of adjusting firearm firing with the recoil parameter is described first.
This logic has two phases: a recoil accumulation phase and a recoil recovery phase.
The recoil accumulation phase updates the current recoil parameter according to the time between two shots and the lens offset parameter, thereby adjusting the camera orientation of the first virtual camera.
The recoil recovery stage refers to a process of recovering the camera affected by the recoil parameter after stopping firing.
Referring to fig. 10 schematically, a flowchart of the recoil accumulation phase provided by an exemplary embodiment of the application is shown; as shown in fig. 10, the phase includes the following steps.
In step 1010, the time t between the previous firing to the current time is calculated.
And acquiring the time between the current moment and the last firing operation.
Step 1020, calculating a lift rotation angle.
A pre-configured lift integral curve is read according to the number of rounds the virtual firearm has fired at the current moment and the time t, and the lift rotation angle corresponding to the current shot is calculated.
The lifting rotation angle refers to an angle that a muzzle of the virtual firearm rotates upwards along the Pitch axis.
In this embodiment, an integral curve is read because per-frame jitter during the firing process of the virtual firearm would produce different errors at different frame rates; the integral curve guarantees that the rotation angle calculated for the same elapsed time is identical regardless of frame rate.
Step 1030, calculate the left-right offset rotation angle.
A pre-configured left-right offset curve is read according to the number of rounds fired at the current moment and the time, and the rotation angle along the Yaw axis is calculated as the left-right offset rotation angle.
Step 1040, calculate the regression amplitude, direction, orientation and starting point after firing.
The amplitude, direction, orientation and starting point of the muzzle's regression after each shot are calculated according to the number of rounds fired at the current moment and the lift rotation angle accumulated in the above steps.
Step 1050, the accumulated recoil value at the current time is updated.
The accumulated recoil value at the current moment is updated according to the calculated regression amplitude, direction, orientation and starting point, and the orientation and position of the first virtual camera are modified.
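Steps 1010–1050 can be sketched as follows, assuming the pre-configured lift and left-right offset curves are given as sorted (time, value) pairs evaluated by linear interpolation, and that the fired-round count scales the curve value. Both the curve format and the scaling rule are assumptions; the patent only states that curves are read by shot count and elapsed time.

```python
import bisect

def eval_curve(curve, t):
    """Piecewise-linear evaluation of a pre-configured curve given as
    sorted (time, value) pairs; values are clamped outside the range.
    Reading a curve by elapsed time keeps results frame-rate independent."""
    times = [p[0] for p in curve]
    if t <= times[0]:
        return curve[0][1]
    if t >= times[-1]:
        return curve[-1][1]
    i = bisect.bisect_right(times, t)
    (t0, v0), (t1, v1) = curve[i - 1], curve[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def accumulate_recoil(accum, shots_fired, t_since_last,
                      lift_curve, sway_curve):
    """One recoil-accumulation step (fig. 10): read the lift integral
    curve (Pitch) and the left-right offset curve (Yaw), then update
    the accumulated recoil value that drives the first virtual
    camera's orientation. `accum` is a (pitch, yaw) pair."""
    pitch_up = eval_curve(lift_curve, t_since_last) * shots_fired
    yaw_sway = eval_curve(sway_curve, t_since_last) * shots_fired
    return accum[0] + pitch_up, accum[1] + yaw_sway
```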
Referring to fig. 11 schematically, a flowchart of a recoil recovery phase provided by an exemplary embodiment of the present application is shown, as shown in fig. 11, and includes the following steps.
Step 1110, a recoil recovery phase is entered.
After it is determined that the virtual firearm has paused firing, the recoil recovery phase is entered.
Step 1120, calculating the time t elapsed from the beginning of the recoil recovery phase to the current time.
At step 1130, the profile configuration is read.
And reading the lift-up integral curve configuration according to the time t, and superposing a preset time scaling coefficient.
In step 1140, target skills are superimposed.
After the lift integral curve configuration is read, the influence on recoil regression of the skill attribute corresponding to the target skill released by the master virtual object is superimposed.
In step 1150, the virtual camera orientation is modified.
The accumulated recoil value at the current time is decreased, and the orientation of the first virtual camera is modified.
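The recoil recovery phase can be sketched as a time-based decay of the accumulated recoil value toward zero. The linear decay and the way the preset time scaling coefficient and the skill attribute enter the formula are assumptions for illustration.

```python
def recover_recoil(accum_pitch, accum_yaw, t,
                   time_scale=1.0, skill_factor=1.0):
    """Recoil-recovery step (fig. 11): decay the accumulated recoil
    value over the time t elapsed since recovery began. The preset
    time scaling coefficient and the released skill's attribute both
    modulate how fast the camera orientation regresses."""
    decay = max(0.0, 1.0 - t * time_scale * skill_factor)
    return accum_pitch * decay, accum_yaw * decay
```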
In step 907, the location information of the master virtual object is refreshed.
In the process of changing the direction of the first virtual camera according to the recoil accumulation stage and the recoil recovery stage, the human body posture corresponding to the main control virtual object is updated. Referring to fig. 12, a flowchart of a human body posture update according to an exemplary embodiment of the present application is shown, and the method includes the following steps, as shown in fig. 12.
In step 1210, an initial position and an initial rotation angle of the first virtual camera are obtained.
And acquiring the initial position and the initial rotation angle of the first virtual camera of the main control virtual object under the world coordinates.
Step 1220, the rotation angle generated by the recoil is superimposed, so that the gun-lift effect is handled separately from the ballistic trajectory.
The recoil gun-lift rotational offset (left-right and up-down) is then added directly to the rotation angle of the first virtual camera.
In step 1230, the camera orientation is obtained.
The initial orientation of the first virtual camera is recorded and used directly as the camera orientation.
In step 1240, the camera position is acquired.
The position of the first-person model corresponding to the master virtual object is calculated from the rotation of the first virtual camera: the default rotation angle of the first-person model is taken as the initial value, and the model is then rotated around the position of the first virtual camera to obtain its new rotation angle.
In step 1250, lens shift parameters are obtained.
And adjusting the height difference between the first person model and the first virtual camera as a lens offset parameter according to the relative position between the main control virtual object and the third person model.
Step 1260, updating the human body pose.
The camera position and lens shift parameters described above are used to update the position and rotation angle of the first person model.
Step 1270, the position offset parameter generated by the recoil is superimposed.
The position shift parameters of the recoil of the virtual firearm are superimposed on the basis of the updated position and rotation angle of the first-person model.
Next, the logic of firearm firing following adjustment of the scattering angle parameters will be described.
Referring to fig. 13, a flowchart of adjustment of scattering angle parameters according to an exemplary embodiment of the present application is shown, and the method includes the following steps as shown in fig. 13.
At step 1310, a scattering parameter is calculated.
And calculating according to the moving state of the main control virtual object and the firing state of the virtual firearm to obtain the scattering parameter. Wherein the scattering parameter is used to determine the size of the scattering range of the bullet.
In step 1320, a circular coordinate system is established.
And establishing a round coordinate system with a muzzle as an origin according to the scattering parameters.
Step 1330, randomly generating a target angle.
The target angle is randomly generated in a circular coordinate system.
Step 1340, randomly generating a target radius.
The target radius is randomly generated in a circular coordinate system.
In step 1350, the scatter angle parameter is calculated based on the radius and the angle.
And according to the target radius and the target angle, calculating to obtain a scattering angle parameter through a formula I.
Step 1360, rotating the firing direction according to the scattering angle parameter.
And the firing direction of the muzzle is rotated according to the scattering angle parameters, so that the effect of firing scattering is achieved.
Step 1370, rotating the bone according to the scattering angle parameter.
And rotating a skeleton model corresponding to the hand of the main control virtual object according to the scattering angle parameters, so as to achieve the effect of adjusting the firing direction.
In this embodiment, the flow of the prop firing animation following the bullet scattering portion is as follows.
(1) A matrix for rotation about an arbitrary point is calculated, assuming a rotation by angle alpha about the point P(x1, y1).
(2) The translation matrix T1 that moves point P to the origin is calculated, where T1 corresponds to the initial coordinate position of the bone model on the hand of the master virtual object.
(3) The rotation matrix R1 for the rotation by angle alpha is calculated according to the scattering angle parameter.
(4) The translation matrix T2 that moves from the origin back to point P is calculated, where T2 corresponds to the coordinate position of the bone model on the virtual firearm.
(5) The position information of the virtual firearm is rotated according to the scattering orientation.
(6) First, its position information is acquired.
(7) It is translated to the rotation point Tm1 based on the position information, where Tm1 represents the rotation point corresponding to the bone model on the hand of the master virtual object.
(8) The rotation point Tm2 is obtained according to the scattering direction, where Tm2 represents the rotation point corresponding to the bone model on the virtual firearm.
(9) It is then translated back to the original position Tm3, where Tm3 is the origin corresponding to the bone model on the hand of the master virtual object.
(10) Weapon position information after rotation = Tm1 · Tm2 · Tm3 · position information.
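The translate–rotate–translate procedure above can be sketched in 2-D as follows; the composition T2 · R1 · T1 applied to a position is implemented directly on coordinates rather than as explicit matrices, which is mathematically equivalent for a single point.

```python
import math

def rotate_about_point(pos, pivot, alpha_deg):
    """Rotate position `pos` by angle alpha about pivot P(x1, y1):
    translate P to the origin (T1), rotate by alpha (R1), then
    translate back (T2), i.e. new_pos = T2 * R1 * T1 * pos."""
    a = math.radians(alpha_deg)
    c, s = math.cos(a), math.sin(a)
    x, y = pos[0] - pivot[0], pos[1] - pivot[1]  # T1: pivot -> origin
    xr, yr = x * c - y * s, x * s + y * c        # R1: rotate by alpha
    return xr + pivot[0], yr + pivot[1]          # T2: origin -> pivot
```

For example, rotating the point (2, 1) by 90° about (1, 1) lands it at (1, 2).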
It is noted that the above-described processes of adjusting the prop launching animation with the recoil parameter and with the scattering angle parameter, respectively, are performed simultaneously.
Step 908, firing direction is calculated.
The client calculates the firing direction of the next shot according to the firing state of the virtual firearm at the current moment, thereby generating the candidate firing track corresponding to the bullet of the next shot (the (n+1)-th shot).
Step 909, fire judgment.
If n bullets have currently been fired, and the number of those n bullets whose prop firing tracks lie within the neighborhood formed with the candidate firing track as the center and a preset radius meets a preset number threshold, the candidate firing track is taken as the target firing track corresponding to the (n+1)-th bullet.
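The neighborhood check of step 909 can be sketched as counting previous landing points inside a circle of the preset radius around the candidate's landing point; representing tracks by their landing points and the parameter names are assumptions.

```python
import math

def accept_candidate(candidate, prev_landings, radius, k):
    """Firing judgment (step 909): accept the candidate firing track
    for the (n+1)-th bullet if at least k of the previous n landing
    points fall inside the neighborhood of preset `radius` around the
    candidate's landing point."""
    near = sum(
        1 for p in prev_landings
        if math.hypot(p[0] - candidate[0], p[1] - candidate[1]) <= radius
    )
    return near >= k
```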
The server performs firing verification on the target firing track (to determine the target firing track corresponding to the (n+1)-th bullet and fire the (n+1)-th bullet), performs damage processing (calculating the damage caused when the (n+1)-th bullet reaches the landing position), and sends the hit information to the client.
The client displays the hit animation according to the hit information.
Step 910, filling in prop deduction.
After the client determines the target firing track of the (n+1)-th shot, one round is deducted from the ammunition loaded in the virtual firearm.
Step 911 simulates the firing performance.
The client displays the animation of firing the n+1th bullet of the virtual firearm, wherein the n+1th bullet flies along the target firing track to reach the drop point after being fired.
Step 912, calculate the next firing time.
After the virtual firearm fires the n+1th round, the firing time at the next moment is calculated.
Step 913, stopping firing.
Referring to fig. 14, a schematic diagram of firing results provided by an exemplary embodiment of the present application is shown. As shown in fig. 14, a firing result 1400 is currently displayed, from which it can be seen that no fired bullet lands far away from the other bullets, which accords with the scattering rule.
And stopping firing after the specified number of bullets corresponding to the single firing operation of the virtual firearm are fired.
Referring to fig. 15, a schematic diagram of the consistency between the firing trajectory and the prop firing animation provided by an exemplary embodiment of the present application is shown. As shown in fig. 15, a first firing schematic 1510 and a second firing schematic 1520 are currently displayed. During the first firing, because the body 1501 of the virtual firearm is biased to the left in the prop firing animation, the landing point 1502 of the bullet finally fired is located in the left area; during the second firing, because the body 1501 of the virtual firearm is biased to the right, the landing point 1503 of the bullet finally fired is located in the right area.
In summary, in the virtual prop display method provided by the embodiment of the present application, while a master virtual object holding a virtual launching prop is displayed in a virtual scene, after a prop continuous transmission operation is received, the prop launching animations corresponding to the n filling props in the virtual launching prop are displayed first, and the candidate emission track corresponding to the (n+1)-th filling prop is determined according to the prop continuous transmission operation. When the degrees of the included angles between the emission angles of the prop emission tracks corresponding to the n filling props and the emission angle of the candidate emission track meet the emission angle condition, the candidate emission track is determined as the target emission track corresponding to the (n+1)-th filling prop, and the prop launching animation of the (n+1)-th filling prop launched along the target emission track is displayed. That is, the target emission track of the (n+1)-th filling prop is determined according to the emission angles of the prop emission tracks corresponding to the first n filling props, so that, starting from the (n+1)-th filling prop, no filling prop's target emission track can lie far away from the prop emission tracks of the first n filling props, which improves both the accuracy and the realism of prop launching.
When a player fires the firearm continuously, the firing animation of the firearm and the bullet trajectory are guaranteed to follow the same motion trajectory, avoiding any included angle between them, which greatly improves the perceived hit feedback and gun-control handling and thereby guarantees the firearm hit rate.
Moreover, because the scattering parameters of the firearm are distributed regularly, the scattering distribution of the bullets is guaranteed to be consistent with the recoil direction, abnormal scattering data are removed, and the consistency of the firearm's firing animation with its ballistic performance, as well as hit stability, are guaranteed.
Fig. 16 is a block diagram of a display device for a virtual prop according to an exemplary embodiment of the present application; as shown in fig. 16, the device includes the following parts:
A display module 1610, configured to display a virtual scene picture, where a virtual scene corresponding to the virtual scene picture includes a master virtual object, where the master virtual object holds a virtual transmitting prop, and a plurality of filling props are assembled in the virtual transmitting prop;
the display module 1610 is further configured to display, in response to receiving a prop continuous transmission operation, prop transmission animations corresponding to n filling props after being sequentially transmitted, where the prop continuous transmission operation is used to instruct the virtual transmission props to continuously transmit the filling props, the n filling props correspond to prop transmission tracks respectively, and n is a positive integer;
A determining module 1620, configured to determine a candidate emission track corresponding to an n+1st filling prop based on the prop burst operation;
the determining module 1620 is further configured to determine, when the degrees of the included angles between the emission angles corresponding to the prop emission tracks corresponding to the n filling props and the emission angles corresponding to the candidate emission tracks meet the emission angle condition, the candidate emission track as a target emission track corresponding to the n+1th filling prop;
the display module 1610 is further configured to display a prop launching animation that is launched by the n+1st filling prop along the target launching track.
In some embodiments, as shown in fig. 17, the determining module 1620 includes:
A determining unit 1621, configured to determine a preset track range based on the emission angle corresponding to the candidate emission track;
the determining unit 1621 is further configured to determine the candidate emission track as a target emission track corresponding to the (n+1) th filling prop, where k is greater than 0 and less than or equal to n, and k is an integer, in response to at least k filling props among the n filling props being in the preset track range.
In some embodiments, the determining unit 1621 is further configured to obtain a preset angle range; taking a designated track position in the candidate emission tracks as a starting position, and determining a target radius based on the preset angle range; and determining the preset track range based on the specified track position and the target radius.
In some embodiments, the determining module 1620 is further configured to obtain degrees of included angles between the emission angles corresponding to the prop emission tracks corresponding to the n filling props and the emission angles corresponding to the candidate emission tracks; acquiring a preset angle threshold; and determining the candidate emission track as a target emission track corresponding to the (n+1) th filling prop in response to the fact that the included angle between the emission angles corresponding to the prop emission tracks corresponding to at least j filling props respectively and the emission angles corresponding to the candidate emission tracks does not reach the preset angle threshold, wherein j is more than 0 and less than or equal to n, and j is an integer.
In some embodiments, the apparatus further comprises:
And the rejection module 1630 is configured to reject the candidate emission track corresponding to the (n+1) th filling prop when the degrees of the included angles between the emission angles corresponding to the prop emission tracks corresponding to the n filling props and the emission angles corresponding to the candidate emission tracks do not meet the emission angle condition.
In some embodiments, the apparatus further comprises:
an acquisition module 1640, configured to acquire a scattering angle parameter corresponding to the virtual emission prop;
The determining module 1620 is further configured to determine a corresponding reference drop point area after the filling prop is emitted based on the scattering angle parameter;
The acquiring module 1640 is further configured to acquire a recoil parameter corresponding to the virtual launching prop, and a preset deflection angle;
the determining module 1620 is further configured to determine a target landing area after the filling prop is launched based on the preset deflection angle and the reference landing area by using the recoil parameter as a vector direction.
In some embodiments, the acquiring module 1640 is further configured to acquire reference camera parameters corresponding to a virtual camera of the virtual launch prop;
The determining module 1620 is further configured to determine a camera offset parameter corresponding to the virtual camera based on the recoil parameter;
the acquiring module 1640 is further configured to acquire a target emission parameter corresponding to the virtual camera based on the reference camera parameter and the camera offset parameter, where the target emission parameter is used to instruct a lens offset distance and a lens rotation angle corresponding to the virtual camera when the virtual emission prop emits the filling prop.
In some embodiments, the acquiring module 1640 is further configured to acquire a movement state of the master virtual object based on the prop transmitting operation, and acquire a transmission state corresponding to the virtual transmission prop; determining a scattering parameter based on the movement state and the emission state; establishing a target coordinate system based on the scattering parameters; randomly generating a target angle and a target radius in the target coordinate system; the scattering angle parameter is determined based on the region in the target coordinate system constituted by the target angle and the target radius.
In some embodiments, the display module 1610 is further configured to display an animation of the n+1st filling prop reaching the target landing area after being launched along the target launching track.
In summary, in the virtual prop display device provided by the embodiment of the present application, while a master virtual object holding a virtual launching prop is displayed in a virtual scene, after a prop continuous transmission operation is received, the prop launching animations corresponding to the n filling props in the virtual launching prop are displayed first, and the candidate emission track corresponding to the (n+1)-th filling prop is determined according to the prop continuous transmission operation. When the degrees of the included angles between the emission angles of the prop emission tracks corresponding to the n filling props and the emission angle of the candidate emission track meet the emission angle condition, the candidate emission track is determined as the target emission track corresponding to the (n+1)-th filling prop, and the prop launching animation of the (n+1)-th filling prop launched along the target emission track is displayed. That is, the target emission track of the (n+1)-th filling prop is determined according to the emission angles of the prop emission tracks corresponding to the first n filling props, so that, starting from the (n+1)-th filling prop, no filling prop's target emission track can lie far away from the prop emission tracks of the first n filling props, which improves both the accuracy and the realism of prop launching.
It should be noted that the virtual prop display device provided in the above embodiment is described only with the above division of functional modules as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the virtual prop display device provided in the above embodiment and the virtual prop display method embodiments belong to the same concept; the detailed implementation process of the device is described in the method embodiments and is not repeated here.
Fig. 18 shows a block diagram of a terminal 1800 according to an exemplary embodiment of the present application. The terminal 1800 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1800 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, or the like.
In general, the terminal 1800 includes: a processor 1801 and a memory 1802.
The processor 1801 may include one or more processing cores, for example a 4-core or an 8-core processor. The processor 1801 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1801 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 1801 may integrate a GPU (Graphics Processing Unit) for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1802 may include one or more computer-readable storage media, which may be non-transitory. The memory 1802 may also include high-speed random access memory, as well as non-volatile memory such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1802 is used to store at least one instruction, and the at least one instruction is executed by the processor 1801 to implement the virtual prop display method provided by the method embodiments of the present application.
In some embodiments, terminal 1800 may include other components, and those skilled in the art will appreciate that the structure shown in fig. 18 is not limiting of terminal 1800 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
Those of ordinary skill in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware, and the program may be stored in a computer-readable storage medium. The computer-readable storage medium may be the one included in the memory of the above embodiments, or may be a standalone computer-readable storage medium not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the virtual prop display method according to any of the foregoing embodiments.
Alternatively, the computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a solid-state drive (SSD), an optical disk, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM). The above embodiment numbers of the present application are merely for description and do not represent the advantages or disadvantages of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description covers only preferred embodiments of the present application and is not intended to limit the application; the scope of protection of the application is defined by the appended claims.
Claims (13)
1. A method for displaying a virtual prop, the method comprising:
Displaying a virtual scene picture, wherein a virtual scene corresponding to the virtual scene picture comprises a main control virtual object, the main control virtual object holds a virtual transmitting prop, and a plurality of filling props are assembled in the virtual transmitting prop;
In response to receiving a prop continuous transmitting operation, displaying prop transmitting animations respectively corresponding to n filling props after sequential transmitting, wherein the prop continuous transmitting operation is used for instructing the virtual transmitting prop to continuously transmit the filling props, the n filling props respectively correspond to prop transmitting tracks, and n is a positive integer;
determining candidate emission tracks corresponding to the (n+1) th filling prop based on the prop continuous emission operation;
Under the condition that the degrees of included angles between the emission angles corresponding to the prop emission tracks respectively corresponding to the n filling props and the emission angles corresponding to the candidate emission tracks meet the emission angle conditions, determining the candidate emission tracks as target emission tracks corresponding to the (n+1) th filling props;
and displaying prop launching animation launched by the n+1th filling prop along the target launching track.
2. The method according to claim 1, wherein determining the candidate emission track as the target emission track corresponding to the n+1th filling prop when the degrees of the included angle between the emission angles corresponding to the n filling prop emission tracks and the emission angles corresponding to the candidate emission tracks meet the emission angle condition, includes:
determining a preset track range based on the transmission angle corresponding to the candidate transmission track;
And determining the candidate emission track as a target emission track corresponding to the (n+1) th filling prop in response to the fact that at least k filling prop emission tracks respectively corresponding to the n filling props are in the preset track range, wherein k is more than 0 and less than or equal to n, and k is an integer.
3. The method of claim 2, wherein the determining the preset track range based on the emission angle corresponding to the candidate emission track comprises:
acquiring a preset angle range;
taking a designated track position in the candidate emission tracks as a starting position, and determining a target radius based on the preset angle range;
And determining the preset track range based on the specified track position and the target radius.
4. The method according to claim 1, wherein determining the candidate emission track as the target emission track corresponding to the n+1th filling prop when the degrees of the included angle between the emission angles corresponding to the n filling prop emission tracks and the emission angles corresponding to the candidate emission tracks meet the emission angle condition, includes:
Acquiring the degrees of included angles between the corresponding emission angles of the prop emission tracks and the corresponding emission angles of the candidate emission tracks, which are respectively corresponding to the n filling props;
Acquiring a preset angle threshold;
And determining the candidate emission track as a target emission track corresponding to the (n+1) th filling prop in response to the fact that the included angle between the emission angles corresponding to the prop emission tracks corresponding to at least j filling props respectively and the emission angles corresponding to the candidate emission tracks does not reach the preset angle threshold, wherein j is more than 0 and less than or equal to n, and j is an integer.
5. The method according to any one of claims 1 to 4, further comprising:
and eliminating the candidate emission tracks corresponding to the (n+1) th filling prop under the condition that the degrees of included angles between the emission angles corresponding to the prop emission tracks corresponding to the n filling props and the emission angles corresponding to the candidate emission tracks do not meet the emission angle condition.
6. The method of any one of claims 1 to 4, wherein before displaying the respective prop launch animation after sequential launch of the n filling props, further comprising:
acquiring scattering angle parameters corresponding to the virtual emission props;
determining a corresponding reference drop point area after the filling prop is transmitted based on the scattering angle parameter;
Acquiring recoil parameters corresponding to the virtual launching prop and a preset deflection angle;
And determining a target landing point area after the filling prop is launched based on the preset deflection angle and the reference landing point area by taking the recoil parameter as a vector direction.
7. The method of claim 6, wherein the method further comprises:
Obtaining reference camera parameters corresponding to a virtual camera of the virtual transmitting prop;
determining a camera offset parameter corresponding to the virtual camera based on the recoil parameter;
And acquiring a target emission parameter corresponding to the virtual camera based on the reference camera parameter and the camera offset parameter, wherein the target emission parameter is used for indicating a lens offset distance and a lens rotation angle corresponding to the virtual camera when the virtual emission prop emits the filling prop.
8. The method of claim 6, wherein the obtaining the scattering angle parameter corresponding to the virtual emission prop comprises:
Acquiring a moving state of the main control virtual object based on the prop transmitting operation, and acquiring a transmitting state corresponding to the virtual transmitting prop;
determining a scattering parameter based on the movement state and the emission state;
establishing a target coordinate system based on the scattering parameters;
randomly generating a target angle and a target radius in the target coordinate system;
the scattering angle parameter is determined based on the region in the target coordinate system constituted by the target angle and the target radius.
9. The method of claim 6, wherein said displaying a prop launch animation of the n+1th filled prop launch along the target launch trajectory comprises:
And displaying the animation of the n+1th filling prop reaching the target landing area after being launched along the target launching track.
10. A display device for a virtual prop, the device comprising:
The display module is used for displaying a virtual scene picture, wherein a virtual scene corresponding to the virtual scene picture comprises a main control virtual object, the main control virtual object holds a virtual transmitting prop, and a plurality of filling props are assembled in the virtual transmitting prop;
The display module is further used for displaying, in response to receiving a prop continuous transmitting operation, prop transmitting animations respectively corresponding to n filling props after sequential transmitting, wherein the prop continuous transmitting operation is used for instructing the virtual transmitting prop to continuously transmit the filling props, the n filling props respectively correspond to prop transmitting tracks, and n is a positive integer;
the determining module is used for determining candidate emission tracks corresponding to the (n+1) th filling prop based on the prop continuous emission operation;
the determining module is further configured to determine, when degrees of included angles between an emission angle corresponding to the prop emission tracks corresponding to the n filling props and an emission angle corresponding to the candidate emission track meet an emission angle condition, the candidate emission track as a target emission track corresponding to the n+1th filling prop;
And the display module is also used for displaying prop launching animation launched by the (n+1) th filling prop along the target launching track.
11. A computer device comprising a processor and a memory, wherein the memory has stored therein at least one program that is loaded and executed by the processor to implement a method of displaying virtual props according to any of claims 1 to 9.
12. A computer readable storage medium having stored therein at least one program loaded and executed by a processor to implement a method of displaying a virtual prop as claimed in any one of claims 1 to 9.
13. A computer program product comprising computer instructions which, when executed by a processor, implement a method of displaying virtual props according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310178330.9A CN118512766A (en) | 2023-02-20 | 2023-02-20 | Virtual prop display method, device, equipment, medium and product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118512766A true CN118512766A (en) | 2024-08-20 |
Family
ID=92274675
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||