CN117046110A - Virtual emission processing method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN117046110A (application number CN202210491849.8A)
- Authority
- CN
- China
- Prior art keywords
- virtual
- dimensional motion
- duration
- emission
- screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/573—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8076—Shooting
Abstract
The application provides a virtual emission (virtual projectile) processing method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product. The method includes: displaying a virtual scene in a human-computer interaction interface; in response to a first virtual object launching a virtual projectile in the virtual scene, determining the field of view of a second virtual object in the virtual scene; in response to the three-dimensional motion trajectory of the virtual projectile overlapping the field of view, mapping the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of the screen, and determining, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, the duration corresponding to that ratio within a perception duration, where the perception duration is the duration needed to perceive the flight direction of the virtual projectile on the screen; and controlling the virtual projectile to fly along the two-dimensional motion trajectory. The application can accurately simulate the operating characteristics of the virtual projectile while preserving the viewing experience of users who need to perceive its flight direction.
Description
Technical Field
The present application relates to the field of internet technologies, and in particular, to a method and apparatus for processing a virtual emission object, an electronic device, and a computer readable storage medium.
Background
Human-computer interaction technology for virtual scenes based on graphics processing hardware can realize diversified interactions between virtual objects controlled by users or by artificial intelligence according to actual application requirements, and has wide practical value. For example, in a virtual scene such as a game, a real combat process between virtual objects can be simulated.
Taking a game scene as an example, shooting games are competitive games deeply favored by users; they not only help users release pressure and relax, but can also improve users' reaction speed and sensitivity.
However, in the related art, because a virtual projectile (e.g., a virtual bullet or a virtual rocket) flies at high speed, a player cannot clearly see the flight direction of the virtual projectile on the screen. If the flight speed of the virtual projectile is modified at the game-logic level so that the flight direction of a virtual projectile launched by a virtual object controlled by another player becomes clear, the player can indeed perceive the flight direction, but the modified speed no longer matches the flight speed that the virtual projectile itself can achieve, and the operating characteristics of the virtual projectile can no longer be simulated realistically (for example, whether the virtual projectile hits a target object in the virtual scene could be affected by the modified flight speed).
That is, the related art provides no effective solution that ensures the player can perceive the flight direction of a virtual projectile while maintaining the simulation accuracy of the virtual projectile's operating characteristics.
Disclosure of Invention
Embodiments of the present application provide a virtual projectile processing method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product, which can accurately simulate the operating characteristics of a virtual projectile while preserving the viewing experience of users who need to perceive the flight direction of the virtual projectile.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a processing method of a virtual emission object, which comprises the following steps:
displaying a virtual scene in a human-computer interaction interface;
determining a field of view of a second virtual object in the virtual scene in response to the first virtual object transmitting a virtual emission in the virtual scene; wherein the second virtual object is any object in the virtual scene that is distinct from the first virtual object;
mapping an overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of a screen in response to the three-dimensional motion trajectory of the virtual projectile overlapping the field of view, and
determining, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, the duration corresponding to that ratio within a perception duration, wherein the perception duration is the duration needed to perceive the flight direction of the virtual projectile on the screen;
and controlling the virtual projectile to fly along the two-dimensional motion trajectory, wherein the time of flight along the two-dimensional motion trajectory is not less than the duration.
The embodiment of the application provides a processing method of a virtual emission object, which comprises the following steps:
displaying a virtual scene in a human-computer interaction interface;
in response to a first virtual object launching a virtual projectile in the virtual scene and the three-dimensional motion trajectory of the virtual projectile overlapping the field of view of a second virtual object,
controlling the virtual projectile to fly along a two-dimensional motion trajectory obtained by mapping the overlapping portion of the three-dimensional motion trajectory and the field of view onto a screen, and
controlling the time of flight of the virtual projectile along the two-dimensional motion trajectory to be not less than a duration, wherein the duration is the duration corresponding, within a perception duration, to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, and the perception duration is the duration needed to perceive the flight direction of the virtual projectile on the screen.
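As an illustration of the duration rule above, the following minimal Python sketch computes the duration as the screen-size ratio applied to the perception duration and keeps the on-screen time of flight from falling below it. The function and parameter names, and the linear scaling itself, are assumptions made for illustration and are not taken from the application.

```python
# Minimal sketch of the duration rule described above; names and the linear
# scaling are illustrative assumptions, not the application's own code.

def duration_for_ratio(traj_len_px: float,
                       screen_size_px: float,
                       perception_duration_s: float) -> float:
    """Duration corresponding to the ratio of the 2D trajectory length to the
    screen size (e.g. the diagonal length), within the perception duration,
    i.e. the time needed to perceive a flight direction on the screen."""
    ratio = min(traj_len_px / screen_size_px, 1.0)  # cap at a full-screen trajectory
    return ratio * perception_duration_s


def time_of_flight_on_screen(traj_len_px: float,
                             simulated_speed_px_per_s: float,
                             duration_s: float) -> float:
    """Time of flight along the 2D trajectory, never less than the duration;
    the simulation speed used by the interaction logic is left unchanged."""
    simulated_time = traj_len_px / simulated_speed_px_per_s
    return max(simulated_time, duration_s)
```

For example, under these assumptions, if the mapped trajectory spans half of the screen diagonal and the perception duration is 0.4 s, the projectile is shown on screen for at least 0.2 s even when the simulated speed would cross that distance faster.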
The embodiment of the application provides a processing device of a virtual emission object, which comprises:
the display module is used for displaying the virtual scene in the man-machine interaction interface;
a determining module for determining a field of view of a second virtual object in the virtual scene in response to the first virtual object transmitting a virtual emission in the virtual scene; wherein the second virtual object is any object in the virtual scene that is distinct from the first virtual object;
a mapping module, configured to map, from a three-dimensional space of the virtual scene to a two-dimensional motion trajectory in a two-dimensional space of a screen, an overlapping portion in the three-dimensional motion trajectory in response to a three-dimensional motion trajectory of the virtual projectile overlapping the field of view;
the determining module is further configured to determine, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, the duration corresponding to that ratio within a perception duration, wherein the perception duration is the duration needed to perceive the flight direction of the virtual projectile on the screen;
and the control module is used for controlling the virtual projectile to fly along the two-dimensional motion trail, wherein the duration of the flight along the two-dimensional motion trail is not less than the duration.
The embodiment of the application provides a processing device of a virtual emission object, which comprises:
the display module is used for displaying the virtual scene in the man-machine interaction interface;
the control module is configured to: in response to a first virtual object launching a virtual projectile in the virtual scene, where the three-dimensional motion trajectory of the virtual projectile overlaps the field of view of a second virtual object, control the virtual projectile to fly along a two-dimensional motion trajectory obtained by mapping the overlapping portion of the three-dimensional motion trajectory and the field of view onto a screen, and control the time of flight of the virtual projectile along the two-dimensional motion trajectory to be not less than a duration, wherein the duration corresponds, within a perception duration, to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, and the perception duration is the duration needed to perceive the flight direction of the virtual projectile on the screen.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the processing method of the virtual emission object provided by the embodiment of the application when executing the executable instructions stored in the memory.
The embodiment of the application provides a computer readable storage medium which stores executable instructions for realizing the processing method of the virtual emission object provided by the embodiment of the application when being executed by a processor.
The embodiment of the application provides a computer program product, which comprises a computer program or instructions for realizing the processing method of the virtual emission object provided by the embodiment of the application when being executed by a processor.
The embodiment of the application has the following beneficial effects:
The flight speed of the virtual projectile is controlled from the viewing-angle level of the observer (that is, the player corresponding to the second virtual object). The observer can therefore clearly see the flight direction of the virtual projectile on the screen, while the simulation speed used by the interaction logic of the virtual scene to simulate the performance of the virtual projectile is not affected. Accurate simulation of the performance of the virtual projectile and the viewing experience of users who need to perceive its flight direction are thus both taken into account.
Drawings
FIG. 1A is a schematic diagram of a virtual projectile processing system 100 according to an embodiment of the application;
FIG. 1B is a schematic diagram of a virtual projectile processing system 101 according to an embodiment of the application;
Fig. 2 is a schematic structural diagram of a terminal device 400 according to an embodiment of the present application;
FIG. 3 is a flow chart of a method for processing a virtual emission object according to an embodiment of the present application;
fig. 4A is an application scenario schematic diagram of a method for processing a virtual emission object according to an embodiment of the present application;
FIG. 4B is a schematic diagram of mapping a three-dimensional space of a virtual scene to a two-dimensional space of a screen according to an embodiment of the present application;
fig. 5A and fig. 5B are schematic flow diagrams of a method for processing a virtual emission object according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a method for processing a virtual emission object according to an embodiment of the present application;
fig. 7 is a schematic view of an application scenario of a method for processing a virtual emission object provided by the related art;
FIG. 8 is a schematic diagram of a method for processing a virtual emission object according to an embodiment of the present application;
fig. 9 is an application scenario schematic diagram of a method for processing a virtual emission object according to an embodiment of the present application;
FIG. 10 is a flow chart of a method for processing virtual emitters according to an embodiment of the present application;
FIG. 11 is a frustum schematic of a virtual camera provided by an embodiment of the present application;
fig. 12 is a schematic diagram of a method for processing a virtual emission object according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a method for processing virtual emitters according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a transition from world space to screen space provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of a method for processing virtual emitters according to an embodiment of the present application;
fig. 16 is a schematic diagram of a method for processing a virtual emission object according to an embodiment of the present application.
Detailed Description
The present application will be further described in detail with reference to the accompanying drawings, for the purpose of making the objects, technical solutions and advantages of the present application more apparent, and the described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without making any inventive effort are within the scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
It will be appreciated that the embodiments of the present application may involve related data, such as a target test duration selected by a user (e.g., a tester) from a plurality of test durations. When the embodiments of the present application are applied to specific products or technologies, user permission or consent may be required, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
In the following description, the terms "first", "second", and so on are merely used to distinguish similar objects and do not denote a particular ordering of objects. It should be understood that, where permitted, "first", "second", and so on may be interchanged in a specific order or sequence so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein.
In the following description, the term "plurality" refers to at least two.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before describing embodiments of the present application in further detail, the terms and terminology involved in the embodiments of the present application will be described, and the terms and terminology involved in the embodiments of the present application will be used in the following explanation.
1) In response to: used to indicate the condition or state on which a performed operation depends. When the condition or state on which it depends is satisfied, the one or more performed operations may be carried out in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which multiple such operations are performed.
2) A virtual scene is a scene that an application program displays (or provides) when running on a terminal device. The scene may be a simulation environment for the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, a virtual scene may include sky, land, sea, etc., the land may include environmental elements of a desert, city, etc., and a user may control a virtual object to move in the virtual scene.
3) Virtual objects, images of various people and objects in a virtual scene that can interact, or movable objects in a virtual scene. The movable object may be a virtual character, a virtual animal, a cartoon character, etc., such as a character, an animal, etc., displayed in a virtual scene. The virtual object may be a virtual avatar in a virtual scene for representing a user. A virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene, occupying a portion of space in the virtual scene.
4) Three-dimensional (3D) space, also called world space, is the three-dimensional space of the entire virtual scene. A position in world space represents the real coordinates of an object in 3D space and is a three-dimensional coordinate, for example consisting of three mutually orthogonal coordinates X, Y, and Z.
5) Two-dimensional space, also called screen space, is the world seen on the screen. A position in screen space represents the coordinates at which an object is displayed on the screen and is a two-dimensional coordinate, for example comprising an abscissa and an ordinate.
6) Virtual camera: a virtual concept located in the 3D space of a virtual scene that simulates the position of the eyes of a virtual object.
7) View frustum, i.e., the cone-shaped volume in front of the virtual camera, representing the range of viewing angles of the virtual camera; an object is visible to the player only when it is within the view frustum.
The embodiment of the application provides a processing method, a processing device, electronic equipment, a computer readable storage medium and a computer program product for a virtual emission, which can accurately simulate the operation characteristics of the virtual emission and simultaneously give consideration to the viewing experience of users for perceiving the flight direction of the virtual emission. An exemplary application of the electronic device provided by the embodiment of the present application is described below, where the electronic device provided by the embodiment of the present application may be implemented as various types of terminal devices, or may be implemented cooperatively by the terminal device and a server.
The virtual projectile processing method provided by the embodiments of the present application is first described taking independent implementation by a terminal device as an example.
For example, referring to FIG. 1A, FIG. 1A is a schematic architecture diagram of a virtual projectile processing system 100 provided by an embodiment of the present application. As shown in FIG. 1A, a client 410 (e.g., a standalone game application) runs on the terminal device 400, and a virtual scene is displayed in the human-computer interaction interface of the client 410; the virtual scene includes a first virtual object (e.g., a virtual object A controlled by artificial intelligence) and a second virtual object (e.g., a virtual object B controlled by a real player). The client 410 responds to the first virtual object launching a virtual projectile in the virtual scene, where the virtual projectile may be a virtual bullet fired by a virtual weapon, or a virtual rocket launched by a virtual shoulder-fired rocket launcher (RPG, Rocket-Propelled Grenade) that flies on the reaction force provided by burning virtual propellant; the virtual projectile can fly in the virtual scene according to kinematic rules under the action of virtual gravity until it lands on the ground or meets an obstacle (such as a target object in the virtual scene), and the client determines the field of view of the second virtual object in the virtual scene. Then, in response to the three-dimensional motion trajectory of the virtual projectile overlapping the field of view of the second virtual object, the client 410 invokes the computing capability provided by the terminal device 400 to map the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of the screen, and determines, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, the duration corresponding to that ratio within the perception duration. The client 410 then controls the virtual projectile to fly along the two-dimensional motion trajectory, with a time of flight along the two-dimensional motion trajectory not less than that duration. By modifying the flight speed of the virtual projectile from the observer's perspective in this way, it is ensured that the player can clearly see the flight direction of the virtual projectile on the screen while the simulation speed used by the interaction logic of the virtual scene to simulate the performance of the virtual projectile is not affected.
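Purely as an illustration of the kinematic flight under virtual gravity mentioned above, a per-frame update might look like the toy Python sketch below; the gravity vector, the Euler integration scheme, and the names are assumptions, not the client's actual implementation.

```python
# Toy kinematic update of a virtual projectile under virtual gravity
# (simple Euler integration); values and names are illustrative assumptions.
VIRTUAL_GRAVITY = (0.0, -9.8, 0.0)

def advance_projectile(position, velocity, dt, gravity=VIRTUAL_GRAVITY):
    """Advance the projectile by one frame of dt seconds; the caller stops the
    loop when the projectile reaches the ground or collides with an obstacle."""
    velocity = tuple(v + g * dt for v, g in zip(velocity, gravity))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity
```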
The following description will continue taking the processing method of the virtual emission provided by the embodiment of the application implemented by the terminal device and the server cooperatively as an example.
For example, referring to fig. 1B, fig. 1B is a schematic architecture diagram of a virtual emission processing system 101 according to an embodiment of the present application, where, as shown in fig. 1B, the virtual emission processing system 101 includes: server 200, network 300, terminal device 500, and terminal device 600, wherein network 300 may be a local area network or a wide area network, or a combination of both.
As shown in fig. 1B, a client 510 and a client 610 (e.g., a web game application) are respectively run on the terminal device 500 and the terminal device 600, and a virtual scene is displayed in a man-machine interaction interface of the client 610, taking the client 610 as an example, where the virtual scene includes a first virtual object (e.g., a virtual object a controlled by player 1) and a second virtual object (e.g., a virtual object B controlled by player 2). Client 610 determines the field of view of virtual object B in the virtual scene in response to virtual object a under control of player 1 transmitting a virtual emission in the virtual scene. Next, when the client 610 determines that there is an overlap between the three-dimensional motion trajectory of the virtual emission object emitted by the virtual object a and the field of view of the virtual object B, a request is transmitted to the server 200 through the network 300 to cause the server 200 to map a portion of the three-dimensional motion trajectory of the virtual emission object, which overlaps with the field of view of the virtual object B, from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of the screen of the terminal device 600. Then, the server 200 may further determine a duration corresponding to the ratio among the perceived durations according to a ratio of the length of the two-dimensional motion trajectory to the size of the screen of the terminal device 600 (e.g., the length of a diagonal line of the screen, the height of the screen, the width of the screen, etc.), wherein the perceived duration is a duration in which the direction of flight of the virtual emission in the screen of the terminal device 600 can be perceived (e.g., the shortest flight duration). Finally, the server 200 may return the determined duration to the client 610, so that the client 610 controls the virtual projectile to fly along the two-dimensional motion trajectory, and controls the virtual projectile to fly along the two-dimensional motion trajectory for a duration not less than the duration, so that by modifying the flying speed of the virtual projectile in the screen of the terminal device 600 from the perspective of an observer (e.g., the player 2), it is ensured that the player 2 can see the flying direction of the virtual projectile in the screen while not affecting the simulation speed of the performance of the simulated virtual projectile used by the interaction logic of the virtual scene.
In some embodiments, the terminal device (such as the terminal device 400 shown in FIG. 1A) may also implement the virtual projectile processing method provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; a native application (APP), i.e., a program that must be installed in the operating system to run, such as a game application (corresponding to the client 410 above); an applet, i.e., a program that only needs to be downloaded into a browser environment to run; or an applet that can be embedded into any APP, such as an applet component embedded in an instant messaging application, where the applet component can be run or closed under user control. In general, the computer program described above may be an application, a module, or a plug-in in any form.
In other embodiments, the embodiments of the present application may also be implemented by means of cloud technology, which refers to a hosting technology that unifies a series of resources such as hardware, software, and networks in a wide area network or a local area network to realize the computation, storage, processing, and sharing of data.
Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, application technology, and the like based on the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support, because the background services of technical network systems require large amounts of computing and storage resources.
By way of example, the server 200 shown in fig. 1B may be an independent physical server, or may be a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN, content Delivery Network), and basic cloud computing services such as big data and artificial intelligence platforms, where the cloud services may be processing services of virtual emissions for a terminal device (such as the terminal device 500 or the terminal device 600 shown in fig. 1B) to call. The terminal device (e.g., terminal device 400 in fig. 1A or terminal device 500 and terminal device 600 in fig. 1B) may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart television, a smart watch, a vehicle-mounted terminal, etc. The terminal device and the server may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiment of the present application.
The following continues the description of the structure of the terminal device 400 shown in fig. 1A. Referring to fig. 2, fig. 2 is a schematic structural diagram of a terminal device 400 provided in an embodiment of the present application, and the terminal device 400 shown in fig. 2 includes: at least one processor 420, a memory 460, at least one network interface 430, and a user interface 440. The various components in terminal device 400 are coupled together by bus system 450. It is understood that bus system 450 is used to implement the connected communications between these components. The bus system 450 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various buses are labeled as bus system 450 in fig. 2.
The processor 420 may be an integrated circuit chip with signal processing capability, for example a general-purpose processor (such as a microprocessor or any conventional processor), a digital signal processor (DSP), another programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like.
The user interface 440 includes one or more output devices 441 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 440 also includes one or more input devices 442, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
Memory 460 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 460 optionally includes one or more storage devices physically remote from processor 420.
Memory 460 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a random access Memory (RAM, random Access Memory). The memory 460 described in embodiments of the present application is intended to comprise any suitable type of memory.
In some embodiments, memory 460 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 461 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
network communication module 462 for accessing other electronic devices via one or more (wired or wireless) network interfaces 430, the exemplary network interfaces 430 comprising: bluetooth, wireless compatibility authentication (WiFi), and universal serial bus (USB, universal Serial Bus), etc.;
A presentation module 463 for enabling presentation of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 441 (e.g., a display screen, speakers, etc.) associated with the user interface 440;
an input processing module 464 for detecting one or more user inputs or interactions from one of the one or more input devices 442 and translating the detected inputs or interactions.
In some embodiments, the virtual projectile processing apparatus provided by the embodiments of the present application may be implemented in software. FIG. 2 shows the virtual projectile processing apparatus 465 stored in the memory 460, which may be software in the form of a program, a plug-in, or the like, and includes the following software modules: a display module 4651, a determination module 4652, a mapping module 4653, a control module 4654, an acquisition module 4655, and a generation module 4656. These modules are logical and can therefore be combined or further split arbitrarily depending on the functions implemented. It should be noted that all of the above modules are shown at once in FIG. 2 for convenience of presentation, but this should not be taken to exclude implementations in which the virtual projectile processing apparatus 465 includes only the display module 4651, the determination module 4652, the mapping module 4653, and the control module 4654, or only the display module 4651 and the control module 4654. The functions of each module are described below.
As described above, the method for processing a virtual emission object provided by the embodiment of the present application may be implemented by various types of electronic devices. Referring to fig. 3, fig. 3 is a schematic flow chart of a method for processing a virtual emission object according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 3.
It should be noted that the method shown in fig. 3 may be executed by various forms of computer programs executed by the terminal device 400 shown in fig. 1A, and is not limited to the client 410 executed by the terminal device 400, but may also be the operating system 461, software modules, scripts and applets described above, so that the following examples of the client should not be considered as limiting the embodiments of the present application.
In step 101, a virtual scene is displayed in a human-machine interaction interface.
In some embodiments, the virtual scene may be an environment for interaction of game characters, for example, the game characters may fight in the virtual scene, and both parties may interact in the virtual scene by controlling actions of the game characters, so that a user can relax life pressure in the game process. For example, a client supporting a virtual scene is installed on a terminal device. The client may be any one of a First person shooter game (FPS, first-Person Shooting game), a Third person shooter game (TPS, third-Person Shooting game), a virtual reality application, a three-dimensional map program, or a multi-player gunfight survival game. When a user opens a client installed on the terminal device (for example, the user clicks an icon corresponding to a shooting game APP presented on a user interface of the terminal device), and the terminal device runs the client, a virtual scene may be displayed in a man-machine interaction interface of the client.
In addition, it should be noted that, in the method for processing a virtual emission object according to the embodiment of the present application, a virtual scene may be output based on a terminal device completely, or output based on cooperation between the terminal device and a server.
By way of example, take a game application whose client runs on the terminal device in standalone/offline mode. For a standalone game application, the computation of the relevant data of the virtual scene can rely entirely on the computing capability of the graphics processing hardware of the terminal device, where the types of graphics processing hardware include the central processing unit (CPU) and the graphics processing unit (GPU). For example, when forming the visual perception of the virtual scene, the terminal device computes the data required for display through the graphics computing hardware, completes the loading, parsing, and rendering of the display data, and outputs, on the graphics output hardware, video frames capable of forming the visual perception of the virtual scene, for example presenting two-dimensional video frames on the display screen of a smartphone, or projecting video frames that realize a three-dimensional display effect onto the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perception effect, the terminal device may also form one or more of auditory perception, tactile perception, motion perception, and gustatory perception by means of different hardware.
By way of example, take a game application whose client runs on the terminal device as a network version. For a network-version game application, the computation of the virtual scene can rely on the computing capability of a server, and the virtual scene is output in the human-computer interaction interface of the client running on the terminal device. Taking the formation of the visual perception of the virtual scene as an example, the server first computes the relevant display data of the virtual scene (such as scene data) and sends it to the terminal device through the network; the terminal device, relying on its graphics computing hardware, completes the loading, parsing, and rendering of the computed display data, and, relying on its graphics output hardware, outputs the virtual scene in the human-computer interaction interface of the client to form the visual perception, for example presenting two-dimensional video frames on the display screen of a smartphone, or projecting video frames that realize a three-dimensional display effect onto the lenses of augmented reality/virtual reality glasses. As for other forms of perception of the virtual scene, it can be understood that auditory perception may be formed by means of the corresponding hardware output of the terminal device, for example using a microphone, tactile perception may be formed using a vibrator, and so on.
In other embodiments, it may be possible in the man-machine interface of the client to display the virtual scene at a first-person perspective (e.g., playing a virtual object in the game at the user's own perspective); the virtual scene may be displayed with a third person viewing angle (for example, the user follows a virtual object in the game to play the game); the virtual scene can be displayed in a bird's eye view with a large viewing angle; wherein, the above-mentioned different visual angles can be arbitrarily switched.
By way of example, the virtual object may be an object controlled by the current user in the game; of course, other virtual objects may also be included in the virtual scene, such as virtual objects controlled by other users or by a robot program. The virtual objects may be divided into any of a plurality of teams; the teams may be in a hostile or cooperative relationship, and the teams in the virtual scene may include one or both of these relationships.
Taking display of the virtual scene from the first-person perspective as an example, the virtual scene displayed in the human-computer interaction interface may include: determining the field-of-view area of the virtual object according to the viewing position and the field angle of the virtual object in the complete virtual scene, and presenting the part of the complete virtual scene that lies within the field-of-view area; that is, the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. Because the first-person perspective is the viewing perspective that gives the user the greatest sense of impact, an immersive perception for the user during operation can be achieved.
Taking an example of displaying a virtual scene with a bird's eye view and a large viewing angle, the virtual scene displayed in the human-computer interaction interface may include: in response to a zoom operation for the panoramic virtual scene, a portion of the virtual scene corresponding to the zoom operation is presented in the human-machine interaction interface, i.e., the displayed virtual scene may be a portion of the virtual scene relative to the panoramic virtual scene. Therefore, the operability of the user in the operation process can be improved, and the efficiency of man-machine interaction can be improved.
In step 102, in response to the first virtual object emitting a virtual emission in the virtual scene, a field of view of the second virtual object in the virtual scene is determined.
In some embodiments, the client may determine the field of view of a second virtual object (e.g., a virtual object B controlled by player 2) in the virtual scene in response to a first virtual object (e.g., a virtual object A controlled by player 1) launching a virtual projectile in the virtual scene. For example, client 1, in response to a launch operation triggered by player 1 (such as receiving player 1's click on a fire button in the human-computer interaction interface), controls virtual object A to perform a shooting operation so that a virtual projectile (such as a virtual bullet or a virtual rocket) is launched from the virtual shooting prop held by virtual object A. The client may determine the field of view of the second virtual object in the virtual scene as follows: acquiring the orientation of the second virtual object in the virtual scene; and determining the field of view of the second virtual object in the virtual scene based on the visible angle range of a virtual camera, where the virtual camera is located at the eyes of the second virtual object and has the same orientation as the second virtual object.
Take the first virtual object as virtual object A controlled by player 1 and the second virtual object as virtual object B controlled by player 2 as an example. The client associated with player 2 (i.e., the client into which player 2 logs with player 2's account), in response to virtual object A controlled by player 1 launching a virtual projectile in the virtual scene (for example, player 1 controls virtual object A to fire a virtual bullet in the virtual scene), acquires the current orientation of virtual object B controlled by player 2 in the virtual scene. It then acquires the visible angle range of the virtual camera that is located at the eyes of virtual object B and has the same orientation as virtual object B (i.e., the range covered by the lens of the virtual camera), including the maximum visible angles in the four directions up, down, left, and right. As shown in FIG. 11, the virtual camera can represent the position of the eyes of the virtual object controlled by the player in the virtual scene; the range covered by the lens of the virtual camera represents the maximum angles seen up, down, left, and right by the player's view, forming a left cross section (Left Plane) 1101, a right cross section (Right Plane) 1102, an upper cross section (Top Plane) 1103, and a lower cross section (Bottom Plane) 1104. In addition, the nearest and farthest distances of the player's view can be added, forming a near cross section (Near Plane) 1105 and a far cross section (Far Plane) 1106. The volume enclosed by these six cross sections is called the frustum. The part of the virtual scene located within the frustum can be regarded as the field of view of virtual object B in the virtual scene; that is, only objects that appear within the frustum can be seen by player 2, and only the part of the virtual scene located inside the frustum of the virtual camera is displayed in the human-computer interaction interface of the client run by the terminal device associated with player 2. In this way the field of view of virtual object B in the virtual scene is determined; for example, the part of the virtual scene located within the visible angle range of the virtual camera can be determined as the field of view of virtual object B in the virtual scene (i.e., the part of the virtual scene that virtual object B can currently see).
The view of the second virtual object in the virtual scene is not limited to the frustum, but may be another three-dimensional shape such as a sphere or a cone in front of the virtual camera, which is not particularly limited in the embodiment of the present application.
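To make the frustum-based field of view concrete, the Python sketch below tests whether sampled points of a three-dimensional motion trajectory fall inside the six cross sections described above. The plane representation and the function names are assumptions used only for illustration; an engine may store and test the frustum differently.

```python
# Frustum overlap test sketch: each of the six cross sections (left, right,
# top, bottom, near, far) is assumed to be stored as an inward-facing plane
# (unit normal n, offset d), with n·p + d >= 0 meaning "inside".
from typing import Iterable, Sequence, Tuple

Plane = Tuple[Tuple[float, float, float], float]

def point_in_frustum(point: Sequence[float], planes: Iterable[Plane]) -> bool:
    """True if the point lies on the inner side of all six frustum planes."""
    return all(
        n[0] * point[0] + n[1] * point[1] + n[2] * point[2] + d >= 0.0
        for n, d in planes
    )

def trajectory_overlaps_view(trajectory: Iterable[Sequence[float]],
                             planes: Sequence[Plane]) -> bool:
    """The 3D motion trajectory overlaps the field of view if any sampled
    point of the trajectory lies inside the frustum."""
    return any(point_in_frustum(p, planes) for p in trajectory)
```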
In step 103, in response to the three-dimensional motion trajectory of the virtual projectile overlapping the field of view, an overlapping portion in the three-dimensional motion trajectory is mapped from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of the screen.
In some embodiments, the client may determine the three-dimensional motion trajectory of the virtual projectile as follows: determining the position of the launch part (for example, the muzzle of a virtual weapon) of the virtual shooting prop (for example, a virtual weapon or a virtual rocket launcher) held by the first virtual object as the starting point of the three-dimensional motion trajectory of the virtual projectile; generating a detection ray extending along the shooting direction, with the launch position of the virtual shooting prop as its origin; determining the position in the virtual scene at which the detection ray collides (for example, the landing point of the virtual projectile under the action of virtual gravity, or the point at which the virtual projectile meets an obstacle in the virtual scene, e.g., the point at which it hits a target object, in which case the hit point can be determined as the end point of the three-dimensional motion trajectory) as the end point of the three-dimensional motion trajectory of the virtual projectile; and determining the curve between the starting point and the end point (which may be, for example, a straight line or a parabola) as the three-dimensional motion trajectory of the virtual projectile.
For example, referring to FIG. 4A, FIG. 4A is a schematic diagram of an application scenario of the virtual projectile processing method provided by an embodiment of the present application. As shown in FIG. 4A, a first virtual object 402 (for example, a virtual object A controlled by player 1) and a virtual weapon 403 held by the first virtual object 402 are displayed in the human-computer interaction interface 401. Upon receiving player 1's click on the fire button displayed in the human-computer interaction interface 401, the first virtual object 402 performs a shooting operation so that the virtual weapon 403 fires a virtual bullet. The position 404 of the muzzle of the virtual weapon 403 held by the first virtual object 402 can be determined as the starting point of the three-dimensional motion trajectory of the virtual bullet. Meanwhile, assuming that the virtual bullet fired by the virtual weapon 403 under the control of the first virtual object 402 hits a third virtual object 405 in the virtual scene (for example, a virtual object C controlled by artificial intelligence), the hit position 406 on the third virtual object 405 can be determined as the end point of the three-dimensional motion trajectory of the virtual bullet. The parabola 407 between the starting point and the end point can then be determined as the three-dimensional motion trajectory of the virtual bullet (i.e., the virtual bullet will subsequently fly from the starting point 404 along the parabola 407 to the end point 406).
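A minimal sketch of the start-point/end-point determination described above follows; here `raycast` stands for a hypothetical scene query (not a real engine API) that returns the first collision point along a ray, or None if nothing is hit within range.

```python
def three_dimensional_trajectory(muzzle_position, firing_direction, raycast,
                                 max_range=1000.0):
    """Return (start, end) of the virtual projectile's 3D motion trajectory.

    muzzle_position:  position of the launch part of the virtual shooting prop
    firing_direction: unit vector of the shooting direction
    raycast:          hypothetical callback returning the first hit point or None
    """
    start = muzzle_position
    hit = raycast(origin=muzzle_position, direction=firing_direction,
                  max_distance=max_range)
    if hit is not None:
        end = hit  # obstacle / target hit point, e.g. position 406 in FIG. 4A
    else:
        # no collision: take a far point along the detection ray as the end point
        end = tuple(m + d * max_range
                    for m, d in zip(muzzle_position, firing_direction))
    return start, end  # the trajectory is the curve (line or parabola) between them
```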
In other embodiments, when the client detects that the three-dimensional motion trajectory of the virtual emission object emitted by the first virtual object overlaps with the field of view of the second virtual object (i.e., the virtual emission object emitted by the first virtual object in the virtual scene appears in the field of view of the second virtual object), the mapping of the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of the screen may be implemented as follows: acquiring the start point and the end point of the overlapping portion of the three-dimensional motion trajectory; multiplying the coordinates of the start point and the end point of the overlapping portion in the three-dimensional space of the virtual scene by the spatial transformation and projection matrices of the virtual camera (for example, a virtual camera that is located at the eyes of the second virtual object and has the same orientation as the second virtual object), respectively, to obtain the coordinates of the start point and the end point of the two-dimensional motion trajectory in the two-dimensional space of the screen; and determining the line between the start point and the end point of the two-dimensional motion trajectory as the two-dimensional motion trajectory of the virtual emission object in the two-dimensional space of the screen.
For example, referring to fig. 4B, fig. 4B is a schematic diagram of mapping from the three-dimensional space of the virtual scene to the two-dimensional space of the screen. As shown in fig. 4B, taking the first virtual object as a virtual object A controlled by player 1 and the second virtual object as a virtual object B controlled by player 2 as an example, when the client associated with player 2 detects that the three-dimensional motion trajectory of the virtual emission object emitted by virtual object A in the virtual scene (i.e., the line segment SE shown in fig. 4B) overlaps with the field of view of virtual object B (i.e., the trapezoid composed of broken lines shown in fig. 4B; for clarity of representation, the frustum is drawn as a two-dimensional trapezoid in fig. 4B), the client may first acquire the start point (i.e., point A shown in fig. 4B) and the end point (i.e., point B shown in fig. 4B) of the overlapping portion of the three-dimensional motion trajectory. The client may then multiply the coordinates of the start point (i.e., point A) and the end point (i.e., point B) of the overlapping portion in the three-dimensional space of the virtual scene (these are three-dimensional coordinates; for example, assuming the coordinates of point A in the virtual scene are (x1, y1, z1) and the coordinates of point B are (x2, y2, z2), they represent the positions of the start point and the end point of the overlapping portion in the 3D world of the virtual scene, respectively) by the spatial transformation and projection matrices of the virtual camera (e.g., a matrix M), to obtain the start point of the two-dimensional motion trajectory (i.e., point A1 shown in fig. 4B, the point obtained by mapping the start point of the overlapping portion from the three-dimensional space of the virtual scene to the two-dimensional space of the screen) and its end point (i.e., point B1 shown in fig. 4B, the point obtained by mapping the end point of the overlapping portion from the three-dimensional space of the virtual scene to the two-dimensional space of the screen) in the two-dimensional space of the screen (these are two-dimensional coordinates; for example, the coordinates of point A1 in the screen are (u1, v1) and the coordinates of point B1 are (u2, v2), representing the positions at which the start point and the end point of the two-dimensional motion trajectory are displayed on the screen, respectively). The line between the start point and the end point of the two-dimensional motion trajectory (i.e., the line between point A1 and point B1 shown in fig. 4B) may then be determined as the two-dimensional motion trajectory of the virtual emission object in the two-dimensional space of the screen; that is, the virtual emission object emitted by virtual object A will enter the view of player 2 from point A1 on the screen and leave the view of player 2 from point B1.
The spatial transformation and projection matrix of the virtual camera is described below.
In some embodiments, the spatial transformation and projection matrices of the virtual camera may include four parts: Model, View, Projection, and Viewport. For the three-dimensional space of the virtual scene, positioning of the model, positioning of the viewing angle, and projection deformation are generally performed jointly through the Model, View, and Projection matrices, which is quite different from the two-dimensional space, where positioning can be performed with only a pixel coordinate on the screen. Since the Model, View, and Projection matrices are 4×4 matrices, i.e., they operate on four-dimensional vectors, when mapping the coordinates of an object from the three-dimensional space of the virtual scene to the two-dimensional space of the screen, the coordinates (x, y, z) of the object must first be supplemented with a w component to become (x, y, z, w). The three matrices are then multiplied with this vector in turn, the resulting four-dimensional vector is subjected to homogeneous division (i.e., all components are divided by the w component), and finally the x and y components are multiplied by the Viewport to obtain the coordinates of the object on the screen.
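As a concrete illustration of the pipeline just described, the following Python sketch maps a world-space point to screen pixels; the matrix conventions, the [-1, 1] normalized device coordinate range, and the y-axis flip are common assumptions rather than values taken from the embodiment.

    import numpy as np

    def world_to_screen(p_world, model, view, proj, screen_w, screen_h):
        """Map a point (x, y, z) in the 3D virtual scene to pixel coordinates (u, v)."""
        p = np.array([*p_world, 1.0])                 # supplement the w component
        clip = proj @ view @ model @ p                # apply the three 4x4 matrices in turn
        ndc = clip[:3] / clip[3]                      # homogeneous division by w
        u = (ndc[0] * 0.5 + 0.5) * screen_w           # viewport mapping of x to pixels
        v = (1.0 - (ndc[1] * 0.5 + 0.5)) * screen_h   # y flipped into screen coordinates
        return u, v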
In addition, it should be noted that, in the embodiment of the present application, the motion track (including the three-dimensional motion track and the two-dimensional motion track) is a vector, that is, has a start point and an end point, and the direction is from the start point to the end point.
In other embodiments, continuing from the above, the start point and the end point of the overlapping portion between the three-dimensional motion trajectory of the virtual emission object and the field of view of the second virtual object may be obtained as follows. For the case where the first virtual object emits the virtual emission object outside the field of view of the second virtual object, as shown in fig. 12, in response to the start point of the three-dimensional motion trajectory of the virtual emission object (i.e., point S shown in fig. 12) being located outside the field of view of the second virtual object (i.e., the trapezoid composed of broken lines shown in fig. 12), the incident point of the three-dimensional motion trajectory (i.e., the line segment SE shown in fig. 12) into the field of view (i.e., point A shown in fig. 12) is determined as the start point of the overlapping portion, and the exit point of the three-dimensional motion trajectory from the field of view (i.e., point B shown in fig. 12) is determined as the end point of the overlapping portion. Further, for the case where the first virtual object emits the virtual emission object within the field of view of the second virtual object, as shown in fig. 13, in response to the start point (i.e., point S shown in fig. 13) of the three-dimensional motion trajectory of the virtual emission object (i.e., the line segment SE shown in fig. 13) being located within the field of view of the second virtual object, the start point of the three-dimensional motion trajectory is determined as the start point of the overlapping portion (in this case the start point of the overlapping portion, i.e., point A shown in fig. 13, coincides with point S), and the exit point of the three-dimensional motion trajectory from the field of view (i.e., point B shown in fig. 13) is determined as the end point of the overlapping portion.
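A simplified sketch of this case analysis is given below; the frustum.contains and frustum.clip_segment helpers are assumed for illustration and would correspond to point-containment and segment-clipping routines of the engine in use.

    def overlap_endpoints(traj_start, traj_end, frustum):
        """Return the start and end of the overlap between the trajectory SE and the field of view."""
        entry, exit_ = frustum.clip_segment(traj_start, traj_end)  # points A and B in figs. 12/13
        if frustum.contains(traj_start):
            # Fired inside the field of view: the overlap starts at S itself (fig. 13).
            return traj_start, exit_
        # Fired outside the field of view: the overlap starts at the incident point (fig. 12).
        return entry, exit_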
In some embodiments, as shown in fig. 12, the client may further generate a virtual projectile (e.g., a virtual bullet or a virtual rocket) at the start point of the three-dimensional motion trajectory (i.e., point S shown in fig. 12), and then, in response to the start point of the three-dimensional motion trajectory not coinciding with the start point of the overlapping portion (i.e., the first virtual object emits the virtual projectile outside the field of view of the second virtual object), control the virtual projectile to jump from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion, i.e., control the generated virtual projectile to jump directly from point S to point A shown in fig. 12. Since the flight from point S to point A takes place outside the player's screen (i.e., the virtual projectile is not visible from point S to point A), this portion of the flight does not need to be displayed, and the virtual projectile can therefore be controlled to jump directly from point S to point A to save system resources of the terminal device.
Taking the first virtual object as a virtual object A controlled by player 1 and the second virtual object as a virtual object B controlled by player 2 as an example, after receiving the shooting operation triggered by player 1, the client may generate the virtual emission object at the start point of its three-dimensional motion trajectory (for example, the position of the emitting part of the virtual shooting prop held by virtual object A). Then, when the client detects that the start point of the three-dimensional motion trajectory does not coincide with the start point of the overlapping portion (that is, virtual object A emits the virtual emission object outside the field of view of virtual object B), since the virtual emission object cannot be seen by player 2 during its off-screen flight, the client may control the virtual emission object emitted by virtual object A to jump directly from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion, so that system resources of the terminal device and the server can be saved without affecting the game experience of player 2.
In other embodiments, before controlling the virtual emission object to jump from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion, the following processing may also be performed: controlling the virtual emission object to stay at the start point of the three-dimensional motion trajectory for a duration corresponding to at least one frame. For example, assuming that the virtual emission object is generated at the start point of the three-dimensional motion trajectory in the 3rd frame, the virtual emission object may be controlled to remain at the start point of the three-dimensional motion trajectory in the 4th frame (that is, the position of the virtual emission object does not change in the 4th frame), so that the effect of the virtual emission object being fired in the virtual scene can be rendered more fully, improving the user's game experience.
It should be noted that, after the duration of the stay of the virtual projectile at the start point of the three-dimensional motion trajectory reaches the duration threshold (for example, the duration corresponding to one frame), the step of controlling the virtual projectile to jump from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion may be performed.
In step 104, a duration of a corresponding proportion of the perceived duration is determined according to a proportion of the length of the two-dimensional motion trajectory to the size of the screen.
Here, the perceived duration is a duration (e.g., shortest duration) during which the player can perceive the flight direction of the virtual emission in the screen.
In some embodiments, referring to fig. 5A, fig. 5A is a flowchart of a method for processing a virtual emission object according to an embodiment of the present application, as shown in fig. 5A, step 104 shown in fig. 3 may be implemented by steps 1041 to 1043 shown in fig. 5A, and will be described with reference to the steps shown in fig. 5A.
In step 1041, a size of the screen and a perceived duration corresponding to the size of the screen are acquired.
Here, the perceived duration may be positively correlated with the size of the screen, i.e., the larger the size of the screen, the longer the corresponding perceived duration, e.g., when the size of the screen is 6.5 inches, the corresponding perceived duration is 0.2 seconds; when the screen size is 8 inches, the corresponding perceived time period is 0.25 seconds.
In some embodiments, corresponding perceived durations may be configured in advance for screens of different sizes, and the different sizes together with the perceived duration corresponding to each size may be stored in a database; after the size of the screen of the current terminal device is obtained, the perceived duration corresponding to that size can then be queried from the database.
The perceived duration is an optimal value obtained by testing with a plurality of testers, and differs according to the characteristics of the group (such as eyesight and response sensitivity). The testing process for the perceived duration is described below.
In some embodiments, the perceived duration for each size of screen may be determined by: setting a plurality of corresponding test time periods for each size, distributing the test time periods to a plurality of testers for testing, obtaining target test time periods selected by each tester in the test time periods, calculating average values of the target test time periods selected by the testers, and determining the calculated average values as sensing time periods corresponding to the screens of the sizes.
For example, taking a mobile phone with a 6.5-inch screen (for example, a screen length of 14.39 cm) as an example, a plurality of test durations may be preset for that size. Assuming that a total of 5 test durations are set, namely 0.05 seconds, 0.1 seconds, 0.15 seconds, 0.2 seconds and 0.25 seconds (where each test duration is the time for which the virtual projectile is controlled to fly across the whole screen from left to right), each tester then selects, from the 5 test durations, the duration at which the flight direction of the virtual projectile can be seen clearly. Assuming that tester A selects 0.15 seconds (i.e., tester A considers that the flight direction of the virtual projectile can be seen clearly when it crosses the screen in 0.15 seconds), tester B selects 0.25 seconds, and testers C and D each select 0.2 seconds, the average of the test durations selected by the four testers, i.e., (0.15+0.2+0.2+0.25)/4 = 0.2 seconds, may be determined as the perceived duration corresponding to the 6.5-inch mobile phone (i.e., for a 6.5-inch mobile phone, when the virtual projectile flies across the screen from left to right in 0.2 seconds, the player is able to perceive its flight direction in the screen). That is, a duration of 0.2 seconds is perceived as best by the testers: the flight direction of the virtual projectile in the screen can be seen clearly, and the sense of speed of the virtual projectile is still conveyed. If the duration is longer (e.g., 0.3 seconds), the virtual projectile appears slow and lacks impact; if the duration is shorter (e.g., 0.1 seconds), the flight speed is so high that the tester only sees a line and cannot tell the flight direction of the virtual projectile.
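As a tiny illustration, the averaging step described above can be written as follows, using the example values from the text.

    selected = [0.15, 0.2, 0.2, 0.25]                    # durations picked by testers A-D (seconds)
    perceived_duration = sum(selected) / len(selected)   # -> 0.2 s for the 6.5-inch screen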
It should be noted that, the size of the screen may also be the width of the screen, the length of the diagonal line, etc., and for other sizes, the corresponding sensing duration may also be configured for the screen through the above method, which is not described herein again in the embodiments of the present application.
In step 1042, a ratio of the length of the two-dimensional motion trajectory to the size of the screen is determined.
In some embodiments, after the size of the screen is obtained, the ratio of the length of the two-dimensional motion trajectory of the virtual emission object in the two-dimensional space of the screen to the size of the screen may be determined. For example, taking fig. 8 as an example, point B shown in fig. 8 is the start point of the two-dimensional motion trajectory and point A is its end point; assuming that the distance between A and B is X pixels and the width of the screen is W pixels, the ratio of the length of the two-dimensional motion trajectory to the size of the screen is X/W.
In step 1043, the result of multiplying the proportion by the perceived duration is determined as the duration.
In some embodiments, after the perceived duration corresponding to the size of the screen and the ratio of the length of the two-dimensional motion trajectory to the size of the screen are obtained, the result of multiplying the calculated ratio and perceived duration may be determined as the duration. For example, assuming that the perceived duration corresponding to the size of the screen acquired in step 1041 is T, and assuming that the ratio of the length of the two-dimensional motion trajectory calculated in step 1042 to the size of the screen is X/W, the duration t1= (X/W) ×t.
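A minimal sketch of steps 1041 to 1043 follows; the example values in the usage line are illustrative only.

    def visible_duration(traj_len_px, screen_size_px, perceived_duration_s):
        """Duration for which the projectile should stay visible: (X / W) * T."""
        return (traj_len_px / screen_size_px) * perceived_duration_s

    # e.g. a 540-pixel on-screen trajectory on a 1080-pixel-wide screen with a
    # 0.2 s perceived duration gives 0.1 s of visible flight.
    t1 = visible_duration(540, 1080, 0.2)   # -> 0.1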
In step 105, the virtual projectile is controlled to fly along a two-dimensional trajectory.
Here, the duration for which the virtual projectile flies along the two-dimensional motion trajectory is not less than the duration determined in step 104. For example, when the client controls the virtual emission object to fly along the two-dimensional motion trajectory, the client may control the time of that flight to be no less than the determined duration, thereby guaranteeing the visible flight time of the virtual emission object in the screen so that the player can clearly see its flight direction.
In some embodiments, controlling the virtual projectile to fly along the two-dimensional motion trajectory may be implemented as follows: controlling the virtual projectile to fly along the two-dimensional motion trajectory at a uniform speed; or controlling the virtual projectile to fly along the two-dimensional motion trajectory at a variable speed (e.g., with a gradually decreasing flight speed).
Taking a virtual projectile as an example, the client can control the virtual projectile to fly along the two-dimensional motion track in a uniform manner; of course, the client may also control the virtual bullets to fly along the two-dimensional motion trajectories in a variable speed manner (e.g., a manner in which the flight speed is getting slower due to drag). It should be noted that, in order to ensure the visible flight duration of the virtual bullet in the screen, the duration of the client in controlling the virtual bullet to fly along the two-dimensional motion track is not less than the duration no matter what flight mode is.
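The two flight modes mentioned above could be realized, for example, with a per-frame update such as the following sketch; the drag factor is an illustrative assumption, not a value from the embodiment.

    def step_position(pos_px, dir_unit, speed_px_s, dt, drag=0.0):
        """Advance the on-screen projectile by one frame; drag=0 gives uniform flight,
        drag>0 gives a gradually decreasing speed."""
        new_speed = speed_px_s * (1.0 - drag * dt)
        new_pos = (pos_px[0] + dir_unit[0] * new_speed * dt,
                   pos_px[1] + dir_unit[1] * new_speed * dt)
        return new_pos, new_speed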
The following describes a specific procedure for controlling the virtual projectile to fly along the two-dimensional motion trajectory in a uniform manner.
In other embodiments, referring to fig. 5B, fig. 5B is a flowchart of a method for processing a virtual emission object according to an embodiment of the present application, as shown in fig. 5B, before executing step 105 shown in fig. 3, steps 106 and 107 shown in fig. 5B may also be executed, and will be described in connection with the steps shown in fig. 5B.
In step 106, the flight speed of the virtual emission object in the three-dimensional space of the virtual scene is determined according to the length of the overlapping portion and the duration.
In some embodiments, after the duration is obtained in step 104, the flight speed of the virtual emission object in the three-dimensional space of the virtual scene may be determined from the length of the overlapping portion in the three-dimensional space of the virtual scene and the duration; for example, the result of dividing the length of the overlapping portion by the duration may be determined as the flight speed of the virtual emission object in the three-dimensional space of the virtual scene.
It should be noted that, because the sensing duration is obtained through testing, and the duration is obtained by converting the sensing duration according to the proportion, if the virtual projectile flies according to the flight speed recalculated by using the duration, it can be ensured that the player can sense the flight direction of the virtual projectile.
In step 107, the flight speed of the virtual emission in the three-dimensional space of the virtual scene is mapped to the flight speed of the virtual emission in the two-dimensional space of the screen.
In some embodiments, the mapping of the flight speed of the virtual emission object in the three-dimensional space of the virtual scene to its flight speed in the two-dimensional space of the screen may be implemented as follows: multiplying the flight speed of the virtual emission object in the three-dimensional space of the virtual scene by the spatial transformation and projection matrices of the virtual camera to obtain the flight speed of the virtual emission object in the two-dimensional space of the screen, and then controlling the virtual emission object to fly along the two-dimensional motion trajectory at the mapped flight speed in the two-dimensional space of the screen.
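The following sketch illustrates steps 106 and 107 under the simplifying assumption that the on-screen speed is derived from the already-mapped endpoints A1 and B1 rather than by transforming the velocity vector with the camera matrices.

    import math

    def flight_speeds(overlap_len_world, duration_s, a1, b1):
        """Speed in the 3D scene (step 106) and the corresponding on-screen speed (step 107)."""
        v_world = overlap_len_world / duration_s      # length of the overlap / duration
        v_screen = math.dist(a1, b1) / duration_s     # pixel length of A1B1 / duration
        return v_world, v_screen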
It should be noted that the flight speed in the two-dimensional space of the screen is only used to show the flight direction of the virtual projectile from the perspective of a third party (e.g., the player corresponding to the second virtual object). It is different from the simulation speed used in the interaction logic of the virtual scene to calculate the hit effect of the virtual projectile, which is based on the real speed of the corresponding real projectile (e.g., a bullet) in the real world, so that accurate simulation of the behavior of the virtual projectile can be achieved. That is, in the interaction logic, whether the virtual projectile can hit a target object in the virtual scene, and when it hits the target object, are still computed based on the simulation speed. In other words, the scheme provided by the embodiment of the present application uses two independent speed mechanisms at the logic level and at the observer's visual level: on the one hand, the perceived duration ensures that the player can clearly see the flight direction of the virtual emission object; on the other hand, only the speed of the portion of the three-dimensional motion trajectory that overlaps the field of view is adjusted, without affecting the simulation speed used in the interaction logic to calculate the target of the virtual projectile and whether it hits (i.e., the two speeds are different), thereby ensuring accurate simulation of the behavior of the virtual projectile.
In some embodiments, the client may also perform the following: and controlling the virtual emission object to jump from the end point of the overlapped part to the end point of the three-dimensional motion track after the virtual emission object reaches the end point of the overlapped part in response to the end point of the three-dimensional motion track not coinciding with the end point of the overlapped part. For example, taking fig. 12 as an example, since the virtual emission is not visible to the player during the off-screen flight, it is not necessary to perform the presentation, and after the virtual emission reaches the point B (i.e., the end point of the overlapping portion) shown in fig. 12, the virtual emission can be controlled to jump from the point B directly to the point E (i.e., the end point of the three-dimensional motion trajectory of the virtual emission), so that the system resources of the server and the terminal device can be saved without affecting the game experience of the player.
In other embodiments, after controlling the virtual emission to jump from the end point of the overlapping portion to the end point of the three-dimensional motion trajectory, the following process may be further performed: controlling the virtual emission object to stay at the end point of the three-dimensional motion track for a period corresponding to at least one frame; and canceling displaying the virtual emission in the virtual scene in response to the duration of the stay of the virtual emission at the end point of the three-dimensional motion trail being greater than a duration threshold. For example, taking fig. 12 as an example, after controlling the virtual emission to jump from point B (i.e., the end point of the overlapping portion) to point E (i.e., the end point of the three-dimensional motion trajectory), the virtual emission may be controlled to stay at point E for a period of time corresponding to at least one frame, so as to increase the effect of the virtual emission hitting the target object. In addition, when the stay time of the virtual emission at point E is longer than the time threshold, the display of the virtual emission in the virtual scene may be canceled (i.e., the virtual emission is reclaimed so as to avoid the long-time display of the virtual emission from interfering with the player).
The virtual emission processing method provided by the embodiment of the present application uses two independent speed mechanisms at the logic level and at the observer's visual level: on the one hand, the perceived duration ensures that the player can clearly see the flight direction of the virtual emission object; on the other hand, only the speed of the portion of the three-dimensional motion trajectory that overlaps the field of view is adjusted, without affecting the simulation speed used in the interaction logic to calculate the target of the virtual projectile and whether it hits (i.e., the two speeds are different), thereby ensuring accurate simulation of the behavior of the virtual projectile.
In the following, an exemplary application of the embodiment of the present application in an actual application scenario is described using a virtual scenario as an example of a game.
In shooting games, the client receives the firing information of enemies (such as other virtual objects in a hostile camp in the game) notified by the server, which generally includes the start point S, the end point E (i.e., the hit point) and the flight speed V of the virtual bullet. The client then creates a virtual bullet (i.e., a bullet special effect) at the start point S and makes it fly from the start point S to the end point E at the speed V, for a total flight time of Distance(S, E)/V, i.e., the distance between the start point S and the end point E divided by the speed V.
For example, referring to fig. 6, fig. 6 is a schematic diagram of a virtual projectile processing method according to an embodiment of the present application. As shown in fig. 6, a virtual object 602 controlled by player 1 and a virtual object 603 controlled by player 2 are displayed in a virtual scene 601. If a virtual bullet launched by virtual object 602 under the control of player 1 hits the virtual object 604 in the virtual scene 601, the virtual bullet flies from a start point (i.e., point S shown in fig. 6) to an end point (e.g., point E on the virtual object 604). Assume that the distance from the start point S to the end point E is 1000 meters and the flight speed of the virtual bullet is 2000 meters per second, so the virtual bullet requires a total flight time of 0.5 seconds. The range enclosed by the dotted lines in fig. 6 represents the field of view of player 2 in the virtual scene 601; that is, the virtual bullet launched by virtual object 602 will enter the field of view of player 2 from point B and then pass out of it from point A. Assuming that the distance between point B and point A is 50 meters, the virtual bullet stays within the field of view of player 2 for only 0.025 seconds. This time is so short that, as shown in fig. 7, player 2 can only see a tail 702 of the bullet special effect in screen 701, i.e., a line, or only the smoke left by the virtual bullet in flight, and cannot tell at all whether the virtual bullet flew from point B to point A or from point A to point B.
In view of this, the embodiment of the application provides a method for processing virtual shots, which solves the problem that only one trailing ghost can be seen in a screen, but the flight direction of a virtual bullet cannot be seen because the flight speed of the virtual shots (such as virtual bullets) sent by others is too high in shooting games. Specifically, the embodiment of the application achieves the aim of enabling a player to clearly see the flight direction of the virtual bullet by recalculating the starting point, the finishing point and the flight speed of the trajectory of the virtual bullet and ensuring the visible flight time of the virtual bullet in a screen.
The method for processing the virtual emission object provided by the embodiment of the application is specifically described below.
In some embodiments, it is assumed that if the virtual bullet takes 0.2 seconds or more (corresponding to the perceived duration described above) to fly from left to right across a screen with a pixel width W, the player is able to see the flight direction of the virtual bullet. The situation of fig. 6 can then be represented from the player's perspective and converted into fig. 8. As shown in fig. 8, the black box in the figure represents the player's screen; the virtual bullet enters the screen from point B at the edge of the screen and exits the screen from point A. Let the length of BA be X pixels and the total width of the screen be W pixels. It can then be determined that the virtual bullet should stay in the screen for (X/W) × 0.2 seconds. Once this time is obtained, the speed at which the virtual bullet is expected to fly can be found using the real distance from point B to point A in the actual 3D space of the game (for example, assuming 50 meters).
The following describes how the 0.2-second value is determined.
In some embodiments, for a screen with a pixel width W, the test duration may be increased gradually starting from 0.016 seconds (i.e., the duration of one frame at 60 frames per second), for example from 0.016 seconds to 0.032 seconds, then from 0.032 seconds to 0.048 seconds, and so on, and the candidates may then be given to testers (e.g., multiple experimenters) for testing. In the end, most experimenters found that a time of about 0.2 seconds felt best: the flight direction of the virtual bullet in the screen can be seen clearly, and the sense of speed of the virtual bullet is still conveyed. If the time is too long, the virtual bullet appears slow and without impact, even though its flight direction can be seen clearly. If the time is too short, the flight speed of the virtual bullet is too high and the player can only see a line in the screen, so the flight direction cannot be seen clearly.
Furthermore, it should be noted that the flying speed of the virtual bullet calculated in the above manner is far less than the real speed. In order to avoid the overlong flight time of the virtual bullet, the starting point of the virtual bullet can be directly changed from the point S to the point B, and the end point is changed from the point E to the point A. After all, the virtual bullets are not visible to the player during off-screen flights, nor are they necessarily displayed.
For example, referring to fig. 9, fig. 9 is a schematic view of an application scenario of a virtual projectile processing method according to an embodiment of the present application, as shown in fig. 9, compared to a solution provided by a related art (for example, a tail 702 of a special effect of a bullet shown in fig. 7), a virtual bullet has a significantly slower speed when flying across a screen of a player, and a player can visually see a track 902 and a direction of the virtual bullet flying in a man-machine interface 901 (for example, the player can clearly see that the virtual bullet flies from left to right). In addition, the virtual bullet will move instantaneously to the actual end point after flying off the player's screen.
Referring to fig. 10, fig. 10 is a schematic flow chart of a method for processing a virtual emission object according to an embodiment of the present application, and the steps shown in fig. 10 will be described.
In step 201, a frustum of a virtual camera is acquired.
In some embodiments, referring to fig. 11, fig. 11 is a frustum schematic diagram of a virtual camera provided in an embodiment of the present application. As shown in fig. 11, the virtual camera on the left represents the position of the eyes of the game character controlled by the player in the 3D world. The maximum up-down and left-right angles of the player's field of view form the left, right, top, and bottom clipping planes, and the nearest and farthest visible distances of the field of view form the near and far clipping planes. The polyhedron enclosed by these six planes is called the frustum, and only objects that appear inside the frustum can be seen by the player.
In step 202, the intersection of the frustum with the bullet trajectory is calculated.
In some embodiments, after the current frustum of the virtual camera is acquired, the intersection of the frustum with the trajectory of the virtual bullet (corresponding to the three-dimensional motion trajectory described above) may be calculated. As shown in fig. 12, assume that the actual starting position of the trajectory of the virtual bullet is point S and the end point is point E, and that the intersection points of the trajectory with the frustum are point A (i.e., the incident point) and point B (i.e., the exit point), respectively. For clarity of representation, the frustum is drawn in two dimensions (2D) in fig. 12: the virtual bullet is launched from point S to point E, and its intersection points with the frustum are point A and point B. That is, if the virtual camera remains stationary, the virtual bullet will enter the player's field of view at point A and then sweep out of the player's field of view from point B.
In other embodiments, referring to fig. 13, fig. 13 is a schematic diagram of a virtual projectile processing method according to an embodiment of the application. Fig. 13 represents a special case: as shown in fig. 13, the virtual bullet is launched from point S to point E, but point S is itself inside the frustum, that is, the virtual bullet is launched within the field of view of the player. In this case, the origin of the virtual bullet (i.e., point S) and the intersection of the trajectory with the frustum (i.e., point A) coincide.
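For illustration, the intersection of steps 201 and 202 can be computed by clipping the segment SE against the six frustum planes, for example with a standard Liang-Barsky-style clip as sketched below; the plane representation (a normal and an offset with "inside" meaning normal·p + offset >= 0) is an assumption.

    import numpy as np

    def clip_segment_to_frustum(s, e, planes):
        """Return the entry and exit points (A, B) of segment SE inside the frustum,
        or None if the segment does not intersect the frustum."""
        s, e = np.asarray(s, dtype=float), np.asarray(e, dtype=float)
        d = e - s
        t0, t1 = 0.0, 1.0
        for normal, offset in planes:
            n = np.asarray(normal, dtype=float)
            denom = float(n @ d)
            num = float(n @ s) + offset
            if abs(denom) < 1e-9:
                if num < 0.0:
                    return None          # parallel to this plane and fully outside
                continue
            t = -num / denom
            if denom > 0.0:
                t0 = max(t0, t)          # entering the inside half-space
            else:
                t1 = min(t1, t)          # leaving the inside half-space
            if t0 > t1:
                return None              # the segment misses the frustum
        return s + t0 * d, s + t1 * d    # points A (entry) and B (exit)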
In step 203, it is determined whether the trajectory and the frustum intersect; if they do not intersect, step 204 is performed; if they intersect, step 205 is performed.
In step 204, the flow is exited.
In some embodiments, if the trajectory of the virtual bullet does not intersect the current frustum of the virtual camera (i.e., the virtual bullet never appears in the player's field of view), no bullet special effect is generated and the process exits.
In step 205, the intersection points of the trajectory and the frustum are mapped to screen space.
In some embodiments, after the intersection points of the trajectory of the virtual bullet with the frustum (i.e., points A and B) are calculated in step 202, points A and B may be mapped to the player's screen space and denoted as points A1 and B1, respectively.
By way of example, referring to fig. 14, fig. 14 is a schematic diagram illustrating the conversion from world space to screen space according to an embodiment of the present application. As shown in fig. 14, by multiplying the position of an object in world space by the spatial transformation and projection matrix M of the virtual camera, the position {u, v} at which the object is finally presented on the screen can be obtained, where u represents the abscissa and v represents the ordinate. That is, point A1 in screen space is obtained by multiplying point A by the spatial transformation and projection matrix M of the virtual camera, and point B1 in screen space is obtained by multiplying point B by the same matrices. For example, fig. 15 represents the result of mapping fig. 12 into screen space, and fig. 16 represents the result of mapping fig. 13 into screen space.
In step 206, the actual time of flight T1 of the virtual bullet is calculated from the pixel distance between the points in screen space where the virtual bullet appears and disappears.
In some embodiments, after points A and B are mapped to the player's screen space to obtain the corresponding points A1 and B1, the pixel distance between points A1 and B1 in screen space may first be calculated, and the time for which the virtual bullet needs to stay in the screen may then be calculated from the actual pixel width of the screen. For example, assuming that the virtual bullet must take a total of T seconds to fly from left to right across a screen with a pixel width W in order for its flight direction to be visible, the time for which the virtual bullet actually needs to fly is T1 seconds, where T1 = |A1B1| / W × T and |A1B1| represents the pixel distance between points A1 and B1.
In step 207, the virtual bullet actual flight velocity V is calculated using T1 and the intersection of the bullet trajectory and the frustum.
In some embodiments, after the time T1 for which the virtual bullet actually needs to stay in the screen is calculated, the flight speed V of the virtual bullet may be calculated back from T1 and the distance between points A and B in world space, where V = |AB| / T1 and |AB| represents the true distance from point A to point B in 3D space.
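A minimal sketch of the calculations in steps 206 and 207, using the symbols from the text (T and W for the perceived duration and screen pixel width, A1/B1 for the screen-space intersection points, A/B for their world-space counterparts):

    import math

    def actual_flight_time(a1, b1, screen_w_px, T):
        return math.dist(a1, b1) / screen_w_px * T     # T1 = |A1B1| / W * T

    def display_speed(a, b, t1):
        return math.dist(a, b) / t1                    # V = |AB| / T1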
In step 208, the virtual bullet is generated at the virtual bullet's actual origin S.
In some embodiments, in the first frame, the virtual bullet may be generated at point S.
In step 209, it is determined whether point S coincides with the intersection point A of the trajectory and the frustum; if not, step 210 is performed; if so, step 211 is performed.
In step 210, the virtual bullet waits in place for one frame and is then made to appear at the intersection point A2 of the trajectory with the current frame's frustum.
In some embodiments, as shown in fig. 12, when point S does not coincide with the intersection point of the trajectory and the frustum (i.e., point A), after the virtual bullet is generated at point S it may wait in place for one frame, and the intersection points A2 and B2 between the trajectory segment SE and the frustum of the virtual camera in the current frame may be recalculated, where A2 is the first intersection point (i.e., the incident point) and B2 is the second intersection point (i.e., the exit point); the virtual bullet generated at point S in step 208 may then be moved directly to point A2.
In step 211, the virtual bullet is allowed to start traveling in the original path direction at a velocity V.
In some embodiments, after the virtual bullet has been moved to the intersection point of the trajectory and the frustum (e.g., point A shown in fig. 12, or point A shown in fig. 13), the virtual bullet may be moved along the original path at the velocity V calculated in step 207.
In step 212, it is determined whether the virtual bullet has already exited the frustum; if not, step 213 is performed; if so, step 214 is performed.
In some embodiments, it may be calculated whether the virtual bullet has currently gone out of the frustum of the virtual camera (e.g., whether point B shown in fig. 12 or 13 has been reached) or whether endpoint E has been reached, and if so, jump to step 214; otherwise, the process goes to step 213.
In step 213, one frame is advanced.
In some embodiments, when the virtual bullet does not exit the frustum of the virtual camera, a frame may be advanced and the process may be resumed to step 212.
In step 214, the virtual bullet is moved directly to end point E and stopped for one frame.
In some embodiments, after the virtual bullet has exited the frustum, the virtual bullet may be moved instantaneously to the end point E and stopped at the end point E for one frame.
In step 215, the virtual bullets are recovered.
In some embodiments, after the duration of the virtual bullet stay at the endpoint E exceeds a duration threshold (e.g., 1 second), the virtual bullet-related resources may be reclaimed, thereby canceling the display of the virtual bullet in the game screen.
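The per-frame control of steps 208 to 215 can be condensed into a sketch such as the following; the bullet and frame-timing helpers (spawn_at, teleport, advance, reached, recycle, wait) are hypothetical names used only for illustration.

    def drive_bullet(bullet, S, E, A, B, V, dt, frames):
        """Condensed per-frame control corresponding to steps 208-215."""
        bullet.spawn_at(S)                 # step 208: generate at the real origin S
        if S != A:                         # step 209: fired outside the field of view?
            frames.wait(1)                 # step 210: hold one frame ...
            bullet.teleport(A)             # ... then jump to the frustum entry point
        while not bullet.reached(B) and not bullet.reached(E):
            bullet.advance(V * dt)         # steps 211-213: fly along the original path at V
            frames.wait(1)
        bullet.teleport(E)                 # step 214: jump to the true end point E
        frames.wait(1)                     # stay at E for one frame
        frames.wait_seconds(1.0)           # step 215: recycle after the duration threshold
        bullet.recycle()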
In other embodiments, the player may also be helped to clearly determine the direction of the virtual bullet's flight by displaying a direction indicator (e.g., an arrow indicating the direction of the virtual bullet's flight) directly on the trajectory of the virtual bullet.
The following describes the beneficial effects of the virtual emission processing method according to the embodiment of the present application with reference to fig. 6. As shown in fig. 6, after the method provided by the embodiment of the present application is applied in a shooting game, whether the target object is hit is still calculated according to the simulation speed of the virtual bullet, and the hit effect is displayed synchronously for the shooter (i.e., the virtual object 602 shown in fig. 6) and the hit party (i.e., the virtual object 604 shown in fig. 6). For an observer (e.g., the virtual object 603 controlled by player 2), however, the presentation of its field of view is independent of the shooter and the hit party; that is, the trajectory of the virtual bullet in the screen of player 2 is not perfectly synchronized with that of player 1, so the flight direction of the virtual bullet can be presented at a different speed. In addition, by jumping twice (i.e., the virtual bullet jumps directly from point S to point B before entering the field of view of player 2, and jumps directly from point A to point E after leaving the field of view of player 2, which is equivalent to compressing the total travel time of the virtual bullet as much as possible into the field of view of player 2), the time at which player 2 perceives the virtual bullet flying out of the field of view is kept consistent with the time at which the shooter and the hit party perceive the virtual bullet hitting the target object (i.e., virtual object 604). Thus, while the travel characteristics of the virtual bullet are accurately simulated, the viewing experience in which player 2 perceives the flight direction of the virtual bullet is also guaranteed.
Continuing with the description below of an exemplary architecture of the virtual emission processing apparatus 465 implemented as software modules provided by embodiments of the present application, in some embodiments, as shown in FIG. 2, the software modules stored in the virtual emission processing apparatus 465 of the memory 460 may include: a display module 4651, a determination module 4652, a mapping module 4653, and a control module 4654.
A display module 4651 for displaying a virtual scene in the human-computer interaction interface; a determination module 4652 for determining a field of view of a second virtual object in the virtual scene in response to a first virtual object emitting a virtual emission object in the virtual scene, wherein the second virtual object is any object in the virtual scene distinct from the first virtual object; a mapping module 4653 for mapping, in response to the three-dimensional motion trajectory of the virtual projectile overlapping the field of view, the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of the screen; the determining module 4652 is further configured to determine a duration corresponding to the proportion of the perceived duration according to the proportion of the length of the two-dimensional motion trajectory to the size of the screen, where the perceived duration is a duration in which the flight direction of the virtual emission object in the screen can be perceived; and a control module 4654 configured to control the virtual projectile to fly along the two-dimensional motion trajectory, where the duration of the flight along the two-dimensional motion trajectory is not less than the determined duration.
In some embodiments, the processing device 465 of the virtual emission further includes an acquisition module 4655 for acquiring an orientation of the second virtual object in the virtual scene; the determining module 4652 is further configured to determine, based on a range of angles of visibility of the virtual camera, a field of view of the second virtual object in the virtual scene, where the virtual camera is located at an eye of the second virtual object and is oriented in accordance with the second virtual object.
In some embodiments, the acquiring module 4655 is further configured to acquire a start point and an end point of the overlapping portion of the three-dimensional motion trajectory; the mapping module 4653 is further configured to multiply the coordinates of the start point and the end point of the overlapping portion in the three-dimensional space of the virtual scene by the spatial transformation and projection matrices of the virtual camera, respectively, to obtain the coordinates of the start point and the end point of the two-dimensional motion trajectory in the two-dimensional space of the screen; and the determining module 4652 is further configured to determine the line between the start point and the end point of the two-dimensional motion trajectory as the two-dimensional motion trajectory of the virtual projectile in the two-dimensional space of the screen.
In some embodiments, the determining module 4652 is further configured to determine, in response to the start point of the three-dimensional motion profile of the virtual emission being within the field of view, a start point of the three-dimensional motion profile as the start point of the overlapping portion, and an exit point of the three-dimensional motion profile in the field of view as the end point of the overlapping portion; and determining an incident point of the three-dimensional motion trajectory in the field of view as a start point of the overlapping portion and an exit point of the three-dimensional motion trajectory in the field of view as an end point of the overlapping portion in response to the start point of the three-dimensional motion trajectory of the virtual projectile being located outside the field of view.
In some embodiments, the processing device 465 of the virtual emission object further includes a generating module 4656 for generating the virtual emission object at a start point of a three-dimensional motion trajectory of the virtual emission object; the control module 4654 is further configured to control the virtual projectile to jump from the start of the three-dimensional motion trajectory to the start of the overlap in response to the start of the three-dimensional motion trajectory not coinciding with the start of the overlap.
In some embodiments, the control module 4654 is further configured to control the virtual projectile to dwell at the start of the three-dimensional motion trajectory for a duration corresponding to at least one frame.
In some embodiments, the control module 4654 is further configured to control the virtual projectile to jump from the endpoint of the overlap to the endpoint of the three-dimensional motion profile after the virtual projectile reaches the endpoint of the overlap in response to the endpoint of the three-dimensional motion profile not coinciding with the endpoint of the overlap.
In some embodiments, the control module 4654 is further configured to control the virtual projectile to stay at the end of the three-dimensional motion profile for a duration corresponding to at least one frame; the display module 4651 is further configured to cancel displaying the virtual emission in the virtual scene in response to a duration of the virtual emission stay at the end point of the three-dimensional motion trajectory being greater than a duration threshold.
In some embodiments, the obtaining module 4655 is further configured to obtain a size of the screen and a perceived duration corresponding to the size of the screen, where the perceived duration is positively correlated to the size of the screen; a determining module 4652, further configured to determine a ratio of a length of the two-dimensional motion trajectory to a size of the screen; and determining the multiplication result of the proportion and the perception duration as the duration.
In some embodiments, the control module 4654 is further configured to control the virtual projectile to fly along the two-dimensional trajectory in a uniform manner; alternatively, the virtual projectile is controlled to fly along a two-dimensional trajectory in a variable speed manner.
In some embodiments, the determining module 4652 is further configured to determine the three-dimensional motion trajectory of the virtual emission object by: determining the position of the emitting part of the virtual shooting prop held by the first virtual object as the start point of the three-dimensional motion trajectory of the virtual emission object; generating a detection ray extending along the shooting direction with the emitting position of the virtual shooting prop as its origin; determining the position in the virtual scene at which the detection ray collides as the end point of the three-dimensional motion trajectory of the virtual emission object; and determining a curve between the start point and the end point of the three-dimensional motion trajectory as the three-dimensional motion trajectory of the virtual emission object.
In some embodiments, the determining module 4652 is further configured to determine a flight speed of the virtual emission in a three-dimensional space of the virtual scene based on the length and duration of the overlapping portion; the mapping module 4653 is further configured to map a flight speed of the virtual emission in a three-dimensional space of the virtual scene to a flight speed of the virtual emission in a two-dimensional space of the screen; the control module 4654 is further configured to control the virtual projectile to fly along the two-dimensional motion trajectory according to a flying speed in the two-dimensional space.
Continuing with the description below of an exemplary architecture of the virtual emission processing apparatus 465 implemented as software modules provided by embodiments of the present application, in some embodiments, as shown in FIG. 2, the software modules stored in the virtual emission processing apparatus 465 of the memory 460 may include: a display module 4651 and a control module 4654.
A display module 4651 for displaying a virtual scene in the human-computer interaction interface; the control module 4654 is configured, in response to a first virtual object emitting a virtual emission object in the virtual scene whose three-dimensional motion trajectory overlaps with the field of view of a second virtual object, to control the virtual emission object to fly along a two-dimensional motion trajectory, where the two-dimensional motion trajectory is obtained by mapping the overlapping portion of the three-dimensional motion trajectory and the field of view to the screen, and to control the duration of the flight along the two-dimensional motion trajectory to be not less than a duration, where the duration corresponds, within the perceived duration, to the proportion of the length of the two-dimensional motion trajectory to the size of the screen, and the perceived duration is a duration in which the flight direction of the virtual emission object in the screen can be perceived.
In some embodiments, the control module 4654 is further configured to control the virtual projectile to fly along the two-dimensional trajectory in a uniform manner; alternatively, the virtual projectile is controlled to fly along a two-dimensional trajectory in a variable speed manner.
In some embodiments, the processing device 465 of the virtual emission object further includes a generating module 4656 for generating the virtual emission object at a start point of a three-dimensional motion trajectory of the virtual emission object; the control module 4654 is further configured to control the virtual projectile to jump from the start point of the three-dimensional motion profile to the start point of the overlap in response to the start point of the three-dimensional motion profile being outside the field of view, and to control the virtual projectile to jump from the end point of the overlap to the end point of the three-dimensional motion profile after the virtual projectile reaches the end point of the overlap in response to the end point of the three-dimensional motion profile not coinciding with the end point of the overlap.
It should be noted that, the description of the apparatus according to the embodiment of the present application is similar to the description of the embodiment of the method described above, and has similar beneficial effects as the embodiment of the method, so that a detailed description is omitted. The technical details of the processing device for virtual emission provided in the embodiment of the present application may be understood according to the description of any one of fig. 3, fig. 5A, or fig. 5B.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions (i.e., executable instructions) stored in a computer readable storage medium. The processor of the electronic device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the electronic device executes the method for processing the virtual emission object according to the embodiment of the application.
Embodiments of the present application provide a computer readable storage medium having stored therein executable instructions that, when executed by a processor, cause the processor to perform a method provided by embodiments of the present application, for example, a method of processing a virtual emission as illustrated in any one of fig. 3, 5A, or 5B.
In some embodiments, the computer-readable storage medium may be an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM; it may also be any device including one of, or any combination of, the above memories.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system; they may be stored as part of a file that holds other programs or data, for example in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code).
As an example, executable instructions may be deployed to be executed on one electronic device or on multiple electronic devices located at one site or, alternatively, on multiple electronic devices distributed across multiple sites and interconnected by a communication network.
The foregoing describes merely exemplary embodiments of the present application and is not intended to limit the scope of protection of the present application. Any modification, equivalent replacement, or improvement made within the spirit and scope of the present application shall fall within the protection scope of the present application.
Claims (20)
1. A method of processing a virtual emission, the method comprising:
displaying a virtual scene in a human-computer interaction interface;
determining a field of view of a second virtual object in the virtual scene in response to the first virtual object emitting a virtual emission object in the virtual scene, wherein the second virtual object is any object in the virtual scene distinct from the first virtual object;

mapping the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of a screen in response to the three-dimensional motion trajectory of the virtual emission object overlapping the field of view;

determining, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, a duration corresponding to the ratio within a perception duration, wherein the perception duration is the duration within which the flight direction of the virtual emission object on the screen can be perceived; and

controlling the virtual emission object to fly along the two-dimensional motion trajectory, wherein the duration of the flight along the two-dimensional motion trajectory is not less than the duration.
2. The method of claim 1, wherein determining the field of view of the second virtual object in the virtual scene comprises:

acquiring the orientation of the second virtual object in the virtual scene; and

determining the field of view of the second virtual object in the virtual scene based on the viewing angle range of a virtual camera, wherein the virtual camera is located at the eyes of the second virtual object and shares the orientation of the second virtual object.
3. The method of claim 1, wherein mapping the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to the two-dimensional motion trajectory in the two-dimensional space of the screen comprises:

acquiring the start point and the end point of the overlapping portion of the three-dimensional motion trajectory;

multiplying the coordinates of the start point and the end point of the overlapping portion in the three-dimensional space of the virtual scene by the spatial transformation matrix and the projection matrix of the virtual camera, respectively, to obtain the coordinates of the start point and the end point of the two-dimensional motion trajectory in the two-dimensional space of the screen; and

determining the line connecting the start point and the end point of the two-dimensional motion trajectory as the two-dimensional motion trajectory of the virtual emission object in the two-dimensional space of the screen.
4. The method of claim 3, wherein acquiring the start point and the end point of the overlapping portion of the three-dimensional motion trajectory comprises:

determining the start point of the three-dimensional motion trajectory as the start point of the overlapping portion, and the exit point of the three-dimensional motion trajectory from the field of view as the end point of the overlapping portion, in response to the start point of the three-dimensional motion trajectory of the virtual emission object being located within the field of view; and

determining the entry point of the three-dimensional motion trajectory into the field of view as the start point of the overlapping portion, and the exit point of the three-dimensional motion trajectory from the field of view as the end point of the overlapping portion, in response to the start point of the three-dimensional motion trajectory of the virtual emission object being located outside the field of view.
5. The method according to claim 1, wherein the method further comprises:
generating the virtual emission object at the start point of the three-dimensional motion trajectory of the virtual emission object; and

controlling the virtual emission object to jump from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion in response to the start point of the three-dimensional motion trajectory not coinciding with the start point of the overlapping portion.
6. The method of claim 5, wherein, prior to controlling the virtual emission object to jump from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion, the method further comprises:

controlling the virtual emission object to stay at the start point of the three-dimensional motion trajectory for a duration corresponding to at least one frame.
7. The method according to claim 1, wherein the method further comprises:
in response to the end point of the three-dimensional motion trajectory not coinciding with the end point of the overlapping portion, controlling the virtual emission object to jump from the end point of the overlapping portion to the end point of the three-dimensional motion trajectory after the virtual emission object reaches the end point of the overlapping portion.
8. The method of claim 7, wherein, after controlling the virtual emission object to jump from the end point of the overlapping portion to the end point of the three-dimensional motion trajectory, the method further comprises:

controlling the virtual emission object to stay at the end point of the three-dimensional motion trajectory for a duration corresponding to at least one frame; and

canceling display of the virtual emission object in the virtual scene in response to the duration for which the virtual emission object stays at the end point of the three-dimensional motion trajectory being greater than a duration threshold.
9. The method of claim 1, wherein determining, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, the duration corresponding to the ratio within the perception duration comprises:

acquiring the size of the screen and a perception duration corresponding to the size of the screen, wherein the perception duration is positively correlated with the size of the screen;

determining the ratio of the length of the two-dimensional motion trajectory to the size of the screen; and

determining the product of the ratio and the perception duration as the duration.
10. The method of claim 1, wherein controlling the virtual emission object to fly along the two-dimensional motion trajectory comprises:

controlling the virtual emission object to fly along the two-dimensional motion trajectory at a uniform speed; or controlling the virtual emission object to fly along the two-dimensional motion trajectory at a variable speed.
11. The method according to claim 1, wherein the method further comprises:
determining the three-dimensional motion trajectory of the virtual emission object by:

determining the position of the firing part of the virtual shooting prop held by the first virtual object as the start point of the three-dimensional motion trajectory of the virtual emission object;

generating a detection ray extending along the firing direction, with the firing position of the virtual shooting prop as its origin;

determining the position in the virtual scene at which the detection ray collides as the end point of the three-dimensional motion trajectory of the virtual emission object; and

determining the curve between the start point and the end point of the three-dimensional motion trajectory as the three-dimensional motion trajectory of the virtual emission object.
12. The method of claim 1, wherein, prior to controlling the virtual emission object to fly along the two-dimensional motion trajectory, the method further comprises:

determining the flight speed of the virtual emission object in the three-dimensional space of the virtual scene according to the length of the overlapping portion and the duration; and

mapping the flight speed of the virtual emission object in the three-dimensional space of the virtual scene to a flight speed of the virtual emission object in the two-dimensional space of the screen;

wherein controlling the virtual emission object to fly along the two-dimensional motion trajectory comprises:

controlling the virtual emission object to fly along the two-dimensional motion trajectory according to the flight speed in the two-dimensional space.
13. A method of processing a virtual emission, the method comprising:
displaying a virtual scene in a human-computer interaction interface;
in response to a first virtual object emitting a virtual emission object in the virtual scene, wherein the three-dimensional motion trajectory of the virtual emission object overlaps the field of view of a second virtual object:

controlling the virtual emission object to fly along a two-dimensional motion trajectory obtained by mapping the overlapping portion of the three-dimensional motion trajectory and the field of view onto a screen; and

controlling the duration for which the virtual emission object flies along the two-dimensional motion trajectory to be not less than a duration, wherein the duration corresponds, within a perception duration, to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, and the perception duration is the duration within which the flight direction of the virtual emission object on the screen can be perceived.
14. The method of claim 13, wherein controlling the virtual emission object to fly along the two-dimensional motion trajectory comprises:

controlling the virtual emission object to fly along the two-dimensional motion trajectory at a uniform speed; or controlling the virtual emission object to fly along the two-dimensional motion trajectory at a variable speed.
15. The method of claim 13, wherein the method further comprises:
generating the virtual emission object at the start point of the three-dimensional motion trajectory of the virtual emission object;

controlling the virtual emission object to jump from the start point of the three-dimensional motion trajectory to the start point of the overlapping portion in response to the start point of the three-dimensional motion trajectory being outside the field of view; and

in response to the end point of the three-dimensional motion trajectory not coinciding with the end point of the overlapping portion, controlling the virtual emission object to jump from the end point of the overlapping portion to the end point of the three-dimensional motion trajectory after the virtual emission object reaches the end point of the overlapping portion.
16. A virtual projectile processing apparatus, the apparatus comprising:
a display module, configured to display a virtual scene in a human-computer interaction interface;

a determining module, configured to determine a field of view of a second virtual object in the virtual scene in response to a first virtual object emitting a virtual projectile in the virtual scene, wherein the second virtual object is any object in the virtual scene distinct from the first virtual object;

a mapping module, configured to map, in response to the three-dimensional motion trajectory of the virtual projectile overlapping the field of view, the overlapping portion of the three-dimensional motion trajectory from the three-dimensional space of the virtual scene to a two-dimensional motion trajectory in the two-dimensional space of a screen;

the determining module is further configured to determine, according to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, a duration corresponding to the ratio within a perception duration, wherein the perception duration is the duration within which the flight direction of the virtual projectile on the screen can be perceived; and

a control module, configured to control the virtual projectile to fly along the two-dimensional motion trajectory, wherein the duration of the flight along the two-dimensional motion trajectory is not less than the duration.
17. A virtual projectile processing apparatus, the apparatus comprising:
a display module, configured to display a virtual scene in a human-computer interaction interface; and

a control module, configured to: in response to a first virtual object emitting a virtual projectile in the virtual scene, wherein the three-dimensional motion trajectory of the virtual projectile overlaps the field of view of a second virtual object, control the virtual projectile to fly along a two-dimensional motion trajectory obtained by mapping the overlapping portion of the three-dimensional motion trajectory and the field of view onto a screen, and control the duration for which the virtual projectile flies along the two-dimensional motion trajectory to be not less than a duration, wherein the duration corresponds, within a perception duration, to the ratio of the length of the two-dimensional motion trajectory to the size of the screen, and the perception duration is the duration within which the flight direction of the virtual projectile on the screen can be perceived.
18. An electronic device, the electronic device comprising:
a memory for storing executable instructions;
a processor for implementing the method of processing a virtual emission according to any one of claims 1 to 12, or any one of claims 13 to 15, when executing executable instructions stored in said memory.
19. A computer readable storage medium storing executable instructions for implementing the method of processing a virtual emission according to any one of claims 1 to 12 or any one of claims 13 to 15 when executed by a processor.
20. A computer program product comprising a computer program or instructions which, when executed by a processor, implements the method of processing a virtual emission as claimed in any one of claims 1 to 12, or any one of claims 13 to 15.
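For readers who want a concrete picture of the screen-space mapping in claim 3 and the trajectory end-point determination in claim 11, the following sketch is a minimal illustration rather than the claimed implementation; the column-vector and OpenGL-style NDC conventions, the numpy usage, and the raycast placeholder are assumptions of the example.

```python
import numpy as np

def world_to_screen(point_3d, view_matrix, proj_matrix, screen_w, screen_h):
    """Project a world-space point to pixel coordinates with the observing
    camera: multiply by the spatial transformation (view) matrix and the
    projection matrix, then map normalized device coordinates to the screen.
    Column-vector, OpenGL-style conventions are assumed here.
    """
    p = np.array([*point_3d, 1.0])
    clip = proj_matrix @ (view_matrix @ p)   # view transform, then projection
    ndc = clip[:3] / clip[3]                 # perspective divide -> [-1, 1]
    x = (ndc[0] + 1.0) * 0.5 * screen_w
    y = (1.0 - ndc[1]) * 0.5 * screen_h      # flip Y: screen origin at the top-left
    return x, y

def two_dimensional_trajectory(overlap_start, overlap_end, view, proj, w, h):
    """The 2D motion trajectory is the line connecting the projected start and
    end points of the overlapping portion."""
    return (world_to_screen(overlap_start, view, proj, w, h),
            world_to_screen(overlap_end, view, proj, w, h))

def trajectory_end_point(fire_position, fire_direction, raycast):
    """End point of the 3D trajectory: cast a detection ray from the firing
    position along the firing direction and take the first collision in the
    scene. `raycast` stands in for the host engine's ray query."""
    hit = raycast(origin=fire_position, direction=fire_direction)
    return hit.position if hit else None
```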
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210491849.8A CN117046110A (en) | 2022-05-07 | 2022-05-07 | Virtual emission processing method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117046110A (en) | 2023-11-14
Family
ID=88652344
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210491849.8A Pending CN117046110A (en) | 2022-05-07 | 2022-05-07 | Virtual emission processing method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117046110A (en) |
- 2022-05-07: CN application CN202210491849.8A filed (published as CN117046110A); legal status: pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |