CN111282275B - Method, device, equipment and storage medium for displaying collision traces in a virtual scene

Info

Publication number: CN111282275B
Application number: CN202010151742.XA
Authority: CN (China)
Prior art keywords: virtual, collision, throwing prop, prop, scene
Other languages: Chinese (zh)
Other versions: CN111282275A
Inventor: 郭畅
Assignee (current and original): Tencent Technology (Shenzhen) Co., Ltd.
Legal status: Active (granted)

Events:
Application filed by Tencent Technology (Shenzhen) Co., Ltd.; priority to CN202010151742.XA
Publication of application CN111282275A
Application granted; publication of grant CN111282275B

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 13/57 — Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F 13/577 — Simulating properties, behaviour or motion of objects using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A63F 13/58 — Controlling game characters or game objects by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
    • A63F 13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 — Shooting of targets
    • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 — Methods for processing data by generating or executing the game program
    • A63F 2300/64 — Methods for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F 2300/646 — Methods for calculating the trajectory of an object
    • A63F 2300/65 — Methods for computing the condition of a game character
    • A63F 2300/80 — Features specially adapted for executing a specific type of game
    • A63F 2300/8076 — Shooting
    • A63F 2300/8082 — Virtual reality

Abstract

The application discloses a method, a device, equipment and a storage medium for displaying collision traces in a virtual scene, and relates to the technical field of virtual scenes. The method includes: displaying a scene picture of a virtual scene, the scene picture being a picture before a virtual throwing prop equipped by a virtual object is thrown; in response to the virtual object throwing the virtual throwing prop, acquiring a collision point between the thrown virtual throwing prop and a virtual obstacle in the virtual scene; and displaying a collision trace on the virtual obstacle at the collision point. By displaying a corresponding collision trace at the point where the virtual throwing prop collides with a virtual obstacle in the virtual scene, the scheme lets the user quickly determine the drop point of the virtual throwing prop from the trace, which greatly shortens the time the user spends searching for the prop, greatly shortens the virtual shooting game time, and saves the power of the terminal.

Description

Method, device, equipment and storage medium for displaying collision traces in a virtual scene
Technical Field
The embodiment of the application relates to the field of virtual environments, in particular to a method, a device, equipment and a storage medium for displaying collision traces in a virtual scene.
Background
In a virtual shooting game, a player can control a virtual object through a terminal and use virtual props to fight against other players or against virtual objects controlled by artificial intelligence (AI).
A virtual shooting game usually provides virtual throwing props, and a player controlling a virtual object in battle can attack with them. For example, in one possible implementation, at least one virtual control for setting the throwing direction and triggering the throw is overlaid on the game interface; the player first adjusts the throwing direction of the virtual throwing prop and then throws it by triggering the throwing control. After the virtual object has thrown a virtual throwing prop, it can be controlled to pick the thrown prop up again.
In the related art, when a player needs to pick up a thrown virtual throwing prop, the player must estimate its drop point from the throwing direction and then search near that point. This process consumes a large amount of the player's time, which prolongs the virtual shooting game and wastes the power of the terminal.
Disclosure of Invention
The embodiment of the application provides a method, a device, equipment and a storage medium for displaying collision traces in a virtual scene, which can reduce the time players spend retrieving thrown virtual props and save the power of the terminal. The technical scheme is as follows:
in one aspect, a method for displaying collision traces in a virtual scene is provided, and the method includes:
displaying a scene picture of a virtual scene, wherein the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
in response to the virtual object throwing the virtual throwing prop, acquiring a collision point between the thrown virtual throwing prop and a virtual obstacle in the virtual scene;
and displaying a collision trace on the virtual obstacle at the collision point.
In one aspect, a method for displaying collision traces in a virtual scene is provided, and the method includes:
displaying a first scene picture of a virtual scene, wherein the first scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
in response to a throwing operation on the virtual throwing prop, displaying a second scene picture of the virtual scene, wherein the second scene picture is a picture of the virtual throwing prop moving after being thrown;
in response to the virtual thrown prop colliding with a virtual obstacle in the virtual scene, showing a collision trace at a corresponding collision point on the virtual obstacle.
In another aspect, an apparatus for displaying collision traces in a virtual scene is provided, the apparatus comprising:
the picture display module is used for displaying a scene picture of a virtual scene, wherein the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
a collision point obtaining module, configured to acquire, in response to the virtual object throwing the virtual throwing prop, a collision point between the thrown virtual throwing prop and a virtual obstacle in the virtual scene;
and the trace display module is used for displaying a collision trace on the virtual obstacle at the collision point.
In an exemplary aspect, the apparatus further includes:
the material obtaining module is used for obtaining the material of the virtual barrier before the trace display module displays the collision trace on the virtual barrier corresponding to the collision point;
the shape determining module is used for determining the shape of the collision trace according to the material of the virtual barrier;
and the trace display module is used for displaying the collision traces on the virtual barrier corresponding to the collision points according to the shapes of the collision traces.
In an exemplary scheme, the shape determining module is configured to determine the shape of the collision trace according to the material of the virtual obstacle and the prop type of the virtual throwing prop.
In an exemplary aspect, the apparatus further includes:
the speed obtaining module is used for obtaining the speed of the virtual throwing prop when the virtual throwing prop collides with the virtual barrier before the trace display module displays the collision trace on the virtual barrier corresponding to the collision point according to the shape of the collision trace;
the size determining module is used for determining the size of the collision trace according to the material of the virtual barrier and the speed of the virtual throwing prop;
and the trace display module is used for displaying the collision trace on the virtual obstacle at the collision point according to the shape and size of the collision trace.
In an exemplary scheme, the material obtaining module is configured to obtain a material of the virtual obstacle before the virtual throwing prop collides with the virtual obstacle.
In an exemplary aspect, the material obtaining module is configured to,
after the virtual throwing prop is thrown, in the moving process of the virtual throwing prop, taking the virtual throwing prop as a starting point, and setting a virtual ray line segment along the moving direction of the virtual throwing prop;
and responding to the situation that the top end of the virtual ray line segment reaches the virtual obstacle, and obtaining the material of the virtual obstacle.
In an exemplary aspect, the apparatus further includes a removal module, configured to:
remove the virtual throwing prop from the virtual scene in response to the time length of stopping moving after the virtual throwing prop is thrown reaching a specified time length;
or,
remove the virtual throwing prop from the virtual scene in response to the virtual object being eliminated from the virtual scene.
In an exemplary aspect, the apparatus further includes:
and the pickup control module is used for controlling the virtual object to automatically pick up the virtual throwing prop in response to the virtual throwing prop stopping moving after being thrown and the distance between the virtual object and the virtual throwing prop being smaller than a distance threshold.
In an exemplary scenario, the virtual throwing prop is a virtual cold weapon.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the collision trace showing method in a virtual scene as provided in the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the collision trace showing method in a virtual scene as provided in the embodiments of the present application.
In another aspect, a computer program product is provided, which when running on a computer, causes the computer to execute the method for displaying collision traces in a virtual scene as provided in the embodiments of the present application.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
by displaying a corresponding collision trace at the collision point where the virtual throwing prop collides with a virtual obstacle in the virtual scene, the user is prompted with the position at which the virtual throwing prop collided with the virtual obstacle; correspondingly, the user can quickly determine the drop point of the virtual throwing prop from the trace, which greatly reduces the time the user spends searching for the virtual throwing prop, thereby greatly shortening the virtual shooting game time and saving the power of the terminal.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram of a terminal according to an exemplary embodiment of the present application;
FIG. 2 is a schematic illustration of a display interface of a virtual scene provided by an exemplary embodiment of the present application;
FIG. 3 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 4 is a flowchart of a method for displaying collision traces in a virtual scene according to an exemplary embodiment of the present application;
FIG. 5 is a flowchart of a method for displaying collision traces in a virtual scene according to an exemplary embodiment of the present application;
FIG. 6 is a schematic illustration of an equipment interface according to the embodiment of FIG. 5;
FIG. 7 is a schematic view of a to-be-thrown interface of the virtual throwing prop according to the embodiment of FIG. 5;
FIG. 8 is a schematic diagram of the throw of a virtual throw prop according to the embodiment of FIG. 5;
FIG. 9 is a pre-crash detection schematic diagram of the embodiment of FIG. 5;
FIG. 10 is a schematic illustration of an impact trace according to the embodiment of FIG. 5;
FIG. 11 is a flow chart of a game-based impact trace demonstration method provided by an exemplary embodiment of the present application;
FIG. 12 is a block diagram of an apparatus for displaying crash traces in a virtual environment according to an exemplary embodiment of the present application;
fig. 13 is a block diagram of a computer device according to another exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are briefly described:
1) Virtual scene: a virtual scene displayed (or provided) when an application program runs on a terminal. The virtual scene may be a simulated environment of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene; the following embodiments take a three-dimensional virtual scene as an example, but are not limited thereto. Optionally, the virtual scene may also be used for a battle between at least two virtual characters, for a virtual-firearm fight between at least two virtual characters, or for a fight with virtual firearms within a target area that keeps shrinking over time in the virtual scene.
2) Virtual object: refers to a movable object in a virtual scene. The movable object may be at least one of a virtual character, a virtual animal, a virtual vehicle. Optionally, when the virtual scene is a three-dimensional virtual scene, the virtual object is a three-dimensional stereo model created based on an animated skeleton technique. Each virtual object has its own shape, volume and orientation in the three-dimensional virtual scene and occupies a portion of the space in the three-dimensional virtual scene.
A virtual scene is typically generated by an application program running in a computer device such as a terminal and displayed based on hardware (e.g., a screen) of the terminal. The terminal may be a mobile terminal such as a smartphone, a tablet computer, or an e-book reader; alternatively, the terminal may be a personal computer device such as a notebook computer or a desktop computer.
Referring to fig. 1, a schematic structural diagram of a terminal according to an exemplary embodiment of the present application is shown. As shown in fig. 1, the terminal includes a main board 110, an external input/output device 120, a memory 130, an external interface 140, a touch system 150, and a power supply 160.
The main board 110 has integrated therein processing elements such as a processor and a controller.
The external input/output device 120 may include a display component (e.g., a display screen), a sound playing component (e.g., a speaker), a sound collecting component (e.g., a microphone), various keys, and the like.
The memory 130 has program codes and data stored therein.
The external interface 140 may include a headset interface, a charging interface, a data interface, and the like.
The touch system 150 may be integrated into a display component or a key of the external input/output device 120, and the touch system 150 is used to detect a touch operation performed by a user on the display component or the key.
The power supply 160 is used to power the various other components in the terminal.
In the embodiment of the present application, the processor in the main board 110 may generate a virtual scene by executing or calling the program code and data stored in the memory, and display the generated virtual scene through the external input/output device 120. In the process of displaying the virtual scene, the touch system 150 may detect touch operations performed when the user interacts with the virtual scene.
3) Virtual shooting prop: a virtual prop used for fighting between virtual objects by shooting in a virtual scene. For example, the virtual shooting prop may be a virtual firearm, a virtual bow, or the like in a virtual shooting game.
4) Virtual throwing prop: a virtual prop used for fighting between virtual objects by throwing in a virtual scene. For example, the virtual throwing prop may be a virtual flying knife, a virtual flying axe, a virtual grenade, and the like in a virtual shooting game.
In an embodiment of the present application, the virtual scene may be a three-dimensional virtual scene. Taking the example that the virtual scene is a three-dimensional virtual scene, please refer to fig. 2, which shows a schematic view of a display interface of the virtual scene according to an exemplary embodiment of the present application. As shown in fig. 2, the display interface of the virtual scene includes a scene screen 200, and the scene screen 200 includes a virtual object 210, a virtual operation control 220a, a virtual operation control 220b, a virtual operation control 220c, a scene screen of the three-dimensional virtual scene, and other virtual objects. The virtual object 210 may be a current virtual object of a terminal corresponding to a user, or a virtual carrier in which the current virtual object of the terminal corresponding to the user is located. The other virtual objects can be virtual objects corresponding to users or artificial intelligent control of other terminals.
In fig. 2, the scene picture of the three-dimensional virtual scene displayed in the scene screen 200 is observed from the viewing angle of a camera model around the virtual object 210 (this angle may also be referred to as the user viewing angle). Illustratively, as shown in fig. 2, the scene picture observed from the viewing angle of the camera model includes the ground, the sky, the horizon, a hill, a factory building, and the like.
In fig. 2, the virtual operation controls 220a to 220c are used for controlling the motion state of the virtual object 210, for example its posture (including lying, crouching, standing, etc.), sight movement, jumping, moving, shooting, and throwing. For example, in fig. 2, the user may control the movement of the virtual object through the virtual joystick 220a on the left side, move the sight of the virtual object 210 through a sliding operation in the blank area, change the posture of the virtual object through the virtual control 220b, and make the virtual object shoot, throw, and so on through the virtual control 220c.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal has installed on it, and runs, an application program supporting a virtual environment, such as an application program supporting a three-dimensional virtual environment. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and a MOBA game. Alternatively, the application may be a stand-alone application, such as a stand-alone 3D game program, or may be a network online application.
The terminal in the present application may include: an operating system and an application program.
The operating system is the base software that provides applications with secure access to the computer hardware.
An application is an application that supports a virtual environment. Optionally, the application is an application that supports a three-dimensional virtual environment. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a Third-person Shooting Game (TPS), a First-person Shooting Game (FPS), a MOBA Game, and a multi-player gun-battle type survival Game. The application may be a stand-alone application, such as a stand-alone 3D game program.
Fig. 3 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 300 includes: a first device 320, a server 340, and a second device 360. The first device 320 and the second device 360 may be implemented as terminals in the present application.
The first device 320 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, an MOBA game and a multi-player gunfight living game. The first device 320 is a device used by a first user who uses the first device 320 to control a first virtual object located in a virtual environment for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first device 320 is connected to the server 340 through a wireless network or a wired network.
The server 340 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 340 is used for providing background services for applications supporting a three-dimensional virtual environment. Alternatively, server 340 undertakes primary computing work and first device 320 and second device 360 undertakes secondary computing work; alternatively, the server 340 undertakes secondary computing work and the first device 320 and the second device 360 undertake primary computing work; alternatively, the server 340, the first device 320, and the second device 360 perform cooperative computing by using a distributed computing architecture.
The second device 360 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, an MOBA game and a multi-player gun battle type survival game. The second device 360 is a device used by a second user who uses the second device 360 to control a second virtual object located in the virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated persona or an animated persona.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Alternatively, the first avatar and the second avatar may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first virtual character and the second virtual character may belong to different teams, different organizations, or two groups with enemy.
Alternatively, the applications installed on the first device 320 and the second device 360 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 320 may generally refer to one of a plurality of devices, and the second device 360 may generally refer to one of a plurality of devices, and this embodiment is illustrated by the first device 320 and the second device 360 only. The device types of the first device 320 and the second device 360 are the same or different, and include: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated where the device is a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number and the type of the devices are not limited in the embodiments of the present application.
With reference to the noun introduction and the description of the implementation environment, please refer to fig. 4, which shows a flowchart of a method for displaying collision traces in a virtual scene according to an exemplary embodiment of the present application. The method may be executed by a computer device running an application program corresponding to the virtual scene, for example, the computer device may be a terminal, or the computer device may also be a cloud server running the application program corresponding to the virtual scene. As shown in fig. 4, the method may include the steps of:
step 401, a scene picture of a virtual scene is displayed, wherein the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown.
A virtual shooting game aims to simulate the feel of real gunfights. Virtual weapon props fall into two categories: virtual shooting props and virtual throwing props. A virtual shooting prop attacks by firing bullets or arrows, while a virtual throwing prop is thrown as a whole, as with a grenade-type explosive prop. As virtual throwing props have diversified, current virtual shooting games have added non-explosive throwing props such as a virtual flying axe and a virtual flying knife; unlike explosive props, such a virtual throwing prop damages the target only when it directly hits the target.
Step 402, responding to the virtual object throwing the virtual throwing prop, and acquiring a collision point between the thrown virtual throwing prop and a virtual barrier in the virtual scene.
After being thrown, the virtual throwing prop may encounter various virtual obstacles, such as a virtual wall or a virtual wooden box, during its movement (flight).
In this embodiment of the application, when the thrown virtual throwing prop collides with a virtual obstacle during its movement, the computer device may obtain the collision point between the virtual throwing prop and the virtual obstacle, that is, the position at which the prop contacts the obstacle during its movement.
Step 403, displaying a collision trace on the virtual obstacle at the collision point.
In this embodiment of the application, if the virtual throwing prop collides with the virtual obstacle, the computer device may display a collision trace at the collision point, so as to prompt the user that the virtual throwing prop collides with the virtual obstacle at the collision point.
From the perspective of user interface presentation, the computer device may present a first scene picture of the virtual scene, the first scene picture being a picture before the virtual throwing prop equipped by the virtual object is thrown; then, in response to a throwing operation on the virtual throwing prop, display a second scene picture of the virtual scene, the second scene picture being a picture of the virtual throwing prop moving after being thrown; and then, in response to the virtual throwing prop colliding with a virtual obstacle in the virtual scene, display a collision trace at the corresponding collision point on the virtual obstacle.
Taking a virtual shooting game as an example, a player can control a virtual character to switch its weapon to a virtual throwing prop such as a virtual flying axe. After the player presses a throwing button in the interface, a throwing direction indicator is displayed; once the player has set the throwing direction by adjusting the indicator and releases the button, the virtual character throws the virtual flying axe in that direction. When the virtual flying axe collides with a virtual obstacle, such as a virtual wall, during flight, a collision trace is displayed at the collision point on the wall, and the player can quickly narrow down the drop area of the axe from that trace.
To sum up, in the scheme shown in this embodiment of the application, a corresponding collision trace is displayed at the collision point where the virtual throwing prop collides with a virtual obstacle in the virtual scene, so as to prompt the user with the position of the collision. Correspondingly, the user can quickly determine the drop point of the virtual throwing prop from the trace, which greatly reduces the time the user spends searching for the prop, greatly shortens the virtual shooting game time, and saves the power of the terminal.
In a virtual scene, virtual obstacles of different materials may produce different collision traces when struck by the same virtual throwing prop; that is, the computer device may display different collision traces according to the material of the virtual obstacle.
Fig. 5 is a flowchart of a method for displaying collision traces in a virtual scene according to an exemplary embodiment of the present application. The method may be executed by a computer device running an application program corresponding to the virtual scene, for example, the computer device may be a terminal, or the computer device may also be a cloud server running the application program corresponding to the virtual scene. As shown in fig. 5, the method may include the steps of:
step 501, showing a scene picture of a virtual scene, wherein the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown.
In an exemplary scenario, the virtual throwing prop is a virtual cold weapon.
For example, the virtual throwing prop may be a virtual flying axe, a virtual flying knife, or the like.
In the present embodiment, the virtual flying axe is a special projectile weapon that combines the unlimited reuse of a melee weapon with the long-range attack of a projectile weapon. Its advantage is that the wielder need not close in on the enemy as with a melee weapon, nor is it single-use like an explosive projectile weapon.
Wherein, the user can switch the currently used virtual throwing prop through the equipment interface. For example, please refer to fig. 6, which illustrates a schematic diagram of an equipment interface according to an embodiment of the present application. As shown in fig. 6, the equipment interface 60 shows on the right an icon 61 of the throwing weapon currently used by the virtual object, i.e., a virtual grenade, and on the left an icon 62 of a virtual flying axe held by the virtual object. The user can switch the throwing weapon currently used by the virtual object from the virtual grenade to the virtual flying axe by clicking the icon 62.
In an exemplary scheme, the virtual throwing prop may also be a consumable throwing prop such as a virtual torpedo.
Please refer to fig. 7, which shows a schematic diagram of an interface to be thrown of a virtual throwing prop according to an embodiment of the present application. As shown in fig. 7, after the user selects the virtual flying axe as the throwing weapon currently used by the virtual object through the equipment interface 60, the virtual prop held by the virtual object in the virtual shooting game interface 70 is switched to a virtual flying axe 71.
Step 502, in response to the virtual object throwing the virtual throwing prop, obtaining the material of the virtual barrier in the virtual scene.
Please refer to fig. 8, which shows a throwing schematic diagram of a virtual throwing prop according to an embodiment of the present application. As shown in fig. 8, after the virtual character switches its weapon to the virtual flying axe, the user presses a throwing button 81 on the interface 80, and a throwing direction indicating pattern 82 is displayed on the interface. The player determines the throwing direction by adjusting the indicating pattern 82 and then releases the throwing button 81, whereupon the virtual character throws the virtual flying axe in that direction. The virtual flying axe collides with a virtual wall 83 during flight, and at this time the computer device can acquire the material of the virtual wall 83.
Weapons such as the virtual flying axe differ from common explosive throwing props such as the virtual grenade. The flight path of an explosive throwing prop is a parabola: like a virtual grenade, a virtual smoke bomb or similar prop needs to be thrown far to attack a target and may need to clear multiple layers of virtual obstacles. A virtual cold weapon such as the virtual flying axe, however, does not need to be thrown very far, so its flight path is straight. There are two main reasons for this design: first, props such as the virtual flying axe only take effect when they directly hit an enemy, so straight-line aiming is needed; second, such props can be picked up again by the virtual object, and if thrown too far they could not be reused indefinitely. The virtual flying axe is therefore a short-to-medium-range projectile weapon.
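To make the distinction concrete, the following Python sketch samples a straight preview path for a virtual cold weapon and a parabolic one for an explosive prop. This is illustrative only; the patent specifies no formulas, and all names and constants here are hypothetical.

```python
def preview_trajectory(origin, direction, speed, prop_type, steps=30, dt=0.1):
    """Sample points along the throw path: a cold weapon such as the
    virtual flying axe flies straight, while an explosive prop such as
    a virtual grenade follows a parabola under gravity."""
    gravity = 9.8
    dx, dy, dz = direction              # assumed to be a unit vector
    points = []
    for i in range(steps):
        t = i * dt
        x = origin[0] + dx * speed * t
        y = origin[1] + dy * speed * t
        z = origin[2] + dz * speed * t  # z is the vertical axis here
        if prop_type == "explosive":    # grenade, smoke bomb, ...
            z -= 0.5 * gravity * t * t  # gravity bends the path into a parabola
        points.append((x, y, z))
    return points

# A straight preview for a flying axe and a parabolic one for a grenade:
axe_path = preview_trajectory((0, 0, 1.7), (1, 0, 0), 20.0, "cold_weapon")
grenade_path = preview_trajectory((0, 0, 1.7), (0.8, 0, 0.6), 15.0, "explosive")
```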
In this embodiment, the computer device may obtain the material of the virtual obstacle before the virtual throwing prop collides with the virtual obstacle.
Determining the collision trace from the material of the virtual obstacle consumes a certain amount of processing time. If the trace were determined only after the collision is detected, the moment the trace is displayed would lag noticeably behind the moment of collision, making the display look unrealistic and degrading the effect. To address this, in the embodiment of the present application, the computer device may obtain the material of the virtual obstacle in advance, before the virtual throwing prop collides with it, leaving enough time for the subsequent determination of the collision trace from the material.
In an exemplary scheme, after the virtual throwing prop is thrown, the computer device may set a virtual ray line segment along the moving direction of the virtual throwing prop with the virtual throwing prop as a starting point in the moving process of the virtual throwing prop; and responding to the situation that the top end of the virtual ray line segment reaches the virtual obstacle, and acquiring the material of the virtual obstacle.
Please refer to fig. 9, which illustrates a pre-crash detection schematic diagram according to an embodiment of the present application. As shown in fig. 9, after the virtual flying axe 91 is thrown, it flies along a path 92. In the process, the computer device sets a virtual ray line segment 93 along the direction of the path 92, with the virtual flying axe 91 as the starting point; the length of the segment can be set by a developer. When the virtual flying axe 91 approaches the virtual wall 94 so that the top of the virtual ray line segment 93 contacts the virtual wall 94, the computer device obtains the material of the virtual wall 94, leaving enough time to determine the collision trace so that it can be displayed immediately when the virtual flying axe 91 collides with the virtual wall 94.
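A minimal sketch of this pre-detection is shown below. It assumes a hypothetical engine API `scene.raycast` and prop fields such as `position` and `direction`; none of these names come from the patent.

```python
PROBE_LENGTH = 2.0  # hypothetical segment length, tunable by the developer

def pre_detect_material(prop, scene):
    """Each frame while the prop is in flight, cast a short ray segment
    from the prop along its direction of motion; once the segment's tip
    reaches an obstacle, fetch that obstacle's material ahead of impact."""
    hit = scene.raycast(origin=prop.position,
                        direction=prop.direction,   # unit vector of motion
                        max_distance=PROBE_LENGTH)
    if hit is not None and not hit.is_character:
        return hit.obstacle.material                # e.g. "masonry", "steel"
    return None                                     # nothing within the segment
```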
Step 503, determining the shape of the collision trace according to the material of the virtual obstacle.
In this embodiment of the application, for virtual obstacles of different materials, the shape of the collision trace left by the virtual throwing prop may differ. For example, when the virtual wall is of a masonry material, the collision trace may be short and thick; when the virtual wall is of a steel material, the collision trace may be elongated. Accordingly, the computer device may determine the shape of the collision trace from the material of the virtual obstacle.
In an exemplary scenario, the computer device may determine the shape of the collision trace according to the material of the virtual obstacle and the prop type of the virtual throwing prop.
In this embodiment of the present application, there may be multiple types of virtual throwing props; for example, besides the virtual flying axe, the virtual throwing prop may also be a virtual flying knife, a virtual dart, and the like. For the same virtual obstacle, different types of virtual throwing props may produce different collision traces when they collide. Therefore, the computer device may determine the shape of the collision trace by combining the material of the virtual obstacle with the prop type of the virtual throwing prop.
For example, the shape of the collision trace corresponding to each combination of prop type and obstacle material may be stored in the computer device in advance, and the computer device may look up the corresponding shape according to the type of the virtual throwing prop and the material of the virtual obstacle.
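One way to realize such a pre-stored mapping is a simple lookup table keyed by the (material, prop type) pair. The materials, prop types, and shape names below are hypothetical placeholders:

```python
# Hypothetical pre-stored mapping: (obstacle material, prop type) -> trace shape.
TRACE_SHAPES = {
    ("masonry", "flying_axe"):   "short_thick_gash",
    ("steel",   "flying_axe"):   "elongated_scratch",
    ("masonry", "flying_knife"): "small_pit",
    ("steel",   "flying_knife"): "thin_scratch",
}

def trace_shape(material, prop_type):
    # Fall back to a generic mark when a combination has not been configured.
    return TRACE_SHAPES.get((material, prop_type), "generic_mark")
```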
Step 504, obtaining the collision point of the virtual throwing prop and the virtual barrier after the virtual throwing prop is thrown.
In this embodiment of the application, the computer device may obtain the collision point between the virtual throwing prop and the virtual obstacle according to the moving path of the virtual throwing prop in the virtual scene and the position of the virtual obstacle in the virtual scene.
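The patent does not state how this intersection is computed. A minimal sketch, assuming the obstacle surface can be approximated locally by a plane and the prop's movement over one step by a straight segment, is:

```python
def collision_point(p0, p1, plane_point, plane_normal):
    """Intersect the prop's movement segment p0 -> p1 with an obstacle
    surface approximated locally by a plane; return the contact point,
    or None if the segment does not reach the surface this step."""
    d = tuple(b - a for a, b in zip(p0, p1))
    denom = sum(di * ni for di, ni in zip(d, plane_normal))
    if abs(denom) < 1e-9:
        return None                      # moving parallel to the surface
    t = sum((q - a) * ni for a, q, ni in zip(p0, plane_point, plane_normal)) / denom
    if not 0.0 <= t <= 1.0:
        return None                      # contact lies outside this segment
    return tuple(a + t * di for a, di in zip(p0, d))
```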
Step 505, displaying the collision trace on the virtual obstacle at the collision point according to the shape of the collision trace.
In this embodiment of the application, after the virtual throwing prop collides with the virtual obstacle, the computer device may show a collision trace at the collision point.
In the embodiment of the application, a virtual throwing prop such as the virtual flying axe has a rebound effect; that is, when it collides with a character or a static virtual obstacle during flight, it rebounds in another direction and continues flying. Because common throwing props (such as virtual grenades) are disposable, players usually do not retrieve them, so no collision trace is needed; virtual throwing props such as the virtual flying axe are different, since players usually need to retrieve them for reuse.
Please refer to fig. 10, which shows a schematic illustration of a collision trace according to an embodiment of the present application. As shown in fig. 10, after the virtual flying axe 1001 is thrown, it collides with the virtual wall 1002 and the virtual floor 1003 in sequence; the computer device displays a collision trace 1002a at the collision point on the virtual wall 1002 and a collision trace 1003a at the collision point on the virtual floor 1003, and the user can quickly find the drop point of the virtual flying axe 1001 through the two collision traces.
In an exemplary scheme, the computer device further obtains the speed of the virtual throwing prop when colliding with the virtual obstacle before showing the collision trace on the virtual obstacle corresponding to the collision point according to the shape of the collision trace; and determining the size of the collision trace according to the material of the virtual barrier and the speed of the virtual throwing prop.
Accordingly, when displaying the collision trace on the virtual obstacle at the collision point, the computer device may display it according to the shape and size of the collision trace.
In the embodiment of the present application, the size of the collision trace may be related to the moving speed of the virtual throwing prop. For example, in a virtual shooting game scenario, after being thrown, a virtual throwing prop (such as a virtual flying axe) may collide with several different virtual obstacles, for example a virtual wall and then, after bouncing, the virtual ground. The speed of the virtual flying axe before colliding with the virtual wall differs from its speed after the collision. Accordingly, the computer device may determine the size of each collision trace from the moving speed of the virtual throwing prop before that collision, where the size may be positively correlated with that speed.
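As a hedged illustration of this positive correlation, the rule below scales the trace with the impact speed and a per-material factor; all constants and material names are invented for the example.

```python
def trace_size(material, impact_speed):
    """Hypothetical sizing rule: the trace grows with the prop's speed
    just before impact, scaled by a per-material softness factor and
    clamped so traces stay visually reasonable."""
    softness = {"steel": 0.4, "masonry": 0.7, "wood": 1.0}.get(material, 0.8)
    size = 0.05 * impact_speed * softness   # illustrative constants, in metres
    return max(0.02, min(size, 0.5))
```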
Step 506, in response to that the virtual throwing prop stops moving after being thrown and the distance between the virtual object and the virtual throwing prop is smaller than a distance threshold, controlling the virtual object to automatically pick up the virtual throwing prop.
In this embodiment of the application, after the virtual throwing prop is thrown, when the virtual object moves to a preset range around the virtual throwing prop, the virtual throwing prop can be automatically picked up.
Optionally, in response to that the time length for stopping moving after the virtual throwing prop is thrown reaches a specified time length, removing the virtual throwing prop from the virtual scene;
or,
in response to the virtual object being eliminated from the virtual scene, removing the virtual throwing prop from the virtual scene.
In embodiments of the application, after the virtual throwing prop is thrown, the computer device may remove it from the virtual scene if the virtual object does not pick it up within a certain time, or if the virtual object is eliminated from the virtual scene.
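Combining step 506 with these removal conditions, a per-frame update for a landed prop might look like the following sketch. The threshold, timeout, and object fields are hypothetical; the patent only states that they exist.

```python
PICKUP_RADIUS = 2.0      # hypothetical distance threshold
DESPAWN_SECONDS = 60.0   # hypothetical "specified time length"

def update_landed_prop(prop, owner, scene, now):
    """Per-frame bookkeeping once the prop has stopped moving: the owner
    picks it up automatically inside the radius; otherwise the prop is
    removed after the timeout or when the owner is eliminated."""
    if owner.distance_to(prop.position) < PICKUP_RADIUS:
        owner.inventory.add(prop)
        scene.remove(prop)
    elif now - prop.landed_at >= DESPAWN_SECONDS or owner.is_eliminated:
        scene.remove(prop)
```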
To sum up, in the scheme shown in this embodiment of the application, a corresponding collision trace is displayed at the collision point where the virtual throwing prop collides with a virtual obstacle in the virtual scene, so as to prompt the user with the position of the collision. Correspondingly, the user can quickly determine the drop point of the virtual throwing prop from the trace, which greatly reduces the time the user spends searching for the prop, greatly shortens the virtual shooting game time, and saves the power of the terminal.
In addition, in the scheme shown in the embodiment of the application, the computer device can determine the shape of the collision trace according to the material of the virtual obstacle and the type of the virtual throwing prop, so that virtual obstacles of different materials show collision traces of different shapes after being struck by different types of virtual throwing props, improving the display effect of the collision traces.
In addition, according to the scheme shown in the embodiment of the application, the computer equipment can determine the size of the collision trace according to the moving speed of the virtual throwing prop before the collision with the virtual obstacle, so that the display effect of the collision trace is further improved.
In addition, in the scheme shown in the embodiment of the application, during the movement of the virtual throwing prop, the computer device can detect virtual obstacles along a virtual ray line segment extending from the prop in its moving direction, and obtain the material of the virtual obstacle when the top of the segment reaches the obstacle. The collision trace is thus determined in advance, before the collision occurs, and can be displayed immediately when the collision occurs, improving the display effect of the collision trace.
The method for displaying the collision trace provided by the embodiment of the application is described with reference to a game. FIG. 11 shows a flow chart of a game-based impact trace demonstration method provided by an exemplary embodiment of the present application. The method can be applied to a terminal or a server in the system as shown in fig. 3. The method comprises the following steps:
step 1101, start.
Taking a smartphone as an example of the terminal: the user enters the game program, and the smartphone displays the user interface corresponding to the game program.
Step 1102, switching the prop used by the virtual object to a virtual throwing prop.
In some embodiments, the user switches the prop being used by the virtual object to a virtual throwing prop by directly clicking an icon of the virtual throwing prop on a user interface (such as the game main interface or the equipment interface). In other embodiments, the virtual throwing prop used by the virtual object is one that the virtual object picked up in the virtual environment or seized from other virtual objects.
Step 1103, whether to switch out the virtual throwing prop; if yes, go to step 1104, otherwise, return.
Step 1104, the virtual object holds the virtual throwing prop.
Step 1105, whether to press the throwing control; if yes, go to step 1106, otherwise return.
In the embodiment of the application, when the virtual object holds the virtual throwing prop, the throwing control can be displayed in the game main interface, and the computer equipment can detect whether the user presses the throwing control in real time.
Step 1106, displaying the throwing track.
When the user presses the throwing control, a throwing track and a throwing end point are displayed in the game main interface. The throwing track represents the motion path of the virtual throwing prop in the virtual scene, and the throwing end point is the drop point of the virtual throwing prop in the virtual environment. The motion path changes with the position and orientation of the virtual object; that is, it changes dynamically and is recomputed instantly from the current position and orientation.
Step 1107, whether the user releases the throwing control; if yes, go to step 1108, otherwise return.
Step 1108, throw the virtual thrown prop out.
After the virtual throwing prop is thrown, the virtual throwing prop will move along the throwing track.
Step 1109, whether the virtual obstacle is collided or not; if yes, go to step 1110, otherwise return.
Step 1110, displaying the collision trace.
After the virtual throwing prop is thrown out, it continuously casts a short line segment in its current direction during flight. The segment is used to detect the object ahead of the prop; when a non-character target (i.e., a virtual obstacle) is detected, information about the virtual obstacle hit by the ray is returned, including its material. A type can be derived from the material, and different bullet-hole-style special effects are then generated according to the type.
Step 1111, whether the target is hit; if yes, go to step 1112, otherwise go to step 1115 and end.
Step 1112, the target is eliminated, and the virtual throwing prop falls to the ground.
Step 1113, whether the virtual object enters the preset range around the virtual throwing prop; if yes, go to step 1114, otherwise go to step 1115 and end.
Step 1114, the virtual throw prop is picked up again.
In an exemplary scheme, the number of bounces of the virtual throwing prop may be limited to N times, for example 3; that is, after colliding with virtual obstacles 3 times, or after hitting a target, the virtual throwing prop falls to the virtual ground. At this time, a stationary model of the virtual throwing prop may be generated on the ground; the model can be seen only by the player who threw it, so only that player can pick the prop up. Pickup is automatic: the virtual object picks the prop up as soon as it enters the pickup radius, without any manual operation.
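A rough sketch of such a bounce limit follows, assuming a hypothetical prop object with velocity and bounce-count fields and using simple vector reflection with damping; the patent does not specify the rebound physics.

```python
MAX_BOUNCES = 3   # matches the example above; the limit N is configurable

def reflect(v, n, damping=0.6):
    """Reflect velocity v about the surface normal n (both 3-tuples),
    with a damping factor so each bounce loses some speed."""
    dot = sum(a * b for a, b in zip(v, n))
    return tuple(damping * (a - 2 * dot * b) for a, b in zip(v, n))

def on_obstacle_hit(prop, hit_normal, now):
    """Hypothetical rebound rule: the prop bounces off obstacles up to
    MAX_BOUNCES times, then comes to rest and waits to be picked up."""
    prop.bounces += 1
    if prop.bounces >= MAX_BOUNCES:
        prop.velocity = (0.0, 0.0, 0.0)  # lies still; owner-only model shown
        prop.landed_at = now
    else:
        prop.velocity = reflect(prop.velocity, hit_normal)
```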
Optionally, the virtual throwing prop may disappear after remaining on the virtual ground for a period of time, such as one minute; the player being eliminated may also cause the virtual throwing prop to disappear.
Fig. 12 is a block diagram of a device for displaying collision traces in a virtual scene according to an exemplary embodiment of the present application, where the device may be implemented as all or part of a computer device in a hardware manner or a combination of hardware and software. The computer device may be a terminal, or the computer device may also be a cloud server running an application program corresponding to the virtual scene. As shown in fig. 12, the apparatus includes:
a picture display module 1201, configured to display a scene picture of a virtual scene, where the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
a collision point obtaining module 1202, configured to, in response to the virtual object throwing the virtual throwing prop, obtain a collision point between the thrown virtual throwing prop and a virtual obstacle in the virtual scene;
and a trace display module 1203, configured to display a collision trace on the virtual obstacle corresponding to the collision point.
In an exemplary aspect, the apparatus further includes:
a material obtaining module, configured to obtain a material of the virtual obstacle before the trace displaying module 1203 displays a collision trace on the virtual obstacle at the collision point;
a shape determining module, configured to determine the shape of the collision trace according to the material of the virtual obstacle;
the trace displaying module 1203 is configured to display the collision trace on the virtual obstacle corresponding to the collision point according to the shape of the collision trace.
In an exemplary scheme, the shape determining module is configured to determine the shape of the collision trace according to the material of the virtual obstacle and the prop type of the virtual throwing prop.
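As an illustrative sketch, this decision can be a table lookup keyed by the pair (material, prop type); the entries below are invented examples rather than values from the application:

```python
# Hypothetical shape table: (obstacle material, prop type) -> trace shape.
TRACE_SHAPE = {
    ("wood", "throwing_axe"): "long_gash",
    ("wood", "throwing_knife"): "narrow_slit",
    ("metal", "throwing_axe"): "shallow_notch",
    ("metal", "throwing_knife"): "thin_scratch",
}

def trace_shape(material: str, prop_type: str) -> str:
    """Fall back to a generic mark for unlisted combinations."""
    return TRACE_SHAPE.get((material, prop_type), "generic_mark")
```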
In an exemplary aspect, the apparatus further includes:
a speed obtaining module, configured to obtain a speed at which the virtual throwing prop collides with the virtual obstacle before the trace displaying module 1203 displays the collision trace on the virtual obstacle at the collision point according to the shape of the collision trace;
a size determining module, configured to determine the size of the collision trace according to the material of the virtual obstacle and the speed of the virtual throwing prop;
the trace displaying module 1203 is configured to display the collision trace on the virtual obstacle corresponding to the collision point according to the shape and size of the collision trace.
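One simple way to realize this sizing rule is to scale a base trace size by the impact speed and a per-material softness factor, as in the sketch below; the softness factors, base size, and reference speed are assumptions of this example:

```python
# Hypothetical softness factors: softer materials take larger traces.
MATERIAL_SOFTNESS = {"wood": 1.0, "concrete": 0.6, "metal": 0.3}

def trace_size(material: str, impact_speed: float,
               base_size: float = 0.05, reference_speed: float = 20.0) -> float:
    """Trace size grows with impact speed, clamped to a sensible minimum."""
    softness = MATERIAL_SOFTNESS.get(material, 0.5)
    return base_size * softness * max(impact_speed / reference_speed, 0.1)
```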
In an exemplary scheme, the material obtaining module is configured to obtain a material of the virtual obstacle before the virtual throwing prop collides with the virtual obstacle.
In an exemplary aspect, the material obtaining module is configured to,
after the virtual throwing prop is thrown, in the moving process of the virtual throwing prop, take the virtual throwing prop as a starting point and set a virtual ray line segment along the moving direction of the virtual throwing prop;
and in response to the top end of the virtual ray line segment reaching the virtual obstacle, obtain the material of the virtual obstacle.
In an exemplary aspect, the apparatus further includes a removal module, configured to:
remove the virtual throwing prop from the virtual scene in response to the time length for which the virtual throwing prop has stopped moving after being thrown reaching a specified time length;
or,
remove the virtual throwing prop from the virtual scene in response to the virtual object being eliminated from the virtual scene.
In an exemplary aspect, the apparatus further includes:
and a pickup control module, configured to control the virtual object to automatically pick up the virtual throwing prop in response to the virtual throwing prop stopping movement after being thrown and the distance between the virtual object and the virtual throwing prop being smaller than a distance threshold.
In an exemplary scenario, the virtual throwing prop is a virtual cold weapon.
To sum up, in the scheme shown in this embodiment of the application, a corresponding collision trace is displayed at the point where the virtual throwing prop collides with a virtual obstacle in the virtual scene, prompting the user where the collision occurred. Accordingly, the user can quickly determine the landing position of the virtual throwing prop from the collision trace, which greatly shortens the time spent searching for the prop, thereby greatly shortening the duration of a virtual shooting game and saving power at the terminal.
In addition, according to the scheme shown in this embodiment of the application, the computer device can determine the shape of the collision trace according to the material of the virtual obstacle and the prop type of the virtual throwing prop, so that virtual obstacles of different materials show collision traces of different shapes after being struck by different types of virtual throwing props, improving the display effect of the collision traces.
In addition, according to the scheme shown in this embodiment of the application, the computer device can determine the size of the collision trace according to the moving speed of the virtual throwing prop before it collides with the virtual obstacle, further improving the display effect of the collision trace.
In addition, according to the scheme shown in this embodiment of the application, the computer device can, while the virtual throwing prop is moving, detect virtual obstacles along a virtual ray line segment extending from the virtual throwing prop in its direction of motion, and obtain the material of the virtual obstacle when the top end of the ray segment reaches it. The collision trace is thus determined in advance, before the collision occurs, so that it can be displayed the moment the collision happens, improving the display effect of the collision trace.
Fig. 13 shows a block diagram of a computer device 1300 according to an exemplary embodiment of the present application. The computer device 1300 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1300 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal. Alternatively, the computer device 1300 may be a server on the network side.
Generally, computer device 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also referred to as a Central Processing Unit (CPU); a coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the methods provided by the method embodiments herein.
In some embodiments, computer device 1300 may also optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera 1306, audio circuitry 1307, positioning component 1308, and power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1304 may communicate with other computer devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1305 is used to display a UI (user interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1305 is a touch display screen, the display screen 1305 also has the ability to capture touch signals on or over the surface of the display screen 1305. The touch signal may be input to the processor 1301 as a control signal for processing. At this point, the display 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1305 may be one, providing the front panel of the computer device 1300; in other embodiments, the display 1305 may be at least two, respectively disposed on different surfaces of the computer device 1300 or in a folded design; in still other embodiments, the display 1305 may be a flexible display disposed on a curved surface or on a folded surface of the computer device 1300. Even further, the display 1305 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Generally, a front camera is disposed on a front panel of a computer apparatus, and a rear camera is disposed on a rear surface of the computer apparatus. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for realizing voice communication. The microphones may be multiple and placed at different locations on the computer device 1300 for stereo sound acquisition or noise reduction purposes. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuitry 1304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1307 may also include a headphone jack.
The positioning component 1308 is used to locate the current geographic location of the computer device 1300 to implement navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of Europe.
The power supply 1309 is used to supply power to the various components in the computer device 1300. The power source 1309 may be alternating current, direct current, disposable or rechargeable. When the power source 1309 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, computer device 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the computer apparatus 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1312 may detect a body direction and a rotation angle of the computer device 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to collect a 3D motion of the user with respect to the computer device 1300. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 1313 may be disposed on the side bezel of the computer device 1300 and/or underneath the touch display 1305. When the pressure sensor 1313 is disposed on the side frame of the computer device 1300, a user's holding signal to the computer device 1300 may be detected, and the processor 1301 performs left-right hand recognition or shortcut operation according to the holding signal acquired by the pressure sensor 1313. When the pressure sensor 1313 is disposed at a lower layer of the touch display screen 1305, the processor 1301 controls an operability control on the UI interface according to a pressure operation of the user on the touch display screen 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user, and the processor 1301 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 1314, or the fingerprint sensor 1314 identifies the identity of the user according to the collected fingerprint. When the identity of the user is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the computer device 1300. When a physical key or vendor Logo is provided on the computer device 1300, the fingerprint sensor 1314 may be integrated with the physical key or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
The proximity sensor 1316, also known as a distance sensor, is typically disposed on the front panel of the computer device 1300. The proximity sensor 1316 is used to capture the distance between the user and the front face of the computer device 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front face of the computer device 1300 gradually decreases, the processor 1301 controls the touch display 1305 to switch from the screen-on state to the screen-off state; when the proximity sensor 1316 detects that the distance between the user and the front face of the computer device 1300 gradually increases, the processor 1301 controls the touch display 1305 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the architecture shown in FIG. 13 is not intended to be limiting of the computer device 1300, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium. The computer-readable storage medium may be the computer-readable storage medium contained in the memory of the above embodiments, or a separate computer-readable storage medium not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by the processor to implement the methods of the above embodiments.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The present application also provides a computer program product, which when run on a computer causes the computer to perform the methods provided by the various method embodiments described above.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method for displaying collision traces in a virtual scene is characterized by comprising the following steps:
displaying a scene picture of a virtual scene, wherein the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
after the virtual throwing prop is thrown, in the moving process of the virtual throwing prop, taking the virtual throwing prop as a starting point, and setting a virtual ray line segment along the moving direction of the virtual throwing prop; wherein the moving process comprises a process of moving after the virtual throwing prop is thrown and before it collides with the virtual obstacle, and a process of continuing to move after rebounding;
in response to the top end of the virtual ray line segment reaching the virtual obstacle, obtaining the material of the virtual obstacle; wherein the material of the virtual obstacle is obtained before the virtual throwing prop collides with the virtual obstacle;
determining the shape of the collision trace according to the material of the virtual obstacle;
acquiring a collision point between the thrown virtual throwing prop and the virtual obstacle in the virtual scene; wherein the collision point is a position point at which the virtual throwing prop contacts the virtual obstacle during its movement;
displaying the collision trace on the virtual obstacle corresponding to the collision point according to the shape of the collision trace; wherein the collision trace is used for prompting the user that the virtual throwing prop collides with the virtual obstacle at the collision point;
and in response to the virtual throwing prop stopping movement after at least one rebound following the throw and the distance between the virtual object and the virtual throwing prop being smaller than a distance threshold, controlling the virtual object to automatically pick up the virtual throwing prop.
2. The method of claim 1, wherein said determining the shape of the collision trace based on the material of the virtual obstacle comprises:
and determining the shape of the collision trace according to the material of the virtual barrier and the prop type of the virtual throwing prop.
3. The method according to claim 2, wherein before the displaying of the collision trace on the virtual obstacle corresponding to the collision point according to the shape of the collision trace, the method further comprises:
acquiring the speed at which the virtual throwing prop collides with the virtual obstacle;
determining the size of the collision trace according to the material of the virtual obstacle and the speed of the virtual throwing prop;
the displaying the collision trace on the virtual obstacle corresponding to the collision point according to the shape of the collision trace comprises:
and displaying the collision trace on the virtual obstacle corresponding to the collision point according to the shape and the size of the collision trace.
4. The method of claim 1, further comprising:
removing the virtual throwing prop from the virtual scene in response to the time length for which the virtual throwing prop has stopped moving after being thrown reaching a specified time length;
or,
removing the virtual throwing prop from the virtual scene in response to the virtual object being eliminated from the virtual scene.
5. The method of any one of claims 1 to 4, wherein the virtual throwing prop is a virtual cold weapon.
6. A method for displaying collision traces in a virtual scene is characterized by comprising the following steps:
displaying a first scene picture of a virtual scene, wherein the first scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
in response to a throwing operation of the virtual throwing prop, displaying a second scene picture of the virtual scene, wherein the second scene picture is a picture of the virtual throwing prop when moving after being thrown;
in response to the virtual throwing prop colliding with a virtual obstacle in the virtual scene, displaying a collision trace at a corresponding collision point on the virtual obstacle; the collision point is a position point at which the virtual throwing prop contacts the virtual obstacle during its movement; the collision trace is used for prompting the user that the virtual throwing prop collides with the virtual obstacle at the collision point; the collision trace is a trace displayed according to the shape of the collision trace; the shape of the collision trace is determined according to the material of the virtual obstacle; the material of the virtual obstacle is obtained when, after the virtual throwing prop is thrown and during its movement, the top end of a virtual ray line segment, set along the moving direction of the virtual throwing prop with the virtual throwing prop as a starting point, reaches the virtual obstacle; wherein the material of the virtual obstacle is obtained before the virtual throwing prop collides with the virtual obstacle; the moving process comprises a process of moving after the virtual throwing prop is thrown and before it collides with the virtual obstacle, and a process of continuing to move after rebounding;
and in response to the virtual throwing prop stopping movement after at least one rebound following the throw and the distance between the virtual object and the virtual throwing prop being smaller than a distance threshold, controlling the virtual object to automatically pick up the virtual throwing prop.
7. An apparatus for displaying collision traces in a virtual scene, the apparatus comprising:
the picture display module is used for displaying a scene picture of a virtual scene, wherein the scene picture is a picture before a virtual throwing prop equipped by a virtual object is thrown;
the material obtaining module is used for setting, after the virtual throwing prop is thrown and in the moving process of the virtual throwing prop, a virtual ray line segment along the moving direction of the virtual throwing prop with the virtual throwing prop as a starting point; wherein the moving process comprises a process of moving after the virtual throwing prop is thrown and before it collides with the virtual obstacle, and a process of continuing to move after rebounding;
the material obtaining module is further configured to obtain a material of the virtual obstacle in response to the top end of the virtual ray line segment reaching the virtual obstacle; wherein the material of the virtual obstacle is obtained before the virtual throwing prop collides with the virtual obstacle;
the shape determining module is used for determining the shape of the collision trace according to the material of the virtual obstacle;
the collision point acquisition module is used for acquiring a collision point between the thrown virtual throwing prop and the virtual obstacle in the virtual scene; wherein the collision point is a position point at which the virtual throwing prop contacts the virtual obstacle during its movement;
the trace display module is used for displaying the collision trace on the virtual obstacle corresponding to the collision point; wherein the collision trace is used for prompting the user that the virtual throwing prop collides with the virtual obstacle at the collision point;
and the pick-up control module is used for controlling the virtual object to automatically pick up the virtual throwing prop in response to the virtual throwing prop stopping movement after at least one rebound following the throw and the distance between the virtual object and the virtual throwing prop being smaller than a distance threshold.
8. The apparatus of claim 7, wherein the shape determining module is configured to determine the shape of the collision trace based on the material of the virtual obstacle and the type of prop of the virtual throwing prop.
9. A computer device comprising a processor and a memory, said memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, said at least one instruction, said at least one program, said set of codes, or said set of instructions being loaded and executed by said processor to implement a method of collision trace exposure in a virtual scene as claimed in any one of claims 1 to 6.
10. A computer-readable storage medium, having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method for collision trace presentation in a virtual scene according to any one of claims 1 to 6.
CN202010151742.XA 2020-03-06 2020-03-06 Method, device, equipment and storage medium for displaying collision traces in virtual scene Active CN111282275B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010151742.XA CN111282275B (en) 2020-03-06 2020-03-06 Method, device, equipment and storage medium for displaying collision traces in virtual scene

Publications (2)

Publication Number Publication Date
CN111282275A CN111282275A (en) 2020-06-16
CN111282275B true CN111282275B (en) 2022-03-11

Family

ID=71020230

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010151742.XA Active CN111282275B (en) 2020-03-06 2020-03-06 Method, device, equipment and storage medium for displaying collision traces in virtual scene

Country Status (1)

Country Link
CN (1) CN111282275B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111760284A (en) * 2020-08-12 2020-10-13 腾讯科技(深圳)有限公司 Virtual item control method, device, equipment and storage medium
CN112044084B (en) * 2020-09-04 2022-06-28 腾讯科技(深圳)有限公司 Virtual item control method, device, storage medium and equipment in virtual environment
CN112044071B (en) 2020-09-04 2021-10-15 腾讯科技(深圳)有限公司 Virtual article control method, device, terminal and storage medium
CN112121414B (en) * 2020-09-29 2022-04-08 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium
CN112121431A (en) * 2020-09-29 2020-12-25 腾讯科技(深圳)有限公司 Interactive processing method and device of virtual prop, electronic equipment and storage medium
CN112619151B (en) * 2020-12-22 2022-08-12 上海米哈游天命科技有限公司 Collision prediction method and device, electronic equipment and storage medium
CN112619134B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Method, device, equipment and storage medium for determining flight distance of transmission target
CN112642163A (en) * 2020-12-22 2021-04-13 上海米哈游天命科技有限公司 Motion trajectory prediction method and device, electronic equipment and storage medium
CN112619163B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Flight path control method and device, electronic equipment and storage medium
CN112619164B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Method, device, equipment and storage medium for determining flying height of transmission target
CN113413597A (en) * 2021-06-21 2021-09-21 网易(杭州)网络有限公司 Virtual item assembling method and device, computer equipment and storage medium
CN115953930B (en) * 2023-03-16 2023-06-06 深圳市心流科技有限公司 Concentration training method, device, terminal and storage medium based on vision tracking

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106611436A (en) * 2016-12-30 2017-05-03 腾讯科技(深圳)有限公司 Animation resource display processing method and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110538459A (en) * 2019-09-05 2019-12-06 腾讯科技(深圳)有限公司 Method, apparatus, device and medium for throwing virtual explosives in virtual environment
CN110721468A (en) * 2019-09-30 2020-01-24 腾讯科技(深圳)有限公司 Interactive property control method, device, terminal and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
[Call of Duty Mobile] Little Li's flying dagger is here! Full tutorial on the battle throwing axe; 嗨我是KH; Bilibili Video; 2020-01-31; entire video *
Call of Duty Online lethal equipment throwing axe illustrated review: is the throwing axe useful?; 落叶; Baidu; 2015-04-09; pp. 1-2 *

Also Published As

Publication number Publication date
CN111282275A (en) 2020-06-16

Similar Documents

Publication Publication Date Title
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN110427111B (en) Operation method, device, equipment and storage medium of virtual prop in virtual environment
CN111035924B (en) Method, device and equipment for controlling props in virtual scene and storage medium
CN110755841B (en) Method, device and equipment for switching props in virtual environment and readable storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
WO2021143259A1 (en) Virtual object control method and apparatus, device, and readable storage medium
CN110585710B (en) Interactive property control method, device, terminal and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN111202975B (en) Method, device and equipment for controlling foresight in virtual scene and storage medium
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN111475029B (en) Operation method, device, equipment and storage medium of virtual prop
CN112870715B (en) Virtual item putting method, device, terminal and storage medium
CN111714893A (en) Method, device, terminal and storage medium for controlling virtual object to recover attribute value
CN111389005B (en) Virtual object control method, device, equipment and storage medium
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN112221141A (en) Method and device for controlling virtual object to use virtual prop
CN111265857A (en) Trajectory control method, device, equipment and storage medium in virtual scene
CN112933601A (en) Virtual throwing object operation method, device, equipment and medium
CN111249726B (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN113713382A (en) Virtual prop control method and device, computer equipment and storage medium
CN112316430B (en) Prop using method, device, equipment and medium based on virtual environment
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN113713383A (en) Throwing prop control method and device, computer equipment and storage medium
CN111111181A (en) Method, device and equipment for setting props in virtual environment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40023663
Country of ref document: HK

GR01 Patent grant