CN114100128B - Prop special effect display method, device, computer equipment and storage medium - Google Patents

Prop special effect display method, device, computer equipment and storage medium

Info

Publication number
CN114100128B
CN114100128B (application CN202111500232.XA)
Authority
CN
China
Prior art keywords
virtual
prop
virtual object
special effect
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111500232.XA
Other languages
Chinese (zh)
Other versions
CN114100128A (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111500232.XA priority Critical patent/CN114100128B/en
Publication of CN114100128A publication Critical patent/CN114100128A/en
Application granted granted Critical
Publication of CN114100128B publication Critical patent/CN114100128B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a prop special effect display method, a prop special effect display device, computer equipment and a storage medium, and belongs to the technical field of computers. The method comprises the following steps: displaying a view field picture of a first virtual object controlled by the local terminal equipment; displaying the prop special effect of a triggered virtual prop when the virtual prop is located in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance; and not displaying the prop special effect of the virtual prop when the triggered virtual prop is located in the visual field but the distance between the virtual prop and the first virtual object is greater than the target distance. According to the method provided by the embodiment of the application, the prop special effect of a triggered virtual prop is displayed only when the virtual prop is in the visual field and the distance between the virtual prop and the virtual object is not greater than the target distance, so that the resources required for displaying prop special effects are saved.

Description

Prop special effect display method, device, computer equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a prop special effect display method, a prop special effect display device, computer equipment and a storage medium.
Background
With the development of computer technology, online games are becoming increasingly rich and diverse. In an online game, when any virtual prop is triggered, the prop special effect of the virtual prop is displayed to improve the display effect. However, when a plurality of virtual props are triggered, the prop special effects of all of them need to be displayed, which occupies more resources.
Disclosure of Invention
The embodiment of the application provides a prop special effect display method, a device, computer equipment and a storage medium, which can save the resources occupied by displaying prop special effects. The technical solution is as follows:
in one aspect, a method for displaying special effects of props is provided, the method comprising:
displaying a view field picture of a first virtual object controlled by the local terminal equipment;
displaying the prop special effect of the virtual prop under the condition that the triggered virtual prop is positioned in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance;
and under the condition that the triggered virtual prop is positioned in the visual field and the distance between the virtual prop and the first virtual object is larger than the target distance, not displaying the prop special effect of the virtual prop.
In another aspect, there is provided a prop special effect display device, the device comprising:
the display module is used for displaying a visual field picture of the first virtual object controlled by the local terminal equipment;
the display module is further used for displaying the prop special effect of the virtual prop under the condition that the triggered virtual prop is located in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance;
the display module is further configured to not display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view and a distance between the virtual prop and the first virtual object is greater than the target distance.
In another possible implementation manner, the display module is configured to display a prop special effect of the virtual prop when the virtual prop is triggered, a second virtual object holding the virtual prop is located in the field of view, and a distance between the second virtual object and the first virtual object is not greater than the target distance.
In another possible implementation, the display module is configured to display a firing special effect of any virtual firearm if the virtual firearm is triggered, a second virtual object holding the virtual firearm is located within the field of view, and a distance between the second virtual object and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display a prop special effect of the virtual prop when the virtual prop is triggered to move within the field of view and a distance between the virtual prop and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display a special effect of movement of the virtual bullet when any virtual bullet is ejected and a distance between the virtual bullet and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display, at the contact position, a prop special effect of the virtual prop when the virtual prop is triggered and then is in contact with any virtual object, the contact position is within the field of view, and a distance between the contact position and a position of the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display, at a contact position of any virtual bullet, a bullet hole special effect of the virtual bullet if the contact position is within the field of view and a distance between the contact position and a position of the first virtual object is not greater than the target distance after the virtual bullet is ejected.
In another possible implementation manner, the display module includes:
a determining unit configured to determine a bullet hole special effect matching the virtual article in a case where the virtual bullet is in contact with the virtual article after being ejected, the contact position is within the field of view, and a distance between the contact position and a position of the first virtual object is not greater than the target distance;
and the display unit is used for displaying the determined bullet hole special effect at the contact position.
In another possible implementation, the apparatus further includes:
the acquisition module is used for acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop under the condition that the virtual prop is triggered;
and the display module is used for displaying the prop special effect of the virtual prop under the condition that the position of the second virtual object is in the visual field and the distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance.
In another possible implementation, the apparatus further includes:
the acquisition module is used for acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop under the condition that the virtual prop is triggered;
The determining module is used for determining the position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object and the relative position relation between the second virtual object and the virtual prop;
and the display module is used for displaying the special prop effect of the virtual prop under the condition that the position of the virtual prop is in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
In another possible implementation, the apparatus further includes:
the acquisition module is used for acquiring the position of the first virtual object and the position of a third virtual object triggering the virtual prop under the condition that the virtual prop is triggered and starts to move;
the determining module is used for determining the current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop;
and the display module is used for displaying the special moving effect of the virtual prop under the condition that the virtual prop moves in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
In another aspect, a computer device is provided that includes a processor and a memory having at least one computer program stored therein, the at least one computer program loaded and executed by the processor to perform operations performed by the prop special effect display method as described in the above aspect.
In another aspect, there is provided a computer readable storage medium having stored therein at least one computer program loaded and executed by a processor to perform the operations performed by the prop special effect display method as described in the above aspect.
In yet another aspect, a computer program product is provided, comprising a computer program that, when executed by a processor, performs the operations performed by the prop special effect display method as described in the above aspect.
The technical solutions provided by the embodiments of the present application have at least the following beneficial effects:
according to the method, the device, the computer equipment and the storage medium, whether the prop special effect of the triggered virtual prop is displayed is determined based on the visual field and the target distance of the virtual object controlled by the local equipment, and the prop special effect of the virtual prop is displayed only when the triggered virtual prop is in the visual field of the virtual object controlled by the local equipment and the distance between the virtual prop and the virtual object is smaller than the target distance, and the prop special effect of the virtual prop is not required to be displayed but is larger than the target distance, so that resources required to be occupied by displaying the prop special effect are saved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application;
FIG. 2 is a flowchart of a method for displaying special effects of props provided in an embodiment of the present application;
FIG. 3 is a flowchart of a method for displaying special effects of props provided in an embodiment of the present application;
fig. 4 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 5 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 7 is a flowchart of a method for displaying special effects of props provided in an embodiment of the present application;
FIG. 8 is a flowchart of a method for displaying special effects of props provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a bullet hole effect provided in an embodiment of the present application;
FIG. 10 is a schematic diagram of a bullet hole effect provided in an embodiment of the present application;
FIG. 11 is a schematic diagram of a bullet hole effect provided in an embodiment of the present application;
FIG. 12 is a flowchart of a method for displaying special effects of props provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of a firing effect and a movement effect provided in an embodiment of the present application;
FIG. 14 is a schematic diagram of a firing effect and a movement effect provided in an embodiment of the present application;
FIG. 15 is a flowchart of a method for displaying special effects of props provided in an embodiment of the present application;
fig. 16 is a schematic structural diagram of a prop special effect display device provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of a prop special effect display device provided in an embodiment of the present application;
fig. 18 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms "first," "second," "third," and the like, as used herein, may be used to describe various concepts, but are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. For example, a first virtual object may be referred to as a second virtual object, and similarly, a second virtual object may be referred to as a first virtual object, without departing from the scope of the present application.
The terms "at least one," "a plurality," "each," "any one," as used herein, include one, two or more, a plurality includes two or more, and each refers to each of a corresponding plurality, any one referring to any one of the plurality. For example, the plurality of virtual objects includes 3 virtual objects, and each refers to each of the 3 virtual objects, and any one refers to any one of the 3 virtual objects, which can be the first virtual object, or the second virtual object, or the third virtual object.
In order to facilitate understanding of the embodiments of the present application, some terms related to the embodiments of the present application are explained first:
and (3) a mobile terminal: including cell phones, tablets, or other hand-held portable gaming devices.
Shooting games: shooting games include first-person shooting games, third-person shooting games, or other games that use hot weapons to carry out ranged attacks.
Virtual scene: the virtual scene that an application program displays (or provides) while running on a terminal. The virtual scene is a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene is any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, which is not limited in this application. For example, a virtual scene includes sky, land, sea, and the like; the land includes environmental elements such as deserts and cities; and a user can control a virtual object to move in the virtual scene. The virtual scene also includes virtual items such as throwing objects, buildings, and carriers, as well as props such as weapons that virtual objects need to arm themselves or fight other virtual objects, and the virtual scene can also simulate real environments under different weather, such as sunny, rainy, foggy, or night conditions. The variety of scene elements enhances the diversity and realism of virtual scenes.
Virtual object: a character that can move in the virtual scene; the movable object is a virtual character, a virtual animal, a cartoon character, or the like. The virtual object is a virtual avatar that represents a user in the virtual scene. The virtual scene includes a plurality of virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies part of the space in the virtual scene. Alternatively, the virtual object is a character controlled through operations on a client, an artificial intelligence (AI) character set in the virtual-environment fight through training, or a non-player character (NPC) set in the virtual-scene fight. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in the virtual-scene fight is preset, or is dynamically determined according to the number of clients joining the fight, which is not limited in the embodiment of the present application.
Alternatively, the user can control the virtual object to move in the virtual scene. For example, in a shooting game, the user controls the virtual object to fall freely, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or bend down on land; or to swim, float, or dive in the ocean; the user can also control the virtual object to move in the virtual scene while riding a carrier. The user can also control the virtual object to enter and exit buildings in the virtual scene, and to find and pick up virtual props (such as throwing objects and weapons) in the virtual scene, so as to fight other virtual objects with the picked-up virtual props. For example, the virtual prop is clothing, a helmet, body armor, a medical article, a cold weapon, or a hot weapon, or is a virtual prop left behind after another virtual object is eliminated. The above scenarios are merely examples, and the embodiments of the present application are not specifically limited thereto.
Virtual prop: a prop that a virtual object can use in the virtual scene. Taking shooting games as an example, a shooting game provides throwing props such as virtual bombs and virtual grenades, as well as shooting props such as virtual firearms and virtual crossbows; the virtual bullets shot by virtual firearms and the virtual arrows shot by virtual crossbows can also be called virtual props. Both throwing props and shooting props can damage the attacked virtual object. A virtual prop can also assist the virtual object in achieving a certain purpose; for example, a smoke bomb can assist the virtual object in concealing its figure. It should be noted that the embodiments of the present application do not limit the types of virtual props.
The prop special effect display method provided by the embodiments of the application is executed by computer equipment. Optionally, the computer device is a terminal or a server. Optionally, the server is a stand-alone physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Network, content delivery networks), big data and artificial intelligence platforms. Optionally, the terminal is a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart voice interaction device, a smart home appliance, a vehicle-mounted terminal, or the like, but is not limited thereto.
In some embodiments, the computer program related to the embodiments of the present application may be deployed to be executed on one computer device or on multiple computer devices located at one site, or on multiple computer devices distributed across multiple sites and interconnected by a communication network, where the multiple computer devices distributed across multiple sites and interconnected by the communication network can form a blockchain system.
In some embodiments, the computer device is provided as a terminal. FIG. 1 is a schematic diagram of an implementation environment provided by embodiments of the present application. Referring to fig. 1, the implementation environment includes at least one terminal 101 (2 are illustrated in fig. 1) and a server 102. The terminal 101 and the server 102 are connected by a wireless or wired network.
The terminal 101 has installed thereon a target application served by the server 102, and the target application supports virtual scene display, for example, any one of a role-playing game (RPG) and a multiplayer online battle arena (MOBA) game. The terminal 101 is a terminal used by any user, and the user uses the terminal 101 to operate a virtual object located in the virtual scene to perform activities, the activities including at least one of crawling, walking, running, jumping, driving, picking up, shooting, attacking, and throwing.
In one possible implementation, different users use different terminals to control virtual objects, and the virtual objects controlled by the different terminals are located in the same virtual scene, where the different virtual objects can perform activities. In this application, the first terminal controlling the first virtual object is taken as an example: the first terminal displays the view field picture of the first virtual object, and in the virtual scene the first virtual object, or a virtual object controlled by another terminal, can trigger a virtual prop. If any virtual prop is triggered, the virtual prop is in the field of view of the first virtual object, and the distance between the virtual prop and the first virtual object is not greater than the target distance, the prop special effect of the virtual prop is displayed in the view field picture. If the virtual prop is triggered but is not in the field of view of the first virtual object, or the virtual prop is in the field of view but the distance between the virtual prop and the first virtual object is greater than the target distance, the prop special effect of the virtual prop is not displayed.
Fig. 2 is a flowchart of a prop special effect display method provided in an embodiment of the present application, which is executed by a terminal, and as shown in fig. 2, the method includes:
201. The terminal displays a view field picture of the first virtual object controlled by the local terminal device.
The first virtual object is a virtual object controlled by the terminal, for example, the virtual object is a virtual character, a virtual animal, a virtual vehicle, or the like. The visual field screen displays a virtual scene observed from the visual angle of the first virtual object, for example, the visual field screen is a screen corresponding to a visual field displayed from the first person visual angle of the first virtual object, or a screen corresponding to a visual field displayed from the third person visual angle of the first virtual object. For example, the first virtual object is a virtual character, the virtual scene is watched by a first person perspective of the first virtual object, and a view field picture of the first virtual object is displayed so as to simulate a view field which can be observed by a real person to observe a real scene. For example, the view screen includes a virtual building, a vehicle, a virtual object, or the like.
202. And the terminal displays the prop special effect of the virtual prop under the condition that the triggered virtual prop is positioned in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance.
The virtual prop is any type of prop in the virtual scene, and the virtual prop can be triggered. For example, the virtual prop is a virtual firearm, a virtual grenade, a virtual bullet, or the like. The target distance is an arbitrary distance, for example, the target distance is 100 meters or 200 meters, or the like. The prop special effect is an effect which is presented when the virtual prop is triggered, for example, the virtual prop is a virtual firearm, and the prop special effect of the virtual firearm is a firing special effect so as to present an effect that the virtual prop fires. For another example, the virtual prop is a virtual bullet, and the prop special effect of the virtual bullet is a moving special effect or a bullet hole special effect, the moving special effect can show the moving effect of the virtual bullet, and the bullet hole special effect can show the bullet hole formed by the virtual bullet.
In this embodiment of the present application, the field of view includes the portion of the virtual scene observed from the perspective of the first virtual object; the field of view of the first virtual object corresponds to the field of view screen of the first virtual object, and the field of view screen displays the virtual scene in the field of view of the first virtual object. The triggered virtual prop being within the field of view of the first virtual object indicates that the triggered virtual prop can be observed. The target distance represents the maximum distance at which a prop special effect can be presented in the visual field; that is, when the distance between a triggered virtual prop in the visual field and the first virtual object is not greater than the target distance, the virtual prop can be seen in the visual field and the prop special effect produced when the virtual prop is triggered can also be seen, so the prop special effect of the virtual prop is displayed in the visual field picture.
203. And under the condition that the triggered virtual prop is positioned in the visual field and the distance between the virtual prop and the first virtual object is larger than the target distance, the terminal does not display the prop special effect of the virtual prop.
In the embodiment of the application, when the distance between a triggered virtual prop in the visual field and the first virtual object is greater than the target distance, the virtual prop can be seen in the visual field, but the prop special effect produced when the virtual prop is triggered cannot be seen clearly; even if the prop special effect were displayed, its display effect would not be obvious. Therefore, the prop special effect of the virtual prop is not displayed in the visual field picture, which saves the resources occupied by displaying the prop special effect.
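To make the gating in steps 202 and 203 concrete, the following is a minimal Python sketch under stated assumptions; the names (Vector3, should_display_prop_effect, TARGET_DISTANCE) are illustrative and are not identifiers from the patent, and the 100-metre value is only one of the example target distances mentioned above.

    import math
    from dataclasses import dataclass

    TARGET_DISTANCE = 100.0  # e.g. 100 metres; 200 metres is another example value


    @dataclass
    class Vector3:
        x: float
        y: float
        z: float

        def distance_to(self, other: "Vector3") -> float:
            # Euclidean distance between two positions in the virtual scene
            return math.sqrt((self.x - other.x) ** 2 +
                             (self.y - other.y) ** 2 +
                             (self.z - other.z) ** 2)


    def should_display_prop_effect(prop_position: Vector3,
                                   viewer_position: Vector3,
                                   prop_in_field_of_view: bool) -> bool:
        """Step 202: show the prop special effect only when the triggered prop
        is inside the first virtual object's field of view AND no farther away
        than the target distance. Step 203: otherwise, skip the effect."""
        if not prop_in_field_of_view:
            return False
        return prop_position.distance_to(viewer_position) <= TARGET_DISTANCE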
According to the method provided by the embodiment of the application, whether the prop special effect of a triggered virtual prop is displayed is determined based on the field of view of the virtual object controlled by the local device and the target distance. The prop special effect is displayed only when the triggered virtual prop is in the field of view of the virtual object controlled by the local device and the distance between the virtual prop and the virtual object is not greater than the target distance; when the distance between the virtual prop and the virtual object is greater than the target distance, the prop special effect does not need to be displayed, so that the resources occupied by displaying prop special effects are saved.
Based on the embodiment shown in fig. 2, the virtual prop is a virtual prop that can be held by a virtual object, and when the virtual prop is triggered, it is determined whether the position of a second virtual object holding the virtual prop is in the field of view and whether the distance between the second virtual object and the first virtual object is greater than the target distance, so as to determine whether to display the prop special effect of the virtual prop.
Fig. 3 is a flowchart of a prop special effect display method provided in an embodiment of the present application, which is executed by a terminal, and as shown in fig. 3, the method includes:
301. The terminal displays a view field picture of the first virtual object controlled by the local terminal device.
In the embodiment of the present application, a virtual scene included in a field of view of a first virtual object is presented in a field of view screen displayed by a terminal.
In one possible implementation, the distance between any position within the field of view and the position of the first virtual object is no greater than the visual distance.
The visual distance is the farthest distance that the first virtual object can observe, and is any distance, for example, the visual distance may be 1000 meters. The field of view includes only locations that are no greater than a visual distance from the location of the first virtual object and that are observable within the field of view of the first virtual object. For example, the distance between the first location and the location of the first virtual object is no greater than the visual distance, but the first location is behind the first virtual object, then the first location is not observed within the field of view of the first virtual object, and then the first location is not within the field of view. Or, the distance between any first position and the position of the first virtual object is not greater than the visual distance, but an obstacle exists between the first position and the first virtual object, when the virtual scene is observed in the view of the first virtual object, the first position is blocked by the obstacle, so that the first position cannot be observed, and the first position is not in the view. As shown in fig. 4, there is an obstacle 403 between the virtual object 401 and the virtual object 402, and even if the distance between the virtual object 401 and the virtual object 402 is close, the virtual object 402 is not within the field of view of the virtual object 401.
Optionally, ray casting is used to determine whether an obstacle exists between two positions, that is: a ray is emitted from the position of the first virtual object toward the first position; if the ray does not contact anything on its way to the first position, it is determined that no obstacle exists between the first position and the position of the first virtual object; if the ray contacts something before reaching the first position, it is determined that an obstacle exists between the first position and the position of the first virtual object. As shown in fig. 5, the first virtual object shoots based on a shooting prop, and a ray 501 is emitted from the position of the muzzle shown in fig. 5 (the lens here refers to the muzzle of the virtual firearm). In fig. 5, the ray 501 collides with the collision box of the box beside the virtual object, which indicates that the ray 501 does not hit the virtual object to be detected, but hits the box beside it.
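The line-of-sight test above can be sketched as follows; cast_ray is a hypothetical stand-in for the game engine's physics ray cast and is not an API named in the patent.

    from typing import Optional


    def cast_ray(origin, target) -> Optional[object]:
        """Return the first collider hit on the way from origin to target, or
        None if nothing is hit. In practice this would call the game engine's
        physics ray-cast facility."""
        raise NotImplementedError  # engine-specific


    def has_obstacle_between(viewer_position, candidate_position) -> bool:
        """Emit a ray from the first virtual object toward the candidate
        position; if the ray contacts something before reaching that position,
        an obstacle blocks the line of sight (as with the box in FIG. 5)."""
        return cast_ray(viewer_position, candidate_position) is not None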
In one possible implementation, the first virtual object corresponds to a virtual camera, and the step 301 includes: and the terminal displays the virtual scene shot by the virtual camera.
The virtual scene shot by the virtual camera is the visual field picture of the first virtual object. Optionally, the view field picture of the first virtual object is displayed with a first person viewing angle or a third person viewing angle, the virtual scene shot by the first virtual camera is the view field picture displayed with the first person viewing angle, and the virtual scene shot by the second virtual camera is the view field picture displayed with the third person viewing angle.
Wherein the first virtual camera is located at a different position than the second virtual camera. For example, the position of the first virtual camera is the same as the position of the eyes of the first virtual object, the position of the second virtual camera is above and behind the head of the first virtual object, only the arm and the hand-held virtual prop of the first virtual object are displayed in the view field screen displayed in the first person viewing angle, and the back, the arm and the hand-held virtual prop of the first virtual object can be displayed in the view field screen displayed in the third person viewing angle.
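As a small illustration of the two camera placements described above: positions here are (x, y, z) tuples, and the offsets (and the assumption that the negative z direction points backward) are assumed values rather than figures from the patent.

    def first_person_camera_position(eye_position):
        # The first virtual camera coincides with the first virtual object's eyes.
        return eye_position


    def third_person_camera_position(head_position,
                                     height_offset=1.5, back_offset=2.0):
        # The second virtual camera sits above and behind the head; the exact
        # offsets are illustrative assumptions.
        x, y, z = head_position
        return (x, y + height_offset, z - back_offset)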
302. And the terminal displays the prop special effect of the virtual prop under the conditions that the virtual prop is triggered, the second virtual object holding the virtual prop is positioned in the visual field of the first virtual object, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
In the embodiment of the present application, the second virtual object and the first virtual object are located in the same virtual scene. Optionally, the second virtual object is a virtual object controlled by other terminals, or is a non-player character. For example, the virtual scene is a multi-player competition scene, each player controls one virtual object through a terminal, and a plurality of players can realize multi-player competition in the virtual scene through the controlled plurality of virtual objects. For another example, the virtual scene is a stand-alone scene, in which only the first virtual object is a virtual object controlled by the player through the terminal, and the other virtual objects in the virtual scene are non-player characters, and the player performs a fight against the non-player characters in the virtual scene through the controlled first virtual object.
The virtual prop is a virtual prop which can be held by a virtual object in a virtual scene, for example, the virtual prop is a shooting prop such as a virtual firearm, a virtual bow and crossbow, or the virtual prop is a throwing prop such as a virtual bomb, a virtual grenade and the like. The virtual prop can be triggered when the virtual object holds the virtual prop, for example, the virtual prop is a virtual firearm, and shooting can be performed based on the held virtual firearm when the virtual object holds the virtual firearm; or the virtual prop is a virtual grenade, and the lead wire of the virtual grenade can be pulled out when the virtual grenade is held by a virtual object; alternatively, the virtual prop is a combustion flask, and the virtual object can ignite the handheld combustion flask when holding the combustion flask.
For different virtual props, prop special effects are different when the virtual props are triggered. For example, the virtual prop is a virtual grenade, and when the virtual object pulls out the lead of the handheld virtual grenade, the virtual grenade can smoke, i.e. the prop special effect when the virtual grenade is triggered is a smoke special effect; or the virtual prop is a combustion bottle, and when the virtual object ignites the handheld combustion bottle, the combustion bottle can show a combustion special effect, namely the prop special effect when the combustion bottle is triggered is the combustion special effect.
When the virtual prop is triggered, it is held by the second virtual object, and the position of the second virtual object holding the virtual prop is close to the position of the virtual prop, so the position of the second virtual object can be used as the position of the virtual prop. Determining whether the second virtual object is in the view of the first virtual object therefore determines whether the virtual prop is in the view, and determining whether the distance between the second virtual object and the first virtual object is greater than the target distance determines whether the distance between the virtual prop and the first virtual object is greater than the target distance, so that whether to display the prop special effect when the virtual prop is triggered can be determined.
In one possible implementation, this step 302 includes: and loading the special prop effect of the virtual prop by the terminal under the conditions that the virtual prop is triggered, a second virtual object holding the virtual prop is positioned in the visual field, and the distance between the second virtual object and the first virtual object is not greater than the target distance, and displaying the loaded special prop effect.
When it is determined that the prop special effect of the virtual prop is to be displayed, the prop special effect of the virtual prop is loaded and, once loading is complete, displayed, so as to present the effect produced when the virtual prop is triggered.
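A sketch of this load-then-display behaviour, caching effect assets so that they are loaded only when the visibility and distance checks say the effect will actually be shown; load_effect_asset and play_effect are hypothetical helpers standing in for engine calls.

    _effect_cache: dict = {}


    def load_effect_asset(effect_id: str):
        """Load the special-effect asset (particle system, animation, ...)."""
        raise NotImplementedError  # resource-system specific


    def play_effect(asset, position) -> None:
        """Instantiate and play the loaded effect at the given position."""
        raise NotImplementedError  # engine-specific


    def display_prop_effect(effect_id: str, position) -> None:
        asset = _effect_cache.get(effect_id)
        if asset is None:
            # Loaded lazily: effects that never pass the visibility/distance
            # check never consume loading resources.
            asset = load_effect_asset(effect_id)
            _effect_cache[effect_id] = asset
        play_effect(asset, position)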
In one possible implementation, the positions within the field of view of the first virtual object whose distance from the position of the first virtual object is not greater than the target distance constitute the special effect visual range. If any virtual prop is triggered and the virtual prop is within the special effect visual range, the prop special effect of the virtual prop is displayed in the visual field picture; if any virtual prop is triggered and the virtual prop is not within the special effect visual range, the prop special effect of the virtual prop is not displayed in the visual field picture. That is, when a virtual prop is not within the special effect visual range, the display effect of its prop special effect would not be obvious even if it were displayed; therefore, prop special effects outside the special effect visual range are not displayed, which does not degrade the display effect and at the same time saves the resources occupied by special effect display. In the embodiment of the application, only the prop special effects of triggered virtual props within the special effect visual range are displayed, and no prop special effect is displayed when a virtual prop in the visual field is triggered outside the special effect visual range, so the prop special effects of virtual props do not need to be loaded frequently, and the resources required for displaying special effects are saved.
In one possible implementation, this step 302 includes: and displaying the firing special effect of the virtual firearm under the condition that any virtual firearm is triggered, a second virtual object of the handheld virtual firearm is positioned in the visual field, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
In the embodiment of the application, the virtual prop is a virtual firearm, the second virtual object shoots based on the handheld virtual firearm, and if the second virtual object is in the visual field and in the special effect visual range, the firing special effect of the virtual firearm is displayed in the visual field picture, for example, the muzzle of the virtual firearm is displayed to emit fire.
303. The terminal does not display the prop special effect of the virtual prop under the condition that the virtual prop is triggered, the second virtual object holding the virtual prop is positioned in the visual field, and the distance between the second virtual object and the first virtual object is greater than the target distance.
In some embodiments, this step 303 includes: not displaying the firing special effect of the virtual firearm under the condition that any virtual firearm is triggered, a second virtual object holding the virtual firearm is positioned in the visual field, and the distance between the second virtual object and the first virtual object is greater than the target distance.
In one possible implementation, after step 301, the position of the second virtual object holding the virtual prop is acquired, and whether to display the prop special effect of the virtual prop is then determined according to the position of the second virtual object; the process includes the following steps 304-306:
304. The terminal acquires the position of the first virtual object and the position of the second virtual object holding the virtual prop under the condition that any virtual prop is triggered.
Wherein the position of the first virtual object represents the position of the first virtual object in the virtual scene and the position of the second virtual object represents the position of the second virtual object in the virtual scene. The position can be represented in any form, for example, the position is represented in the form of coordinates.
In one possible implementation, this step 304 includes: and under the condition that any virtual prop is triggered, the terminal checks the position of the first virtual object controlled by the terminal and the position of the second virtual object holding the virtual prop.
In the embodiment of the application, the terminal can check the position of any virtual object in the virtual scene. And under the condition that any virtual prop is triggered, the terminal detects the positions of the controlled first virtual object and the second virtual object holding the virtual prop in the virtual scene.
Optionally, the process of obtaining the position of the virtual object by the terminal includes: and the terminal responds to the trigger instruction of any virtual prop, and the position of the first virtual object controlled by the terminal and the position of the second virtual object holding the virtual prop are checked.
The triggering instruction indicates to trigger the virtual prop.
For example, in a stand-alone game scenario, the second virtual object is a non-player character, the first virtual object performs a fight against the non-player character in the virtual scenario, and during the fight, the terminal generates a trigger instruction based on the game logic of the running stand-alone game, where the trigger instruction instructs a non-player character to trigger the handheld virtual prop, and then the terminal acquires the position of the non-player character and the position of the first virtual object.
In one possible implementation, the second virtual object is a virtual object controlled by another terminal, and then the step 304 includes: the terminal receives a trigger instruction of any virtual object and the position of a second virtual object holding the virtual prop, which are synchronized by the server, and acquires the position of the first virtual object.
The trigger instruction is sent to the server by the terminal controlling the second virtual object.
In the embodiment of the application, each terminal controls one virtual object, and the virtual objects controlled by a plurality of terminals are located in the same virtual scene. Any terminal can detect the position of its controlled virtual object in the virtual scene and synchronize the detected position to the server. When the terminal detects a trigger operation on the virtual prop held by its controlled virtual object, it also synchronizes the trigger instruction of the virtual prop to the server. The server receives the position of the virtual object and the trigger instruction of the virtual prop synchronized by the terminal, and synchronizes them to the other terminals, so that the other terminals can subsequently determine whether to display the prop special effect of the triggered virtual prop.
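The synchronization described above might look roughly like the following on the wire; the message fields and the relay function are assumptions for illustration, not the patent's protocol.

    import json
    from dataclasses import dataclass, asdict


    @dataclass
    class PropTriggerMessage:
        object_id: str    # the virtual object holding the prop
        prop_id: str      # which virtual prop was triggered
        position: tuple   # (x, y, z) of the holder when triggered


    def encode(message: PropTriggerMessage) -> str:
        return json.dumps(asdict(message))


    def relay_to_other_terminals(raw_message: str, other_terminals) -> None:
        """Server side: forward the position and trigger instruction to every
        other terminal, so each can decide locally whether to display the
        prop special effect."""
        for terminal in other_terminals:
            terminal.send(raw_message)  # `send` is a placeholder transport call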
305. And the terminal displays the prop special effect of the virtual prop under the condition that the position of the second virtual object is in the visual field and the distance between the position of the second virtual object and the position of the first virtual object is not more than the target distance.
In the embodiment of the application, after the terminal obtains the position of the second virtual object, when that position is within the field of view of the first virtual object and the distance between it and the position of the first virtual object is not greater than the target distance, the position of the second virtual object is determined to be within the special effect visual range, that is, the virtual prop is determined to be within the special effect visual range, so the prop special effect of the virtual prop can be displayed. The position of the virtual object holding the virtual prop is used as the position of the virtual prop, and whether the prop special effect of the virtual prop needs to be displayed is judged based on the position of the virtual object, which simplifies the process of determining the position of the virtual prop, so the prop special effect of the virtual prop can be displayed in time.
In one possible implementation, the prop special effect of the virtual prop is a special effect with a display range.
The prop special effect of the virtual prop is a special effect with a display range, so even if the triggered virtual prop itself is not in the visual field of the first virtual object but the second virtual object holding the virtual prop is in the visual field, the prop special effect of the virtual prop can still be displayed, which ensures the accuracy of displaying the prop special effect. As shown in fig. 6, the virtual object 601 is in the visual field of the first virtual object and within the special effect visual range, but the virtual firearm held by the virtual object 601 is not in the visual field; when the virtual object 601 shoots based on the held virtual firearm, the firing special effect of the virtual firearm is loaded and displayed, and a part of the firing special effect can be seen at the edge of the visual field.
In one possible implementation, determining whether the position of the second virtual object is within the field of view includes: projecting the position of the second virtual object onto the view angle picture shot by the virtual camera corresponding to the first virtual object; if the position of the second virtual object projects onto the view angle picture, it is determined that the position of the second virtual object is within the field of view, and if the position of the second virtual object does not project onto the view angle picture, it is determined that the position of the second virtual object is not within the field of view.
In this embodiment of the present application, the virtual scene included in the view angle picture shot by the virtual camera is the virtual scene in the view field of the first virtual object. And projecting the position of the second virtual object to a view angle picture shot by the virtual camera so as to determine whether the position of the second virtual object is in the view field of the first virtual object.
Optionally, the process of determining whether the position of the second virtual object is within the field of view of the first virtual object based on the position of the second virtual object and the view angle picture shot by the virtual camera includes: determining the position of the virtual camera in the virtual scene and the shooting direction of the virtual camera; mapping the position of the second virtual object onto the plane where the virtual camera is located, based on the mapping relation between the first coordinate system and the second coordinate system, to obtain the mapping position corresponding to the position of the second virtual object; determining that the position of the second virtual object is within the field of view if the mapping position is within the view angle range of the virtual camera, and determining that the position of the second virtual object is not within the field of view if the mapping position is not within the view angle range of the virtual camera.
The first coordinate system is a coordinate system in the virtual scene, the first coordinate system takes any point in the virtual scene as an origin, the position of the second virtual object and the position of the virtual camera are both represented by coordinates in the first coordinate system, the second coordinate system is a coordinate system of the virtual camera, the second coordinate system takes an intersection point of the shooting direction of the virtual camera and the virtual lens as the origin, the mapping position is represented by coordinates in the second coordinate system, and the mapping relation is used for mapping the coordinates in the first coordinate system to the coordinates in the second coordinate system. The view angle range is a range captured by the virtual camera under the second coordinate system, for example, the view angle range is expressed in a form of a matrix, and a position capable of being projected in the view angle range is a position captured by the virtual camera. Thus, based on whether the projected position is within the view angle range, it is determined whether the position of the second virtual object is within the field of view of the first virtual object.
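A simplified sketch of that mapping and view-range test: the world-space position is expressed in the camera's own basis and then compared against the camera's horizontal and vertical viewing angles. The basis vectors and field-of-view values are assumptions; a real engine would expose equivalent view and projection matrices.

    import math


    def world_to_camera(point, camera_position, right, up, forward):
        """Express a world-space point in the camera (second) coordinate system
        by projecting the offset from the camera onto the camera's basis vectors."""
        def dot(a, b):
            return sum(p * q for p, q in zip(a, b))

        offset = tuple(p - c for p, c in zip(point, camera_position))
        return dot(offset, right), dot(offset, up), dot(offset, forward)


    def in_view_range(point, camera_position, right, up, forward,
                      horizontal_fov_deg=90.0, vertical_fov_deg=60.0) -> bool:
        x, y, z = world_to_camera(point, camera_position, right, up, forward)
        if z <= 0:
            return False  # behind the camera, so outside the field of view
        within_horizontal = abs(math.degrees(math.atan2(x, z))) <= horizontal_fov_deg / 2
        within_vertical = abs(math.degrees(math.atan2(y, z))) <= vertical_fov_deg / 2
        return within_horizontal and within_vertical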
306. And under the condition that the position of the second virtual object is in the visual field and the distance between the position of the second virtual object and the position of the first virtual object is larger than the target distance, the terminal does not display the special prop effect of the virtual prop.
This step 306 is similar to the step 305 described above, and will not be described again.
In the embodiment of the present application, after the terminal obtains the position of the first virtual object and the position of the second virtual object, the terminal determines whether the position of the second virtual object is within the field of view and whether the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance. In another embodiment, the server makes these determinations: when the server determines that the position of the second virtual object is within the field of view of the first virtual object and the distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance, the server sends a display notification to the terminal, and the terminal displays the prop special effect of the virtual prop in the visual field picture based on the display notification.
In one possible implementation, the server interacts with the terminal, and the process of displaying the prop special effect of the virtual prop by the terminal includes: when another terminal detects a trigger operation on the virtual prop held by its controlled second virtual object, it synchronizes the position of the second virtual object and the trigger instruction of the virtual prop to the server; the server receives the position of the second virtual object and the trigger instruction synchronized by the other terminal, determines the field of view of the first virtual object based on the position and the sight direction of the first virtual object, and determines, based on the position of the second virtual object, whether that position is within the field of view of the first virtual object and whether the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance; when the server determines that the position of the second virtual object is within the field of view of the first virtual object and the distance is not greater than the target distance, the server sends a display notification to the terminal, and the terminal receives the display notification and displays the prop special effect of the virtual prop in the visual field picture based on the display notification.
Wherein the display notification indicates that the virtual prop held by the second virtual object is triggered and within the special effect visual range of the first virtual object.
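A sketch of this server-side variant, in which the server runs the same checks and notifies only the terminals that should render the effect; the terminal attributes and methods used here are assumptions for illustration.

    import math

    TARGET_DISTANCE = 100.0  # assumed configured value


    def notify_terminals(trigger_position, prop_id, terminals) -> None:
        for terminal in terminals:
            viewer_position = terminal.first_object_position          # assumed attribute
            in_view = terminal.is_in_field_of_view(trigger_position)  # assumed method
            close_enough = math.dist(trigger_position, viewer_position) <= TARGET_DISTANCE
            if in_view and close_enough:
                # Only terminals for which the prop is inside the special effect
                # visual range receive the display notification.
                terminal.send_display_notification(prop_id, trigger_position)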
According to the method provided by the embodiment of the application, whether the prop special effect of a triggered virtual prop is displayed is determined based on the field of view of the virtual object controlled by the local device and the target distance. The prop special effect is displayed only when the triggered virtual prop is in the field of view of the virtual object controlled by the local device and the distance between the virtual prop and the virtual object is not greater than the target distance; when the distance between the virtual prop and the virtual object is greater than the target distance, the prop special effect does not need to be displayed, so that the resources occupied by displaying prop special effects are saved.
In addition, the position of the virtual object holding the virtual prop is used as the position of the virtual prop, so that whether the prop special effect of the virtual prop needs to be displayed can be conveniently determined based on the position of the virtual object. This simplifies the process of determining the position of the virtual prop, so that the prop special effect of the virtual prop can be displayed in a timely manner.
Moreover, the prop special effect of the virtual prop is a special effect with a display range. Even if the triggered virtual prop itself is not within the field of view of the first virtual object, as long as the second virtual object holding the virtual prop is within the field of view, the prop special effect of the virtual prop can be displayed, which ensures the accuracy of displaying the prop special effect.
Based on the embodiment shown in fig. 2, the virtual prop can move after being triggered, and the prop special effect of the virtual prop is displayed only when the virtual prop moves within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance; the specific process is described in the following embodiment.
Fig. 7 is a flowchart of a prop special effect display method provided in an embodiment of the present application, which is executed by a terminal, and as shown in fig. 7, the method includes:
701. the terminal displays a view field picture of the first virtual object controlled by the local terminal device.
This step is the same as steps 201 and 301 described above and will not be described again here.
702. The terminal displays the prop special effect of the virtual prop when the virtual prop moves within the field of view of the first virtual object after being triggered and the distance between the virtual prop and the first virtual object is not greater than the target distance.
In the embodiment of the application, the virtual prop can move after being triggered. If the virtual prop moves within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object during the movement is not greater than the target distance, the prop special effect of the virtual prop is displayed in the field-of-view picture. Optionally, in the case of movement of the virtual prop, the prop special effect displayed is a movement special effect. For example, the virtual prop is a burning flask thrown by a virtual object, and the burning special effect of the burning flask is displayed while the burning flask moves.
In one possible implementation, the step 702 includes: in the case where any virtual bullet is moved within the visual field after being ejected and the distance between the virtual bullet and the first virtual object is not greater than the target distance, the movement special effect of the virtual bullet is displayed.
Wherein the virtual bullet is ejected by any virtual object in the virtual scene based on a handheld virtual firearm, and that virtual object may or may not be within the field of view. The movement special effect of the virtual bullet is any type of special effect; for example, the movement special effect of the virtual bullet is a flame effect. For example, if a virtual bullet moves within the field of view and the distance between the virtual bullet and the first virtual object is not greater than the target distance throughout the movement, the movement special effect of the virtual bullet is displayed, presenting the trajectory of the virtual bullet.
Optionally, the virtual bullet is ejected and then moved until the virtual bullet contacts any virtual object or the distance of movement reaches a maximum distance. Optionally, the virtual bullet is shot out and then makes a parabolic motion. The maximum distance represents the range of the virtual firearm from which the virtual bullet was ejected.
703. When the virtual prop moves within the field of view after being triggered and the distance between the virtual prop and the first virtual object is greater than the target distance, the terminal does not display the prop special effect of the virtual prop.
In one possible implementation, this step 703 includes: in the case where any virtual bullet is moved in the visual field after being ejected and the distance between the virtual bullet and the first virtual object is greater than the target distance, the special effect of movement of the virtual bullet is not displayed in the visual field screen.
In one possible implementation, after step 701, the position of the virtual prop is first determined, and then whether to display the prop special effect of the virtual prop is determined according to the position of the virtual prop. The process includes the following steps 704-707:
704. and the terminal acquires the position of the first virtual object and the position of the third virtual object triggering the virtual prop under the condition that the virtual prop is triggered and starts to move.
In this embodiment of the present application, the third virtual object is any virtual object in a virtual scene, and the virtual prop is a movable virtual prop. For example, the virtual prop is a virtual bullet, and the virtual bullet starts moving after being ejected. For another example, the virtual prop is a virtual mine; after the virtual mine is triggered and thrown by the virtual object, it starts to move.
This step is similar to step 304 described above and will not be described again.
705. And the terminal determines the current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop.
The movement direction of the virtual prop can indicate which direction the virtual prop moves. For example, the virtual prop is a virtual bullet, and the moving direction of the virtual prop is the shooting direction when the virtual firearm shoots the virtual bullet; or the virtual prop is a virtual grenade, and the moving direction of the virtual prop is the throwing direction of the virtual grenade thrown by the virtual object. In this embodiment of the present application, when the virtual prop is triggered and starts to move, the position of the third virtual object is taken as a starting point, and the virtual prop moves along the moving direction of the virtual prop, so that the position of the virtual prop in the moving process can be determined based on the position of the third virtual object and the moving direction of the virtual prop. Optionally, the terminal determines the position of the virtual prop in real time based on the position of the third virtual object and the moving direction of the virtual prop.
In one possible implementation, the step 705 includes: and determining the position of the virtual prop in real time based on the position of the third virtual object, the moving direction of the virtual prop and the moving speed of the virtual prop.
The moving speed of the virtual prop is a fixed value or a speed conforming to the gravity condition. The virtual prop takes the position of the third virtual object as a starting point, moves along the moving direction at the moving speed, and can determine the position of the virtual prop in real time based on the position of the third virtual object, the moving direction of the virtual prop and the moving speed of the virtual prop in the moving process of the virtual prop.
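As a non-authoritative illustration of this real-time position calculation, the sketch below assumes 3D coordinate tuples, a unit movement direction vector, and either a fixed speed or a simple gravity term; all names and the gravity handling are illustrative assumptions, not the application's own implementation.

```python
def prop_position(start, direction, speed, t, use_gravity=False, g=9.8):
    # start: position of the third virtual object when the prop is triggered (x, y, z);
    # direction: unit movement direction; speed: fixed movement speed; t: elapsed time.
    x = start[0] + direction[0] * speed * t
    y = start[1] + direction[1] * speed * t
    z = start[2] + direction[2] * speed * t
    if use_gravity:
        z -= 0.5 * g * t * t  # approximate a speed "conforming to the gravity condition"
    return (x, y, z)
```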
706. And the terminal displays the special moving effect of the virtual prop under the condition that the virtual prop moves in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
When the position of the virtual prop is within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance, the virtual prop is within the special effect visual range, and the movement special effect of the virtual prop is displayed.
707. And under the condition that the virtual prop moves in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is larger than the target distance, the terminal does not display the special moving effect of the virtual prop.
Because the position of the virtual prop is changed in real time in the moving process of the virtual prop, whether the virtual prop is in the special effect visual range or not is determined in real time based on the position of the virtual prop, and the moving special effect of the virtual prop is displayed only when the virtual prop is in the special effect visual range. For example, in the process of moving the virtual prop, the virtual prop moves from the outside of the special effect visual range to the inside of the special effect visual range, and then moves out of the special effect visual range, so that the prop special effect of the virtual prop is displayed only when the virtual prop moves in the special effect visual range.
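A per-frame check of this kind might look like the following sketch; the in_view predicate and the effect handle are hypothetical names, and the function only restates the display condition described above.

```python
import math

def update_movement_effect(prop_pos, first_obj_pos, in_view, target_distance, effect):
    # in_view: predicate testing whether a position lies in the first object's field of view;
    # effect: handle to the prop's movement special effect (both hypothetical).
    if in_view(prop_pos) and math.dist(prop_pos, first_obj_pos) <= target_distance:
        effect.show_at(prop_pos)   # inside the special effect visual range
    else:
        effect.hide()              # outside the range, the movement effect is not displayed
```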
According to the method provided by the embodiment of the application, whether to display the prop special effect of a triggered virtual prop is determined based on the field of view of the virtual object controlled by the local device and the target distance. The prop special effect of the virtual prop is displayed only when the triggered virtual prop is within the field of view of the virtual object controlled by the local device and the distance between the virtual prop and the virtual object is not greater than the target distance; the prop special effect of a virtual prop that is within the field of view but whose distance from the virtual object is greater than the target distance does not need to be displayed, so that the resources occupied by displaying prop special effects are saved.
In addition, the position of the virtual prop during movement is determined in real time, so that whether the virtual prop is within the special effect visual range is determined from the position of the virtual prop, and the movement special effect of the virtual prop is displayed only while the virtual prop moves within the special effect visual range, which enriches the display styles of prop special effects and ensures the accuracy of special effect display.
Based on the embodiment shown in fig. 2, the virtual prop may come into contact with a virtual article while moving after being triggered. If the contact position is within the special effect visual range, the prop special effect of the virtual prop is displayed at the contact position; the specific process is described in the embodiment below.
Fig. 8 is a flowchart of a prop special effect display method provided in an embodiment of the present application, which is executed by a terminal, and as shown in fig. 8, the method includes:
801. the terminal displays a view field picture of the first virtual object controlled by the local terminal device.
This step is the same as steps 201 and 301 described above and will not be described again here.
802. When the virtual prop comes into contact with any virtual article after being triggered, the contact position is within the field of view of the first virtual object, and the distance between the contact position and the position of the first virtual object is not greater than the target distance, the terminal displays the prop special effect of the virtual prop at the contact position.
In the embodiment of the application, the virtual prop may come into contact with any virtual article in the virtual scene after being triggered. A prop special effect is generated when the virtual prop contacts the virtual article, and the prop special effect presents the effect of the virtual prop contacting the virtual article. For example, the virtual prop is a virtual mine, and when the virtual mine comes into contact with the ground, an explosion special effect is displayed at the contact position.
The determination of whether the contact position is within the field of view is the same as the determination of whether the position of the second virtual object is within the field of view in step 305, and the determination of whether the distance between the contact position and the position of the first virtual object is greater than the target distance is the same as the determination of whether the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance in step 305; details are not repeated here.
In one possible implementation, the step 802 includes: displaying the bullet hole special effect of the virtual bullet at the contact position when any virtual bullet comes into contact with a virtual article after being ejected, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
In the embodiment of the application, the contact of the virtual bullet with the virtual article means that the virtual bullet hits the virtual article, and the virtual bullet forms a bullet hole on the virtual article, so that if the contact position is within the visual field and the distance between the contact position and the first virtual object is not greater than the target distance, the bullet hole special effect of the virtual bullet is displayed at the contact position in the visual field picture. As shown in fig. 9, the first virtual object shoots based on a hand-held virtual firearm, the shot bullet is in contact with the wall, and the contact position is within the special effect visual range, and the bullet hole special effect 901 is displayed on the wall.
Optionally, in a case where the virtual bullet is in contact with the virtual article after being ejected, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance, determining a bullet hole special effect matching the virtual article, and displaying the determined bullet hole special effect at the contact position.
In the embodiment of the application, the bullet holes formed on different virtual articles are different; that is, the bullet hole special effects generated when bullet holes are formed on different virtual articles are also different. Therefore, when it is determined that the virtual bullet is in contact with a virtual article, the bullet hole special effect matching that virtual article is obtained and displayed at the contact position in the field-of-view picture, which enriches the display styles of bullet hole special effects and ensures the accuracy of the displayed bullet hole special effect.
Optionally, the process of obtaining the bullet hole special effect matched with the virtual article includes: and acquiring the material quality of the virtual article, and inquiring the bullet hole special effect matched with the material quality in the database.
The database includes correspondences between materials and bullet hole special effects. After the material of the virtual article is obtained, the correspondences in the database are queried, so that the bullet hole special effect matching the material can be determined. The bullet hole special effect matching the material of the virtual article is obtained and determined as the bullet hole special effect matching the virtual article. In the embodiment of the present application, the materials of different virtual articles may be the same or different. If the materials of two virtual articles are the same, the bullet hole special effects displayed at the contact positions are the same when virtual bullets contact the two virtual articles respectively; if the materials of the two virtual articles are different, the bullet hole special effects displayed at the contact positions are also different.
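The material-to-bullet-hole lookup could be sketched as follows; the dictionary stands in for the database of correspondences mentioned above, and the material names and effect identifiers are invented for illustration only.

```python
# Hypothetical stand-in for the database of material-to-bullet-hole correspondences.
BULLET_HOLE_EFFECTS = {
    "concrete": "bullet_hole_concrete",
    "wood": "bullet_hole_wood",
    "grass": "bullet_hole_grass",
}

def bullet_hole_effect_for(article_material):
    # Query the correspondence table by the material of the virtual article;
    # fall back to a default effect if the material is not listed.
    return BULLET_HOLE_EFFECTS.get(article_material, "bullet_hole_default")
```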
As shown in fig. 10, the bullet hole special effect 1001 produced when a virtual bullet hits a virtual wall is different from the bullet hole special effect 1002 produced when a virtual bullet hits grass.
Optionally, the virtual bullet is ejected by a virtual object controlled by another terminal based on a handheld virtual firearm, and the process of the terminal acquiring the contact position includes: after the other terminal controls the virtual object to eject the virtual bullet based on the handheld virtual firearm, it emits a ray from the position of the virtual firearm; when the ray detects the collision box of any virtual article, the contact position between the ray and the virtual article is determined as the contact position of the virtual bullet and the virtual article; the contact position is synchronized to the server, and the server synchronizes it to the terminal controlling the first virtual object, so that the terminal obtains the contact position.
Optionally, the virtual bullet is ejected by the first virtual object controlled by the terminal based on the handheld virtual firearm, and the process of the terminal acquiring the contact position includes: the terminal controls the first virtual object to eject the virtual bullet based on the handheld virtual firearm, emits a ray from the position of the virtual firearm, and when the ray detects the collision box of any virtual article, determines the contact position between the ray and the virtual article as the contact position of the virtual bullet and the virtual article.
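A sketch of how the contact position might be obtained by ray detection is shown below; scene.raycast is a hypothetical engine call standing in for the collision-box detection described above, and is not a real API of this application.

```python
def find_contact_position(muzzle_position, fire_direction, scene, max_range):
    # Cast a ray from the virtual firearm's muzzle along the firing direction; the first
    # collision box hit within the weapon's range gives the virtual bullet's contact position.
    hit = scene.raycast(origin=muzzle_position, direction=fire_direction,
                        max_distance=max_range)
    return hit.position if hit is not None else None
```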
803. When the virtual prop comes into contact with any virtual article after being triggered, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is greater than the target distance, the terminal does not display the prop special effect of the virtual prop at the contact position.
When the virtual bullet contacts the virtual article and the contact position is within the field of view but the distance between the contact position and the first virtual object is greater than the target distance, that is, the contact position is not within the special effect visual range, the display effect of the bullet hole special effect would not be obvious even if it were displayed. Therefore, the bullet hole special effect outside the special effect visual range is not displayed, which does not degrade the display effect and at the same time saves the resources occupied by displaying special effects. As shown in fig. 11, the distance between the contact position of the virtual bullet with the virtual article and the first virtual object is long; even if the bullet hole special effect 1101 were displayed, it would not be clearly visible, and therefore the bullet hole special effect 1101 is not displayed.
In one possible implementation, after step 801, the contact position of the virtual prop with the virtual article is obtained first, and whether to display the prop special effect of the virtual prop is then determined according to the contact position. The process includes: when the virtual prop comes into contact with any virtual article after being triggered, acquiring the contact position of the virtual prop and the virtual article; displaying the prop special effect of the virtual prop at the contact position when the contact position is within the field of view and the distance between the contact position and the position of the first virtual object is not greater than the target distance; and not displaying the prop special effect of the virtual prop at the contact position when the contact position is within the field of view and the distance between the contact position and the position of the first virtual object is greater than the target distance.
The process of obtaining the contact position is the same as the process of obtaining the contact position in step 802, and will not be described herein.
It should be noted that, in the embodiment of the present application, the virtual prop is described as coming into contact with a virtual article; in another embodiment, the virtual prop may also come into contact with a virtual object. For example, the terminal controls the first virtual object to shoot based on the handheld virtual firearm, and the ray emitted from the muzzle of the virtual firearm performs collision detection. If the ray detects the collision box of another virtual object, the bullet ejected by the virtual firearm hits that virtual object; if the ray does not detect the collision box of another virtual object, the bullet does not hit another virtual object. In addition, the part of the other virtual object that is hit can be detected through the ray, and the damage to the other virtual object is determined based on the hit part. The terminal sends the damage to the other virtual object to the server; after the server verifies the damage, the life value of the other virtual object is reduced by the damage value corresponding to the damage, and a special effect of the virtual bullet hitting the other virtual object, such as a bleeding special effect, is displayed at the position where the virtual bullet hits the other virtual object.
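The part-dependent damage reporting described above could be sketched as follows; the damage table, the hit object fields and the server.report_hit call are hypothetical and only illustrate that the terminal reports the hit while the server verifies it before applying it.

```python
DAMAGE_BY_PART = {"head": 100, "torso": 40, "limb": 25}  # hypothetical damage table

def on_bullet_hit_character(hit, server):
    # Determine damage from the hit part detected by the ray, then let the server
    # verify it before the hit object's life value is reduced and the hit effect shown.
    damage = DAMAGE_BY_PART.get(hit.body_part, 20)
    server.report_hit(target_id=hit.target_id, part=hit.body_part, damage=damage)
```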
According to the method provided by the embodiment of the application, whether to display the prop special effect of a triggered virtual prop is determined based on the field of view of the virtual object controlled by the local device and the target distance. The prop special effect of the virtual prop is displayed only when the triggered virtual prop is within the field of view of the virtual object controlled by the local device and the distance between the virtual prop and the virtual object is not greater than the target distance; the prop special effect of a virtual prop that is within the field of view but whose distance from the virtual object is greater than the target distance does not need to be displayed, so that the resources occupied by displaying prop special effects are saved.
In addition, when the virtual prop comes into contact with a virtual article and the contact position is within the special effect visible range, the prop special effect of the virtual prop is displayed at the contact position to present the contact between the virtual prop and the virtual article, which enriches the display styles of prop special effects.
Moreover, when the virtual prop comes into contact with a virtual article and the contact position is within the special effect visible range, the bullet hole special effect matching the virtual article is determined and displayed at the contact position, which enriches the display styles of bullet hole special effects and ensures the accuracy of the displayed bullet hole special effect.
Based on the embodiment shown in fig. 2, each virtual object has a relative positional relationship with the virtual prop it holds. The position of the virtual prop is determined according to the position of the second virtual object, the orientation of the second virtual object and this relative positional relationship, and after the virtual prop is triggered, whether the virtual prop is within the special effect visual range is determined according to the position of the virtual prop.
Fig. 12 is a flowchart of a prop special effect display method provided in an embodiment of the present application, which is executed by a terminal, and as shown in fig. 12, the method includes:
1201. the terminal displays a view field picture of the first virtual object controlled by the local terminal device.
This step is the same as steps 201 and 301 described above and will not be described again here.
1202. When the virtual prop is triggered, the terminal acquires the position of the first virtual object and the position of the second virtual object holding the virtual prop.
This step is similar to step 304 described above and will not be described again.
1203. The terminal determines the position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object and the relative positional relationship between the second virtual object and the virtual prop.
The relative positional relationship represents the relationship between the position of the virtual object and the position of the virtual prop when the virtual object holds the virtual prop. Optionally, the relative positional relationship is represented as a position offset vector. The orientation of the second virtual object is the direction the virtual object faces. Since the virtual prop held by the virtual object is in front of the virtual object, the offset direction of the position of the virtual prop relative to the position of the second virtual object can be determined from the orientation of the virtual object, and the position of the virtual prop can then be determined from the relative positional relationship and the offset direction.
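One way to realize this calculation, assuming the relative positional relationship is a position offset vector expressed in the character's local frame and the orientation is a yaw angle, is sketched below; these assumptions and all names are illustrative, not the application's own representation.

```python
import math

def held_prop_position(owner_pos, owner_yaw, offset):
    # offset = (forward, right, up): position offset vector of the prop in the
    # character's local frame; owner_yaw: facing direction in radians.
    fwd, right, up = offset
    sin_y, cos_y = math.sin(owner_yaw), math.cos(owner_yaw)
    dx = fwd * cos_y - right * sin_y
    dy = fwd * sin_y + right * cos_y
    return (owner_pos[0] + dx, owner_pos[1] + dy, owner_pos[2] + up)
```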
1204. And the terminal displays the prop special effect of the virtual prop under the condition that the position of the virtual prop is in the visual field of the first virtual object and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
This step is similar to step 305 described above and will not be described again.
1205. And under the condition that the position of the virtual prop is in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is larger than the target distance, the terminal does not display the prop special effect of the virtual prop.
According to the method provided by the embodiment of the application, whether to display the prop special effect of a triggered virtual prop is determined based on the field of view of the virtual object controlled by the local device and the target distance. The prop special effect of the virtual prop is displayed in the field-of-view picture only when the triggered virtual prop is within the field of view of the virtual object controlled by the local device and the distance between the virtual prop and the virtual object is not greater than the target distance; the prop special effect of a virtual prop that is within the field of view but whose distance from the virtual object is greater than the target distance does not need to be displayed, so that the resources occupied by displaying prop special effects are saved.
In addition, the position of the virtual prop is determined according to the position of the second virtual object, the orientation of the second virtual object and the relative positional relationship, and whether the virtual prop is within the special effect visual range is determined based on the position of the virtual prop; this ensures the accuracy of the position of the virtual prop and thus the accuracy of the displayed special effect.
The embodiments shown in fig. 2, 3, 7, 8 and 12 can be combined arbitrarily. As shown in fig. 13, when the first virtual object fires based on a handheld virtual firearm, the virtual firearm is within the special effect visual range, so the firing special effect 1301 of the virtual firearm is displayed; the virtual bullet 1302 ejected by the virtual firearm starts to move, and the movement special effect 1303 of the virtual bullet 1302 is displayed while the virtual bullet 1302 moves within the special effect visual range; the virtual bullet 1302 hits a virtual article, the contact position of the virtual bullet 1302 and the virtual article is within the special effect visual range, and the bullet hole special effect of the virtual bullet 1302 is displayed at the contact position.
As shown in fig. 14, a field-of-view picture of the first virtual object is displayed, and a second virtual object 1401 is displayed in the field-of-view picture. When the second virtual object 1401 shoots based on the handheld virtual firearm 1402, the firing special effect 1403 of the virtual firearm 1402 is displayed in the field-of-view picture; the virtual bullet 1404 ejected from the virtual firearm 1402 starts to move, and the movement special effect 1405 of the virtual bullet 1404 is displayed in the field-of-view picture while the virtual bullet 1404 moves within the special effect visual range; the virtual bullet 1404 hits a virtual article, the contact position of the virtual bullet 1404 and the virtual article is within the special effect visual range, and the bullet hole special effect of the virtual bullet 1404 is displayed at the contact position.
For another example, a field-of-view picture of the first virtual object is displayed, and a second virtual object outside the field-of-view picture shoots based on a handheld virtual firearm. The second virtual object is not within the special effect visual range, so the firing special effect of the virtual firearm is not displayed in the field-of-view picture. The virtual bullet ejected by the virtual firearm starts to move; the movement special effect of the virtual bullet is displayed in the field-of-view picture while the virtual bullet enters and moves within the special effect visual range. The virtual bullet hits a virtual article, the contact position of the virtual bullet and the virtual article is within the special effect visual range, and the bullet hole special effect of the virtual bullet is displayed at the contact position.
Fig. 15 is a flowchart of a prop special effect display method provided in an embodiment of the present application. The execution body of the embodiment of the present application is any terminal in the above-mentioned implementation environment. Referring to fig. 15, the method includes the steps of:
1. at the start of the game, the first terminal displays a view screen of the first virtual object controlled by the terminal.
2. When the second terminal controls a virtual object to fire based on a handheld virtual firearm, it is determined whether the virtual object is within the special effect visible range of the first virtual object; if not, the firing special effect of the virtual firearm is not displayed, and if so, the firing special effect of the virtual firearm is displayed in the field-of-view picture.
3. The second terminal emits a ray from the muzzle of the handheld virtual firearm to perform ray detection and checks whether the ray collides with the collision box of an obstacle; ray detection continues when no collision is detected, and when a collision is detected, the contact position is acquired and synchronized to the first terminal through the server.
4. The first terminal determines whether the contact position is within the special effect visual range; if so, the bullet hole special effect is displayed at the contact position, and if not, the bullet hole special effect is not displayed at the contact position.
Fig. 16 is a schematic structural diagram of a prop special effect display device provided in an embodiment of the present application, as shown in fig. 16, the device includes:
a display module 1601, configured to display a field of view screen of a first virtual object controlled by the local device;
the display module 1601 is further configured to display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view of the first virtual object and a distance between the virtual prop and the first virtual object is not greater than a target distance;
the display module 1601 is further configured to not display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view and a distance between the virtual prop and the first virtual object is greater than the target distance.
In one possible implementation, the display module 1601 is configured to display a prop special effect of the virtual prop if the virtual prop is triggered, a second virtual object holding the virtual prop is located in the field of view, and a distance between the second virtual object and the first virtual object is not greater than a target distance.
In another possible implementation, the display module 1601 is configured to display a firing special effect of the virtual firearm in a case where any virtual firearm is triggered, a second virtual object holding the virtual firearm is located in the field of view, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
In another possible implementation, the display module 1601 is configured to display a prop special effect of the virtual prop when the virtual prop moves within the field of view after being triggered and the distance between the virtual prop and the first virtual object is not greater than a target distance.
In another possible implementation, the display module 1601 is configured to display a movement special effect of the virtual bullet when any virtual bullet moves within the field of view after being ejected and the distance between the virtual bullet and the first virtual object is not greater than a target distance.
In another possible implementation, the display module 1601 is configured to display, at the contact position, a prop special effect of the virtual prop when the virtual prop comes into contact with any virtual article after being triggered, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than a target distance.
In another possible implementation, the display module 1601 is configured to display the bullet hole special effect of the virtual bullet at the contact position in a case where any virtual bullet is in contact with the virtual article after being ejected, the contact position is in the field of view, and a distance between the contact position and the position of the first virtual object is not greater than the target distance.
In another possible implementation, as shown in fig. 17, the display module 1601 includes:
a determining unit 1611 for determining a bullet hole special effect matching the virtual article in a case where the virtual bullet is in contact with the virtual article after being ejected, the contact position is within the field of view, and a distance between the contact position and a position of the first virtual object is not greater than a target distance;
and a display unit 1612 for displaying the determined bullet hole special effect at the contact position.
In another possible implementation, as shown in fig. 17, the apparatus further includes:
an obtaining module 1602, configured to obtain, when the virtual prop is triggered, a position of the first virtual object and a position of a second virtual object holding the virtual prop;
and a display module 1601, configured to display a prop special effect of the virtual prop when the position of the second virtual object is within the field of view and a distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance.
In another possible implementation, as shown in fig. 17, the apparatus further includes:
an obtaining module 1602, configured to obtain, when the virtual prop is triggered, a position of the first virtual object and a position of a second virtual object holding the virtual prop;
a determining module 1603, configured to determine a position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object, and a relative positional relationship between the second virtual object and the virtual prop;
and a display module 1601, configured to display a prop special effect of the virtual prop when the position of the virtual prop is within the field of view and a distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
In another possible implementation, as shown in fig. 17, the apparatus further includes:
an obtaining module 1602, configured to obtain, when the virtual prop is triggered and starts moving, a position of the first virtual object and a position of a third virtual object that triggers the virtual prop;
a determining module 1603, configured to determine a current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop;
and a display module 1601, configured to display a special effect of moving the virtual prop when the virtual prop moves within the field of view and a distance between a position of the virtual prop and a position of the first virtual object is not greater than a target distance.
It should be noted that: the prop special effect display device provided in the above embodiment is only exemplified by the division of the above functional modules, and in practical application, the above functional allocation can be completed by different functional modules according to needs, that is, the internal structure of the computer device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the prop special effect display device and the prop special effect display method provided in the above embodiments belong to the same concept, and detailed implementation processes thereof are detailed in the method embodiments, and are not repeated here.
The embodiment of the application also provides computer equipment, which comprises a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to realize the operations executed by the prop special effect display method of the embodiment.
Optionally, the computer device is provided as a terminal. Fig. 18 shows a block diagram of a terminal 1800 according to an exemplary embodiment of the present application. The terminal 1800 may be a portable mobile terminal, such as: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1800 may also be referred to as a user device, portable terminal, laptop terminal, desktop terminal, or the like.
The terminal 1800 includes: a processor 1801 and a memory 1802.
Processor 1801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1801 may be implemented in at least one hardware form of DSP (Digital Signal Processing ), FPGA (Field-Programmable Gate Array, field programmable gate array), PLA (Programmable Logic Array ). The processor 1801 may also include a main processor and a coprocessor, the main processor being a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit ); a coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1801 may be integrated with a GPU (Graphics Processing Unit, image processor) for taking care of rendering and rendering of content that the display screen is required to display. In some embodiments, the processor 1801 may also include an AI (Artificial Intelligence ) processor for processing computing operations related to machine learning.
The memory 1802 may include one or more computer-readable storage media, which may be non-transitory. The memory 1802 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1802 is used to store at least one computer program for execution by processor 1801 to implement the prop special effect display methods provided by the method embodiments herein.
In some embodiments, the terminal 1800 may also optionally include: a peripheral interface 1803 and at least one peripheral. The processor 1801, memory 1802, and peripheral interface 1803 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 1803 by buses, signal lines or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1804, a display screen 1805, a camera assembly 1806, audio circuitry 1807, and a power supply 1809.
The peripheral interface 1803 may be used to connect I/O (Input/Output) related at least one peripheral device to the processor 1801 and memory 1802. In some embodiments, processor 1801, memory 1802, and peripheral interface 1803 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 1801, memory 1802, and peripheral interface 1803 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1804 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1804 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1804 converts electrical signals to electromagnetic signals for transmission, or converts received electromagnetic signals to electrical signals. Optionally, the radio frequency circuit 1804 includes: antenna systems, RF transceivers, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and so forth. The radio frequency circuitry 1804 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the world wide web, metropolitan area networks, intranets, generation mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity ) networks. In some embodiments, the radio frequency circuitry 1804 may also include NFC (Near Field Communication ) related circuitry, which is not limited in this application.
The display 1805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1805 is a touch display, the display 1805 also has the ability to collect touch signals at or above the surface of the display 1805. The touch signal may be input as a control signal to the processor 1801 for processing. At this point, the display 1805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1805 may be one and disposed on the front panel of the terminal 1800; in other embodiments, the display 1805 may be at least two, disposed on different surfaces of the terminal 1800 or in a folded configuration; in other embodiments, the display 1805 may be a flexible display disposed on a curved surface or a folded surface of the terminal 1800. Even more, the display screen 1805 may be arranged in an irregular pattern other than rectangular, i.e., a shaped screen. The display 1805 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 1806 is used to capture images or video. Optionally, the camera assembly 1806 includes a front camera and a rear camera. The front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal. In some embodiments, the at least two rear cameras are any one of a main camera, a depth camera, a wide-angle camera and a tele camera, so as to realize that the main camera and the depth camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting and Virtual Reality (VR) shooting function or other fusion shooting functions. In some embodiments, the camera assembly 1806 may also include a flash. The flash lamp can be a single-color temperature flash lamp or a double-color temperature flash lamp. The dual-color temperature flash lamp refers to a combination of a warm light flash lamp and a cold light flash lamp, and can be used for light compensation under different color temperatures.
The audio circuitry 1807 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1801 for processing, or inputting the electric signals to the radio frequency circuit 1804 for realizing voice communication. For stereo acquisition or noise reduction purposes, the microphone may be multiple, and disposed at different locations of the terminal 1800. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is then used to convert electrical signals from the processor 1801 or the radio frequency circuit 1804 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuitry 1807 may also include a headphone jack.
A power supply 1809 is used to power the various components in the terminal 1800. The power supply 1809 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power supply 1809 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1800 also includes one or more sensors 1810. The one or more sensors 1810 include, but are not limited to: acceleration sensor 1811, gyro sensor 1812, pressure sensor 1813, optical sensor 1815, and proximity sensor 1816.
The acceleration sensor 1811 may detect the magnitudes of accelerations on three coordinate axes of a coordinate system established with the terminal 1800. For example, the acceleration sensor 1811 may be used to detect components of gravitational acceleration on three coordinate axes. The processor 1801 may control the display screen 1805 to display a user interface in either a landscape view or a portrait view based on gravitational acceleration signals acquired by the acceleration sensor 1811. The acceleration sensor 1811 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1812 may detect a body direction and a rotation angle of the terminal 1800, and the gyro sensor 1812 may collect a 3D motion of the user to the terminal 1800 in cooperation with the acceleration sensor 1811. The processor 1801 may implement the following functions based on the data collected by the gyro sensor 1812: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
Pressure sensor 1813 may be disposed on a side frame of terminal 1800 and/or below display 1805. When the pressure sensor 1813 is disposed at a side frame of the terminal 1800, a grip signal of the terminal 1800 by a user may be detected, and the processor 1801 performs a left-right hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 1813. When the pressure sensor 1813 is disposed at the lower layer of the display 1805, the processor 1801 controls the operability control on the UI interface according to the pressure operation of the user on the display 1805. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1815 is used to collect the ambient light intensity. In one embodiment, the processor 1801 may control the display brightness of the display screen 1805 based on the intensity of ambient light collected by the optical sensor 1815. Specifically, when the intensity of the ambient light is high, the display brightness of the display screen 1805 is turned up; when the ambient light intensity is low, the display brightness of the display screen 1805 is turned down. In another embodiment, the processor 1801 may also dynamically adjust the shooting parameters of the camera assembly 1806 based on the intensity of ambient light collected by the optical sensor 1815.
A proximity sensor 1816, also known as a distance sensor, is provided on the front panel of the terminal 1800. Proximity sensor 1816 is used to collect the distance between the user and the front face of terminal 1800. In one embodiment, when the proximity sensor 1816 detects that the distance between the user and the front face of the terminal 1800 gradually decreases, the processor 1801 controls the display 1805 to switch from the on-screen state to the off-screen state; when the proximity sensor 1816 detects that the distance between the user and the front of the terminal 1800 gradually increases, the processor 1801 controls the display 1805 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 18 is not limiting and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
Optionally, the computer device is provided as a server. Fig. 19 is a schematic structural diagram of a server provided in an embodiment of the present application, where the server 1900 may have a relatively large difference due to configuration or performance, and may include one or more processors (Central Processing Units, CPU) 1901 and one or more memories 1902, where at least one computer program is stored in the memories 1902, and the at least one computer program is loaded and executed by the processors 1901 to implement the methods provided in the above-described method embodiments. Of course, the server may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
The present application also provides a computer readable storage medium, in which at least one computer program is stored, where the at least one computer program is loaded and executed by a processor to implement the operations performed by the prop special effect display method of the above embodiment.
Embodiments of the present application also provide a computer program product, including a computer program, which when executed by a processor, implements operations performed by the prop special effect display method as described in the above aspect.
Those of ordinary skill in the art will appreciate that all or a portion of the steps implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the above storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description of the embodiments is merely an optional embodiment and is not intended to limit the embodiments, and any modifications, equivalent substitutions, improvements, etc. made within the spirit and principles of the embodiments of the present application are intended to be included in the scope of the present application.

Claims (10)

1. The prop special effect display method is characterized by comprising the following steps:
displaying a view field picture of a first virtual object controlled by the local terminal equipment;
under the condition that the virtual prop is triggered, acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop; displaying a prop special effect of the virtual prop under the condition that the position of the second virtual object is in the field of view of the first virtual object and the distance between the position of the second virtual object and the position of the first virtual object is not greater than a target distance; if the position of the second virtual object is within the field of view and the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance, not displaying the prop special effect of the virtual prop; or alternatively,
acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop under the condition that the virtual prop is triggered; determining the position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object and the relative positional relationship between the second virtual object and the virtual prop; displaying a prop special effect of the virtual prop under the condition that the position of the virtual prop is in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance; when the position of the virtual prop is within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is greater than the target distance, not displaying the prop special effect of the virtual prop; or alternatively,
acquiring the position of the first virtual object and the position of a third virtual object triggering the virtual prop under the condition that the virtual prop is triggered and starts to move; determining the current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop; displaying a special effect of movement of the virtual prop under the condition that the virtual prop moves in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance; when the virtual prop moves in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is larger than the target distance, the special moving effect of the virtual prop is not displayed; or alternatively,
displaying a prop special effect of the virtual prop at a contact position when the virtual prop is contacted with any virtual object after being triggered, the contact position is in the visual field, and the distance between the contact position and the position of the first virtual object is not greater than the target distance; and when the virtual prop is triggered and then is contacted with the virtual object, the contact position is in the visual field, and the distance between the contact position and the position of the first virtual object is larger than the target distance, the prop special effect of the virtual prop is not displayed at the contact position.
2. The method according to claim 1, wherein the method further comprises:
displaying the prop special effect of the virtual prop under the condition that the virtual prop is triggered, a second virtual object holding the virtual prop is within the field of view, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
3. The method of claim 2, wherein displaying the prop special effect of the virtual prop under the condition that the virtual prop is triggered, a second virtual object holding the virtual prop is within the field of view, and the distance between the second virtual object and the first virtual object is not greater than the target distance comprises:
displaying a firing special effect of the virtual firearm under the condition that any virtual firearm is triggered, a second virtual object holding the virtual firearm is within the field of view, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
4. The method according to claim 1, wherein the method further comprises:
displaying the prop special effect of the virtual prop under the condition that the virtual prop moves within the field of view after being triggered and the distance between the virtual prop and the first virtual object is not greater than the target distance.
5. The method of claim 4, wherein displaying the prop special effect of the virtual prop under the condition that the virtual prop moves within the field of view after being triggered and the distance between the virtual prop and the first virtual object is not greater than the target distance comprises:
under the condition that any virtual bullet moves within the field of view after being fired and the distance between the virtual bullet and the first virtual object is not greater than the target distance, displaying a movement special effect of the virtual bullet.
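Continuing the same illustrative sketch (and reusing the hypothetical Vec3 and should_display_prop_effect helpers from above), the moving-bullet case of claims 4 and 5 could extrapolate the bullet's current position from its firing position and direction of travel and re-run the same gate every frame; the linear trajectory, the speed parameter, and the function names are assumptions, not details taken from the patent:

def bullet_position(fire_pos: Vec3, fire_dir: Vec3, speed: float,
                    elapsed: float) -> Vec3:
    # Extrapolate the bullet's current position from where it was fired, its
    # direction of travel, and the time elapsed since firing.
    length = fire_dir.length() or 1.0
    return Vec3(fire_pos.x + fire_dir.x / length * speed * elapsed,
                fire_pos.y + fire_dir.y / length * speed * elapsed,
                fire_pos.z + fire_dir.z / length * speed * elapsed)

def show_bullet_trail(first_pos: Vec3, first_dir: Vec3, fire_pos: Vec3,
                      fire_dir: Vec3, speed: float, elapsed: float,
                      target_distance: float) -> bool:
    # Per-frame check: render the bullet's movement special effect (e.g. a tracer)
    # only while the bullet is inside the field of view and no farther than
    # target_distance from the first virtual object.
    current = bullet_position(fire_pos, fire_dir, speed, elapsed)
    return should_display_prop_effect(first_pos, first_dir, current, target_distance)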
6. The method of claim 1, wherein displaying the prop special effect of the virtual prop at the contact position under the condition that the virtual prop comes into contact with any virtual object after being triggered, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance comprises:
displaying a bullet hole special effect of the virtual bullet at the contact position under the condition that any virtual bullet comes into contact with the virtual object after being fired, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
7. The method of claim 6, wherein displaying the bullet hole special effect of the virtual bullet at the contact position under the condition that, after any virtual bullet is fired, the contact position is within the field of view and the distance between the contact position and the position of the first virtual object is not greater than the target distance comprises:
determining a bullet hole special effect matching the virtual article under the condition that the virtual bullet comes into contact with the virtual article after being fired, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance; and
displaying the determined bullet hole special effect at the contact position.
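The effect matching in claim 7 could be as simple as a lookup keyed by the material of the contacted virtual article, combined with the same view-and-distance gate sketched earlier; the material names, effect identifiers, and the on_bullet_impact helper below are purely illustrative assumptions and reuse the hypothetical Vec3 and should_display_prop_effect from the earlier sketch:

from typing import Optional

# Hypothetical material-to-effect table; a real game would reference effect
# assets rather than plain strings.
BULLET_HOLE_EFFECTS = {
    "wood": "bullet_hole_wood_splinter",
    "metal": "bullet_hole_metal_spark",
    "glass": "bullet_hole_glass_crack",
    "concrete": "bullet_hole_concrete_chip",
}
DEFAULT_BULLET_HOLE = "bullet_hole_generic"

def pick_bullet_hole_effect(article_material: str) -> str:
    # Return the bullet hole special effect matched to the contacted article.
    return BULLET_HOLE_EFFECTS.get(article_material, DEFAULT_BULLET_HOLE)

def on_bullet_impact(first_pos: Vec3, first_dir: Vec3, contact_pos: Vec3,
                     article_material: str, target_distance: float) -> Optional[str]:
    # Show the matched bullet hole at the contact position only when the view and
    # distance conditions hold; otherwise show nothing.
    if should_display_prop_effect(first_pos, first_dir, contact_pos, target_distance):
        return pick_bullet_hole_effect(article_material)
    return None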
8. A prop special effect display device, the device comprising:
a display module, configured to display a field-of-view picture of a first virtual object controlled by a local terminal device;
the display module is further configured to acquire the position of the first virtual object and the position of a second virtual object holding a virtual prop under the condition that the virtual prop is triggered; display a prop special effect of the virtual prop under the condition that the position of the second virtual object is within the field of view of the first virtual object and the distance between the position of the second virtual object and the position of the first virtual object is not greater than a target distance; and not display the prop special effect of the virtual prop under the condition that the position of the second virtual object is within the field of view and the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance; or
acquire the position of the first virtual object and the position of a second virtual object holding the virtual prop under the condition that the virtual prop is triggered; determine the position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object, and the relative positional relationship between the second virtual object and the virtual prop; display a prop special effect of the virtual prop under the condition that the position of the virtual prop is within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance; and not display the prop special effect of the virtual prop under the condition that the position of the virtual prop is within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is greater than the target distance; or
acquire the position of the first virtual object and the position of a third virtual object that triggers the virtual prop under the condition that the virtual prop is triggered and starts to move; determine the current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop; display a movement special effect of the virtual prop under the condition that the virtual prop moves within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance; and not display the movement special effect of the virtual prop under the condition that the virtual prop moves within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is greater than the target distance; or
display a prop special effect of the virtual prop at a contact position under the condition that the virtual prop comes into contact with any virtual object after being triggered, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance; and not display the prop special effect of the virtual prop at the contact position under the condition that the virtual prop comes into contact with the virtual object after being triggered, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is greater than the target distance.
9. A computer device comprising a processor and a memory, the memory having stored therein at least one computer program that is loaded and executed by the processor to perform the operations performed by the prop special effect display method of any of claims 1 to 7.
10. A computer readable storage medium having stored therein at least one computer program loaded and executed by a processor to perform the operations performed by the prop special effect display method of any of claims 1 to 7.
CN202111500232.XA 2021-12-09 2021-12-09 Prop special effect display method, device, computer equipment and storage medium Active CN114100128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111500232.XA CN114100128B (en) 2021-12-09 2021-12-09 Prop special effect display method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114100128A CN114100128A (en) 2022-03-01
CN114100128B true CN114100128B (en) 2023-07-21

Family

ID=80363823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111500232.XA Active CN114100128B (en) 2021-12-09 2021-12-09 Prop special effect display method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114100128B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117339202A (en) * 2022-06-27 2024-01-05 腾讯科技(深圳)有限公司 Prop special effect display method, prop special effect display device, electronic equipment and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105808815A (en) * 2015-01-16 2016-07-27 纳宝株式会社 Apparatus and method for generating and displaying cartoon content
CN109126120A (en) * 2018-08-17 2019-01-04 Oppo广东移动通信有限公司 motor control method and related product
CN110478895A (en) * 2019-08-23 2019-11-22 腾讯科技(深圳)有限公司 Control method, device, terminal and the storage medium of virtual objects
CN110585712A (en) * 2019-09-20 2019-12-20 腾讯科技(深圳)有限公司 Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN112076465A (en) * 2020-08-06 2020-12-15 腾讯科技(深圳)有限公司 Virtual fort control method, device, terminal and storage medium
CN112090070A (en) * 2020-09-18 2020-12-18 腾讯科技(深圳)有限公司 Interaction method and device of virtual props and electronic equipment
CN112121434A (en) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Interaction method and device of special effect prop, electronic equipment and storage medium
CN112604279A (en) * 2020-12-29 2021-04-06 珠海金山网络游戏科技有限公司 Special effect display method and device
WO2021184806A1 (en) * 2020-03-17 2021-09-23 腾讯科技(深圳)有限公司 Interactive prop display method and apparatus, and terminal and storage medium
CN113599810A (en) * 2021-08-06 2021-11-05 腾讯科技(深圳)有限公司 Display control method, device, equipment and medium based on virtual object
CN113750531A (en) * 2021-09-18 2021-12-07 腾讯科技(深圳)有限公司 Prop control method, device, equipment and storage medium in virtual scene

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI492150B (en) * 2013-09-10 2015-07-11 Utechzone Co Ltd Method and apparatus for playing multimedia information

Also Published As

Publication number Publication date
CN114100128A (en) 2022-03-01

Similar Documents

Publication Publication Date Title
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
KR102619439B1 (en) Methods and related devices for controlling virtual objects
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN111744186B (en) Virtual object control method, device, equipment and storage medium
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN113041622B (en) Method, terminal and storage medium for throwing virtual throwing object in virtual environment
CN111330274B (en) Virtual object control method, device, equipment and storage medium
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN112933601B (en) Virtual throwing object operation method, device, equipment and medium
CN111389005B (en) Virtual object control method, device, equipment and storage medium
CN110917623A (en) Interactive information display method, device, terminal and storage medium
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN112316430B (en) Prop using method, device, equipment and medium based on virtual environment
CN114225406A (en) Virtual prop control method and device, computer equipment and storage medium
CN112704875B (en) Virtual item control method, device, equipment and storage medium
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN114100128B (en) Prop special effect display method, device, computer equipment and storage medium
CN112402964B (en) Using method, device, equipment and storage medium of virtual prop
CN111589137B (en) Control method, device, equipment and medium of virtual role
CN112755518B (en) Interactive property control method and device, computer equipment and storage medium
JP7447308B2 (en) Virtual item display methods, devices, electronic devices and computer programs
CN113713385B (en) Virtual prop control method, device, equipment, medium and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40069742
Country of ref document: HK

GR01 Patent grant