CN118022330A - Virtual object interaction method, device, equipment, medium and program product - Google Patents


Info

Publication number
CN118022330A
Authority
CN
China
Prior art keywords
virtual object
virtual
interaction
interactive
specified
Prior art date
Legal status
Pending
Application number
CN202211384497.2A
Other languages
Chinese (zh)
Inventor
刘智洪
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202211384497.2A
Publication of CN118022330A

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application discloses an interaction method, device, equipment, medium and program product for a virtual object, and relates to the field of interface interaction. The method includes the following steps: displaying a second virtual object within the line of sight of a first virtual object; receiving an interaction triggering operation aimed at the second virtual object when the first virtual object holds a specified virtual prop; and displaying an interactive animation in which an interactive element is emitted to the second virtual object and attached to the second virtual object, where the interactive element is emitted from the second virtual object to a third virtual object when a third virtual object exists within a preset range of the second virtual object. The second virtual object thus acts as a temporary transfer medium for the interactive element: when a third virtual object exists within the preset range of the second virtual object, the interactive element can also be emitted from the second virtual object to the third virtual object, so interaction with the third virtual object can be achieved even if the third virtual object is not within the line of sight of the first virtual object.

Description

Virtual object interaction method, device, equipment, medium and program product
Technical Field
The embodiment of the application relates to the field of interface interaction, in particular to an interaction method, device, equipment, medium and program product of a virtual object.
Background
In applications that include virtual scenes, a user is typically able to control a virtual object to move within the virtual scene or to interact with other virtual objects. For example, in a game, a player can control a virtual object to engage in virtual combat against a non-player character (NPC) or other virtual objects in the virtual scene.
In the related art, when a player uses a virtual firearm to attack a hostile virtual object, the aiming sight of the virtual firearm is aimed at the hostile virtual object within the line of sight, and the firing of a virtual bullet from the virtual firearm is triggered, so that the hostile virtual object within the line of sight is attacked.
However, with the above manner of attacking hostile virtual objects, since the player has a limited ability to observe the virtual scene, the reach of the virtual firearm's attack capability is also limited, and the player must keep changing position to search for different hostile virtual objects, so the human-computer interaction efficiency is low.
Disclosure of Invention
The embodiments of the application provide an interaction method, device, equipment, medium and program product for a virtual object, which can improve the human-computer interaction efficiency of a master virtual object interacting in a virtual scene. The technical solution is as follows:
in one aspect, a method for interaction of virtual objects is provided, the method comprising:
displaying a second virtual object in the virtual scene within the line of sight of the first virtual object;
receiving an interaction triggering operation aimed at the second virtual object when the first virtual object holds a specified virtual prop, where the specified virtual prop is used for emitting an interactive element;
and, in response to the interaction triggering operation hitting the second virtual object, displaying an interactive animation in which the interactive element is emitted to the second virtual object and attached to the second virtual object, where the interactive element is configured to be emitted from the second virtual object to a third virtual object and to act on the third virtual object when a third virtual object exists within a preset range of the second virtual object, so as to form an interaction effect between the first virtual object and the third virtual object.
In another aspect, an interactive device for a virtual object is provided, the device including:
a display module, configured to display a second virtual object in the virtual scene within the line of sight of a first virtual object;
a receiving module, configured to receive an interaction triggering operation aimed at the second virtual object when the first virtual object holds a specified virtual prop, where the specified virtual prop is used for emitting an interactive element;
the display module is further configured to, in response to the interaction triggering operation hitting the second virtual object, display an interactive animation in which the interactive element is emitted to the second virtual object and attached to the second virtual object, where the interactive element is configured to be emitted from the second virtual object to a third virtual object and to act on the third virtual object when a third virtual object exists within a preset range of the second virtual object, so as to form an interaction effect between the first virtual object and the third virtual object.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by the processor to implement a method for interaction of a virtual object according to any one of the embodiments of the present application.
In another aspect, a computer readable storage medium is provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by a processor to implement a method for interaction of virtual objects according to any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the interaction method of the virtual object according to any of the above embodiments.
The technical scheme provided by the embodiment of the application has the beneficial effects that at least:
By triggering the specified virtual prop against the second virtual object, the interactive element is emitted to the second virtual object and attached to it, so that the second virtual object serves as a temporary transfer medium for the interactive element. When a third virtual object exists within the preset range of the second virtual object, the interactive element is further emitted from the second virtual object to the third virtual object, so interaction with the third virtual object can be achieved even if the third virtual object is not within the line of sight of the first virtual object, and human-computer interaction efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of interaction of virtual objects provided by an exemplary embodiment of the present application;
FIG. 2 is a schematic illustration of interaction of virtual objects provided by another exemplary embodiment of the present application;
FIG. 3 is a block diagram of a terminal according to an exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a method of interaction of virtual objects provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of an interactive element transfer process provided based on the embodiment shown in FIG. 5;
FIG. 7 is a flow chart of a method of interaction of virtual objects provided by another exemplary embodiment of the present application;
FIG. 8 is a schematic diagram of displaying an activation progress element in a virtual environment interface provided based on the embodiment shown in FIG. 7;
FIG. 9 is a schematic diagram of displaying an activation progress element in a virtual environment interface provided based on the embodiment shown in FIG. 7;
FIG. 10 is a flowchart of a method for interaction of virtual objects provided by another exemplary embodiment of the present application;
FIG. 11 is a schematic overall flow diagram of virtual object interaction provided by an exemplary embodiment of the present application;
FIG. 12 is a schematic diagram of a radiation detection process provided based on the embodiment shown in FIG. 11;
FIG. 13 is a schematic view of a crash box provided based on the embodiment shown in FIG. 11;
FIG. 14 is a schematic diagram of a radiation detection process provided by an exemplary embodiment of the present application;
FIG. 15 is a block diagram illustrating an interaction device for virtual objects according to an exemplary embodiment of the present application;
FIG. 16 is a block diagram illustrating an interaction device for virtual objects according to another exemplary embodiment of the present application;
FIG. 17 is a block diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
In a gaming application or some virtual scene-based application, a player is typically able to control a virtual object to perform a variety of actions in a virtual scene, or a player is able to control a virtual object to interact with other virtual objects in a virtual scene.
Illustratively, the player can control the master virtual object to perform dance interactions and virtual attack interactions with virtual objects controlled by other players in the virtual scene; the master virtual object may also be controlled to interact with non-player characters (NPCs) in the virtual scene.
The player can also control the master virtual object to interact with other virtual objects in the virtual scene using various virtual props, for example: using virtual props to attack hostile virtual objects, or using virtual medical props to heal teammate virtual objects.
Taking an attack on a hostile virtual object with a virtual prop as an example, the master virtual object can attack the hostile virtual object using virtual props such as a virtual sniper rifle, a virtual rifle, a virtual shotgun, or a virtual machine gun. First, when the master virtual object holds a virtual firearm, an aiming sight is displayed in the interface; after the player aims at a hostile virtual object through the aiming sight, the virtual firearm is triggered, so that a virtual bullet in the virtual firearm is launched at the hostile virtual object and damages it.
However, in this attack mode, since the player has a limited ability to observe the virtual environment, the ability to discover hostile virtual objects in the virtual environment is limited, and the human-computer interaction efficiency is low.
Illustratively, as shown in FIG. 1, when the master virtual object holds a virtual firearm 100, an aiming sight 110 is displayed in the interface; the aiming sight 110 is shown as a dot in FIG. 1. When the aiming sight 110 is aimed at the hostile virtual object 120, the firing of the interactive element 130 (i.e., a virtual bullet) from the virtual firearm is triggered, and the interactive element 130 is displayed as attached to the hostile virtual object 120.
When another hostile virtual object 140 is included within the preset range of the hostile virtual object 120, as shown in FIG. 2, the interactive element 130 is emitted from the hostile virtual object 120 to the hostile virtual object 140 and acts on the hostile virtual object 140, for example producing a virtual explosion effect on the hostile virtual object 140.
It should be noted that the above examples take the interactive element causing damage to hostile virtual objects as an example; in some embodiments, the interactive element may also be implemented as a gain-type prop, for example a medical prop, which can produce a healing effect on other virtual objects in the virtual environment that have, or do not have, a teammate relationship with the master virtual object.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, or the like. The terminal has an application program supporting a virtual scene installed and running, such as an application program supporting a three-dimensional virtual scene. The application may be any one of a virtual reality application, a three-dimensional map application, a third-person shooter (TPS) game, a first-person shooter (FPS) game, or a multiplayer online battle arena (MOBA) game. Alternatively, the application may be a stand-alone application, such as a stand-alone three-dimensional game, or a network-connected application.
Fig. 3 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 300 includes: an operating system 320 and application programs 322.
Operating system 320 is the underlying software that provides applications 322 with secure access to computer hardware.
The application 322 is an application supporting virtual scenes. Optionally, the application 322 is an application that supports three-dimensional virtual scenes. The application 322 may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, or a MOBA game. The application 322 may be a stand-alone application, such as a stand-alone three-dimensional game, or a network-connected application.
FIG. 4 illustrates a block diagram of a computer system provided in accordance with an exemplary embodiment of the present application. The computer system 400 includes: a first device 420, a server 440, and a second device 460.
The first device 420 has an application supporting a virtual scene installed and running. The application may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, or a MOBA game. The first device 420 is a device used by a first user, who uses the first device 420 to control a first virtual object located in the virtual scene to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated human character or a cartoon character.
The first device 420 is connected to the server 440 via a wireless network or a wired network.
The server 440 includes at least one of a single server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 440 is used to provide background services for applications supporting three-dimensional virtual scenes. Optionally, the server 440 takes on the primary computing work and the first device 420 and the second device 460 take on the secondary computing work; or the server 440 takes on the secondary computing work and the first device 420 and the second device 460 take on the primary computing work; or the server 440, the first device 420, and the second device 460 perform cooperative computing using a distributed computing architecture.
The second device 460 has an application supporting a virtual scene installed and running. The application may be any one of a virtual reality application, a three-dimensional map program, an FPS game, a MOBA game, or a multiplayer gunfight survival game. The second device 460 is a device used by a second user, who uses the second device 460 to control a second virtual object located in the virtual scene to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated human character or a cartoon character.
Optionally, the first virtual character and the second virtual character are in the same virtual scene. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first virtual character and the second virtual character may belong to different teams or different organizations, or be two parties in a hostile relationship.
Optionally, the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of application on different operating system platforms. The first device 420 may broadly refer to one of a plurality of devices, and the second device 460 may broadly refer to one of a plurality of devices; this embodiment is illustrated with only the first device 420 and the second device 460. The device types of the first device 420 and the second device 460 are the same or different, and include at least one of: a game console, a desktop computer, a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or smaller. For example, there may be only one such device, or several tens or hundreds of such devices, or more. The number and types of devices are not limited in the embodiments of the present application.
It should be noted that the server 440 may be implemented as a physical server or as a cloud server. Cloud technology refers to a hosting technology that unifies a series of resources such as hardware, software, and networks in a wide area network or a local area network to realize the computation, storage, processing, and sharing of data. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology, and so on applied in the cloud computing business model; it can form a resource pool that is used on demand in a flexible and convenient way. Cloud computing technology will become an important support: the background services of technical network systems, such as video websites, picture websites, and other portal websites, require a large amount of computing and storage resources. With the rapid development and application of the internet industry, each item may in the future have its own identification mark, which needs to be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data require strong backing from the system, which can be realized through cloud computing.
Alternatively, the server 440 described above may also be implemented as a node in a blockchain system.
In some embodiments, the method provided by the embodiment of the application can be applied to a cloud game scene, so that the calculation of data logic in the game process is completed through a cloud server, and the terminal is responsible for displaying a game interface.
It should be noted that, the information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals related to the present application are all authorized by the user or are fully authorized by the parties, and the collection, use, and processing of the related data is required to comply with the relevant laws and regulations and standards of the relevant countries and regions. For example, the game data referred to in the present application are all acquired with sufficient authorization.
Referring to FIG. 5, a flowchart of a virtual object interaction method provided by an exemplary embodiment of the present application is shown. The method is described here as applied to a terminal by way of illustration. As shown in FIG. 5, the method includes:
step 501, displaying a second virtual object in a virtual scene within a line of sight of the first virtual object.
Wherein the first virtual object is in a virtual scene.
The first virtual object is a virtual object controlled by the current terminal, that is, the current terminal can control the first virtual object to move, perform actions, change form, and so on in the virtual scene.
In addition to the first virtual object, the virtual scene also includes other virtual objects, where the other virtual objects include hostile virtual objects of the first virtual object, or include teammate virtual objects of the first virtual object, or include both teammate virtual objects and hostile virtual objects of the first virtual object. In some embodiments, the other virtual objects further include a clone virtual object of the first virtual object, that is, the clone virtual object and the first virtual object correspond to the same object parameters but to different scene positions.
In the embodiment of the application, the virtual scene further includes a second virtual object. The second virtual object is a hostile virtual object of the first virtual object; or the second virtual object is a teammate virtual object of the first virtual object; or the second virtual object is any virtual object in the virtual scene that has neither a competing nor a cooperative relationship with the first virtual object. The second virtual object is within the line of sight of the first virtual object, that is, when the player observes the virtual scene from the perspective of the first virtual object, the second virtual object in the virtual scene can be observed.
Optionally, the player controls the activity of the first virtual object in the virtual scene through control operations. A control operation may be used to control the first virtual object to move in the virtual scene, perform actions, or change form, and may also be used to control the first virtual object to switch the virtual prop it currently holds, for example: a control operation used to control the first virtual object to switch the initially held virtual prop to the specified virtual prop.
Step 502, receiving an interaction triggering operation for a second virtual object under the condition that a first virtual object holds a specified virtual prop.
The interaction triggering operation is used to instruct the first virtual object to interact with other virtual objects or elements in the virtual scene through the specified virtual prop.
In some embodiments, the specified virtual prop is a virtual prop that the first virtual object equipped before the virtual match began; or the specified virtual prop is a virtual prop obtained by the first virtual object in the virtual scene (for example, by picking it up, exchanging points for it, or purchasing it); or the specified virtual prop is a virtual prop randomly configured for the first virtual object at the beginning of the virtual match; or the specified virtual prop is a virtual prop activated for the first virtual object when the first virtual object meets a requirement of the match. In the embodiment of the application, the virtual prop activated based on the match requirement is taken as an example for explanation.
Optionally, the interaction triggering operation is received on the basis of the specified virtual prop being aimed at the second virtual object; or the specified virtual prop is a virtual prop that can automatically aim at other virtual objects within the field of view, that is, when the interaction triggering operation is triggered, the system automatically aims at the second virtual object.
In some embodiments, the specified virtual prop has a limit on the number of uses, or a usage time limit, or a limit on the number of interactive elements, and so on, which is not limited in the embodiments of the present application.
In step 503, in response to the interaction triggering operation hitting the second virtual object, an interactive animation is displayed in which the interactive element is emitted to the second virtual object and attached to the second virtual object, where the interactive element is used to be emitted from the second virtual object to a third virtual object and to act on the third virtual object when a third virtual object exists within the preset range of the second virtual object.
When the interaction element acts on the third virtual object, an interaction effect between the first virtual object and the third virtual object is formed.
Optionally, the interactive element is configured to be emitted from the second virtual object to the third virtual object when a third virtual object exists within the preset range of the second virtual object and the third virtual object is outside the line of sight of the first virtual object. That is, the third virtual object is currently not within the line of sight of the first virtual object while the second virtual object is, so the first virtual object can only aim at the second virtual object; however, by attaching the interactive element to the second virtual object, interaction with the third virtual object outside the line of sight can be achieved.
Illustratively, the interactive element is transmitted from the second virtual object to the third virtual object after being attached to the second virtual object for a preset period of time.
FIG. 6 is a schematic diagram of an interactive element transfer process according to an exemplary embodiment of the present application. In the virtual scene 610, after the first virtual object emits the interactive element 630 at the second virtual object 620, the interactive element 630 is first attached to the second virtual object 620. Within the preset attachment duration, it is detected whether a third virtual object 640 exists within the preset range of the second virtual object 620, and when the third virtual object 640 exists, the effect of the interactive element 630 being emitted to the third virtual object is displayed.
Optionally, the third virtual object may be an object that is continuously within the preset range of the second virtual object within the preset duration, or may be an object that enters the preset range of the second virtual object at any time within the preset duration.
In some embodiments, the transfer of the interactive element includes at least one of the following modes (an illustrative sketch follows this list):
First, if, after the interactive element has been attached to the second virtual object for the preset duration, no third virtual object is detected within the preset range of the second virtual object, the interactive element acts directly on the second virtual object;
Second, if, after the interactive element has been attached to the second virtual object for the preset duration, no third virtual object is detected within the preset range of the second virtual object, the interactive element is recovered by the first virtual object;
Third, if, after the interactive element has been attached to the second virtual object for the preset duration, a third virtual object exists within the preset range of the second virtual object, the interactive element is automatically emitted to the third virtual object by default;
Fourth, if, after the interactive element has been attached to the second virtual object for the preset duration, a third virtual object exists within the preset range of the second virtual object, an interaction option is displayed, where the interaction option includes a first option and a second option: the first option is used to indicate that the interactive element acts on the second virtual object, and the second option is used to indicate that the interactive element acts on the third virtual object. If a selection operation on the first option is received, the interactive element is directly triggered to act on the second virtual object; if a selection operation on the second option is received, the interactive element is triggered to be emitted to the third virtual object and to act on the third virtual object.
It should be noted that the above transfer modes are merely illustrative examples, and the embodiments of the present application are not limited thereto.
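The following is a minimal, purely illustrative sketch of the attachment window and the transfer modes listed above; the class names, the default values, and the `find_targets_in_range` helper are assumptions introduced for this example and are not part of the original disclosure.

```python
import random
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    position: tuple  # (x, y, z) scene coordinates

@dataclass
class InteractiveElement:
    attached_to: VirtualObject    # the second virtual object acting as transfer medium
    attach_duration: float = 2.0  # preset attachment duration in seconds (assumed value)
    transfer_range: float = 10.0  # preset range around the medium (assumed value)

def distance(a: VirtualObject, b: VirtualObject) -> float:
    return sum((p - q) ** 2 for p, q in zip(a.position, b.position)) ** 0.5

def find_targets_in_range(medium: VirtualObject, candidates, preset_range: float):
    """Third virtual objects currently inside the preset range of the medium."""
    return [c for c in candidates if c is not medium and distance(medium, c) <= preset_range]

def resolve_transfer(element: InteractiveElement, candidates, mode: str = "auto"):
    """Decide what the interactive element does once the attachment window ends.

    mode = "act_on_medium": first mode, act directly on the second virtual object
    mode = "recycle":       second mode, the first virtual object recovers the element
    mode = "auto":          third mode, emit to a third virtual object by default
    mode = "ask":           fourth mode, return both options so the UI can display them
    """
    targets = find_targets_in_range(element.attached_to, candidates, element.transfer_range)
    if not targets:
        # No third virtual object within the preset range after the preset duration.
        return ("recycle", None) if mode == "recycle" else ("act_on_medium", element.attached_to)
    if mode == "ask":
        return ("show_options", {"first_option": element.attached_to,
                                 "second_option": targets[0]})
    return ("emit_to_third", random.choice(targets))
```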
In some embodiments, the first virtual object, the second virtual object, and the third virtual object have the same or different relationships with one another. The interactive element acts differently depending on these relationships.
Illustratively, if the first virtual object and the second virtual object are in a hostile relationship, and the first virtual object and the third virtual object are in a hostile relationship, the first virtual object attacks the second virtual object and/or the third virtual object using the second virtual object as a medium;
or, if the first virtual object and the second virtual object are in a teammate relationship, and the first virtual object and the third virtual object are in a hostile relationship, the first virtual object attacks the third virtual object using the teammate virtual object as a medium;
or, if the first virtual object and the second virtual object are in a hostile relationship, and the first virtual object and the third virtual object are in a teammate relationship, the first virtual object heals or applies a gain to the third virtual object using the hostile virtual object as a medium;
or, if the first virtual object and the second virtual object are in a teammate relationship, and the first virtual object and the third virtual object are in a teammate relationship, the first virtual object heals or applies a gain to the second virtual object and/or the third virtual object using the teammate virtual object as a medium.
It should be noted that the above-described case is merely an illustrative example, and the embodiment of the present application is not limited thereto.
Optionally, when there is no obstacle between the second virtual object and the third virtual object, the interactive element is triggered to be emitted from the second virtual object to the third virtual object and to act on the third virtual object.
In this embodiment, whether the second virtual object and the third virtual object are within the line of sight of the first virtual object, and whether an obstacle exists between the second virtual object and the third virtual object, are determined through collision detection lines.
Take as an example determining whether the third virtual object is within the line of sight of the first virtual object and whether an obstacle blocks the path between the second virtual object and the third virtual object. First, a first collision detection line between the first virtual object and the third virtual object is obtained, and a second collision detection line between the second virtual object and the third virtual object is obtained; in response to the first collision detection line indicating that an object collision exists and the second collision detection line indicating that no object collision occurs, it is determined that the interactive element is emitted from the second virtual object to the third virtual object.
In some embodiments, when determining through the first collision detection lines whether the line of sight between the first virtual object and the third virtual object is blocked, a plurality of body parts of the first virtual object and a plurality of body parts of the third virtual object are determined, and the body parts of the first virtual object are cross-connected with the body parts of the third virtual object to obtain a plurality of first collision detection lines. When the proportion of first collision detection lines indicating a collision with an object reaches a preset proportion threshold, this indicates that the third virtual object is not within the line of sight of the first virtual object.
When determining the obstacle situation between the second virtual object and the third virtual object through the second collision detection lines, the target body part on the second virtual object to which the interactive element is attached is determined first, a plurality of body parts of the third virtual object are determined, and the target body part of the second virtual object is connected with the body parts of the third virtual object to obtain a plurality of second collision detection lines. When at least one second collision detection line indicates that no object collision occurs, this indicates that an emission route for the interactive element exists between the second virtual object and the third virtual object. The interactive element is emitted along a second collision detection line on which no object collision occurs; when there are several such second collision detection lines, one of them is randomly selected as the emission path of the interactive element.
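The two kinds of collision detection lines described above can be sketched as follows. This is an illustrative reading only: `is_blocked(a, b)` stands in for the engine's actual segment-versus-obstacle query, and the 0.8 proportion threshold and the body-part lists are assumed values, not figures from the disclosure.

```python
import random

def out_of_line_of_sight(first_parts, third_parts, is_blocked, ratio_threshold=0.8):
    """First collision detection lines: cross-connect every body part of the first
    virtual object with every body part of the third virtual object; if the share
    of blocked lines reaches the preset proportion threshold, the third virtual
    object is treated as being outside the first object's line of sight."""
    lines = [(a, b) for a in first_parts for b in third_parts]
    blocked = sum(1 for a, b in lines if is_blocked(a, b))
    return blocked / len(lines) >= ratio_threshold

def pick_emission_path(attach_part, third_parts, is_blocked):
    """Second collision detection lines: connect the body part of the second virtual
    object to which the element is attached with each body part of the third virtual
    object; any unblocked line is a valid emission route, and one is chosen at random
    when several exist. Returns None if every line collides with an object."""
    clear_lines = [(attach_part, b) for b in third_parts if not is_blocked(attach_part, b)]
    return random.choice(clear_lines) if clear_lines else None

def should_emit_to_third(first_parts, third_parts, attach_part, is_blocked):
    """Emit from the second to the third virtual object only when the third object is
    out of the first object's sight and at least one second line is unobstructed."""
    if not out_of_line_of_sight(first_parts, third_parts, is_blocked):
        return None
    return pick_emission_path(attach_part, third_parts, is_blocked)

# Illustrative usage with points as (x, y, z) tuples and a hypothetical engine query:
# path = should_emit_to_third(first_parts, third_parts, attach_part,
#                             is_blocked=lambda a, b: engine.raycast_blocked(a, b))
```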
In summary, in the method provided by this embodiment, by triggering the specified virtual prop against the second virtual object, the interactive element is emitted to the second virtual object and attached to it, so that the second virtual object serves as a temporary transfer medium for the interactive element. When a third virtual object exists within the preset range of the second virtual object, the interactive element is further emitted from the second virtual object to the third virtual object, so interaction with the third virtual object can be achieved even if the third virtual object is not within the line of sight of the first virtual object, and human-computer interaction efficiency is improved.
According to the method provided by the embodiment, the interactive element is transmitted from the second virtual object to the third virtual object only when the third virtual object is out of the sight range of the first virtual object, so that the interactive efficiency between the first virtual object and the invisible virtual object is improved.
According to the method provided by the embodiment, the position relations among the first virtual object, the second virtual object and the third virtual object are determined through the plurality of collision detection lines, so that the man-machine interaction efficiency is improved.
In an alternative embodiment, the specified virtual prop is a prop that is activated in the virtual scene according to the player's performance in the match. FIG. 7 is a flowchart of an interaction method of a virtual object according to another exemplary embodiment of the present application; the method is described here as applied to a terminal by way of illustration. As shown in FIG. 7, the method includes:
Step 701, displaying a second virtual object in a virtual scene within a line of sight of the first virtual object.
Wherein the first virtual object is in a virtual scene.
The first virtual object is a virtual object controlled by the current terminal, that is, the current terminal can control the first virtual object to move, perform actions, change form, and so on in the virtual scene.
In the embodiment of the application, the virtual scene further includes a second virtual object. The second virtual object is a hostile virtual object of the first virtual object; or the second virtual object is a teammate virtual object of the first virtual object; or the second virtual object is any virtual object in the virtual scene that has neither a competing nor a cooperative relationship with the first virtual object.
Step 702, an activation progress element in a cooled state is displayed.
The activation progress element is used to indicate the activation progress of the specified virtual prop. An activation progress element currently in the cooled state is used to indicate that the specified virtual prop has not yet been activated, that is, it is in an unusable state.
Optionally, the activation progress of the activation progress element includes at least one of the following updating modes:
First, the activation progress of the specified virtual prop is updated automatically with the elapsed duration of the virtual match;
Second, the activation progress of the specified virtual prop is updated based on a specified interactive operation received in the virtual scene. That is, in response to receiving a specified interactive operation in the virtual scene, the activation progress in the activation progress element is updated based on the specified interactive operation.
In this embodiment, updating the activation progress according to the specified interactive operation is taken as an example. When the activation progress is updated according to the specified interactive operation, it can be updated according to the operation duration of the specified interactive operation; or according to the number of times the specified interactive operation is performed; or based on the interaction value by which the specified interactive operation affects other virtual objects.
Illustratively, in response to receiving a specified interactive operation performed on other virtual objects in the virtual scene, the interaction value generated by the specified interactive operation on the other virtual objects is acquired; based on the interaction value, the activation progress in the activation progress element is updated according to a preset proportion. For example, if the preset proportion is 20%, the activation progress is adjusted by 20% of the interaction value. Illustratively, the specified interactive operation is an attack by the first virtual object on other virtual objects through a specified skill; if the specified skill causes 100 points of damage to the other virtual objects, the activation progress increases by 20 points.
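As a purely illustrative sketch of the proportional update described above (the class name, the 20% preset proportion, and the 100-point threshold are assumptions taken from the example, not a prescribed implementation):

```python
class ActivationProgressElement:
    """Tracks the activation progress of the specified virtual prop."""

    def __init__(self, preset_proportion: float = 0.20, threshold: float = 100.0):
        self.preset_proportion = preset_proportion  # e.g. 20% of the interaction value
        self.threshold = threshold                  # specified progress threshold
        self.progress = 0.0
        self.state = "cooled"                       # "cooled" -> "activated"

    def on_specified_interaction(self, interaction_value: float) -> None:
        """Update the progress by the preset proportion of the interaction value."""
        self.progress = min(self.threshold,
                            self.progress + interaction_value * self.preset_proportion)
        if self.progress >= self.threshold:
            self.state = "activated"  # the specified virtual prop becomes usable

# Example matching the text: a specified skill dealing 100 points of damage adds
# 20 points of progress, so five such interactions activate the prop.
element = ActivationProgressElement()
for _ in range(5):
    element.on_specified_interaction(100)
assert element.state == "activated"
```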
Step 703, in response to the activation progress reaching a specified progress threshold, the activation progress element in the activated state is displayed.
The activation progress element in the activated state is used to indicate that the currently specified virtual prop is in a usable state.
In some embodiments, when the activation progress reaches 100%, an activation progress element in an activated state is displayed.
Illustratively, as shown in FIG. 8, an activation progress element 810 in the cooled state is displayed in the virtual environment interface, and the progress value in the activation progress element 810 is updated as the first virtual object releases skills in the virtual scene. Optionally, the activation progress is updated starting from 0%. When the activation progress of the activation progress element 810 reaches 100%, the activation progress element 810 in the activated state is displayed; the activation progress element 810 in the activated state is used to indicate that the specified virtual prop is currently in a usable state.
Step 704, in response to receiving a trigger operation on the activation progress element in the activated state, displaying that the first virtual object holds the specified virtual prop.
Optionally, when the activation progress element is in the activated state, the activation progress element is a touchable element, that is, by touching the activation progress element, the first virtual object can be triggered to switch to holding the specified virtual prop.
Illustratively, as shown in FIG. 9, an activation progress element 900 in the activated state is displayed in the virtual environment interface, and when a trigger operation on the activation progress element 900 is received, the first virtual object is displayed holding the specified virtual prop 910.
Step 705, receiving an interaction trigger operation for the second virtual object in case the first virtual object holds the specified virtual prop.
The interaction triggering operation is used to instruct the first virtual object to interact with other virtual objects or elements in the virtual scene through the specified virtual prop.
Optionally, the interaction triggering operation is received on the basis of the specified virtual prop being aimed at the second virtual object; or the specified virtual prop is a virtual prop that can automatically aim at other virtual objects within the field of view, that is, when the interaction triggering operation is triggered, the system automatically aims at the second virtual object.
In some embodiments, the specified virtual prop has a limit on the number of uses, or a usage time limit, or a limit on the number of interactive elements, and so on, which is not limited in the embodiments of the present application.
In step 706, in response to the interaction triggering operation hitting the second virtual object, an interactive animation is displayed in which the interactive element is emitted to the second virtual object and attached to the second virtual object, where the interactive element is used to be emitted from the second virtual object to a third virtual object and to act on the third virtual object when a third virtual object exists within the preset range of the second virtual object.
When the interaction element acts on the third virtual object, an interaction effect between the first virtual object and the third virtual object is formed.
Optionally, the interactive element is configured to be emitted from the second virtual object to the third virtual object when a third virtual object exists within the preset range of the second virtual object and the third virtual object is outside the line of sight of the first virtual object. That is, the third virtual object is currently not within the line of sight of the first virtual object while the second virtual object is, so the first virtual object can only aim at the second virtual object; however, by attaching the interactive element to the second virtual object, interaction with the third virtual object outside the line of sight can be achieved.
In some embodiments, when a third virtual object exists within the preset range of the second virtual object and the third virtual object is outside the line of sight of the first virtual object, an emission animation of the interactive element being emitted from the second virtual object to the third virtual object is displayed.
In some embodiments, a specified obstacle exists between the first virtual object and the third virtual object, where the specified obstacle may be a virtual object in the virtual scene or a terrain element in the virtual scene. When the emission animation is displayed, the display of the animation would be hindered by the occlusion of the specified obstacle. In the embodiment of the application, the specified obstacle is therefore displayed with a preset transparency, and the emission animation of the interactive element from the second virtual object to the third virtual object is displayed through the specified obstacle displayed with the preset transparency.
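By way of illustration, the occlusion handling described above could look like the following sketch, assuming a renderable obstacle object exposing an `opacity` attribute; the attribute name, the 0.3 value, and the animation call in the comment are assumptions, not the actual engine interface.

```python
from contextlib import contextmanager

@contextmanager
def obstacle_with_preset_transparency(obstacle, preset_transparency=0.3):
    """Temporarily render the specified obstacle with a preset transparency so the
    emission animation from the second to the third virtual object remains visible."""
    original_opacity = obstacle.opacity
    obstacle.opacity = preset_transparency
    try:
        yield obstacle
    finally:
        obstacle.opacity = original_opacity  # restore once the emission animation ends

# Illustrative usage while the emission animation plays:
# with obstacle_with_preset_transparency(specified_obstacle):
#     play_emission_animation(second_virtual_object, third_virtual_object)
```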
In summary, in the method provided by this embodiment, by triggering the specified virtual prop against the second virtual object, the interactive element is emitted to the second virtual object and attached to it, so that the second virtual object serves as a temporary transfer medium for the interactive element. When a third virtual object exists within the preset range of the second virtual object, the interactive element is further emitted from the second virtual object to the third virtual object, so interaction with the third virtual object can be achieved even if the third virtual object is not within the line of sight of the first virtual object, and human-computer interaction efficiency is improved.
In the method provided by this embodiment, the specified virtual prop is provided to the first virtual object through activation, which increases the diversity of ways in which virtual props can be provided to virtual objects, improves the player's awareness of the specified virtual prop, and improves the utilization efficiency of the specified virtual prop.
In the method provided by this embodiment, the activation progress element is updated in proportion to the interaction value of the specified interactive operation performed by the first virtual object on other virtual objects in the virtual scene, so that the activation of the specified virtual prop is associated with the specified interactive operation, which improves operation efficiency and the player's awareness of the specified virtual prop.
In the method provided by this embodiment, the specified obstacle is displayed with a preset transparency, which improves the presentation of the interaction between the first virtual object and the third virtual object and improves human-computer interaction efficiency.
In an alternative embodiment, a plurality of third virtual objects may be included within the preset range of the second virtual object. FIG. 10 is a flowchart of an interaction method of a virtual object according to another exemplary embodiment of the present application; the method is described here as applied to a terminal by way of illustration. As shown in FIG. 10, the method includes:
Step 1001, displaying a second virtual object in a virtual scene within a line of sight of the first virtual object.
Wherein the first virtual object is in a virtual scene.
The first virtual object is a virtual object controlled by the current terminal, that is, the current terminal can control the first virtual object to move, perform actions, change form, and so on in the virtual scene.
In the embodiment of the application, the virtual scene further includes a second virtual object. The second virtual object is a hostile virtual object of the first virtual object; or the second virtual object is a teammate virtual object of the first virtual object; or the second virtual object is any virtual object in the virtual scene that has neither a competing nor a cooperative relationship with the first virtual object.
Step 1002, receiving an interaction trigger operation for a second virtual object under the condition that a first virtual object holds a specified virtual prop.
The interaction triggering operation is used to instruct the first virtual object to interact with other virtual objects or elements in the virtual scene through the specified virtual prop.
Optionally, the interaction triggering operation is received on the basis of the specified virtual prop being aimed at the second virtual object; or the specified virtual prop is a virtual prop that can automatically aim at other virtual objects within the field of view, that is, when the interaction triggering operation is triggered, the system automatically aims at the second virtual object.
In some embodiments, the specified virtual prop has a limit on the number of uses, or a usage time limit, or a limit on the number of interactive elements, and so on, which is not limited in the embodiments of the present application.
Optionally, the specified virtual prop is a virtual prop that is activated through the activation mechanism after the player equips the specified skill prop.
In step 1003, in response to the interaction triggering operation hitting the second virtual object, an interaction animation is displayed in which the interaction element is emitted to the second virtual object and attached to the second virtual object.
The interactive element is used for transmitting from the second virtual object to the third virtual object and acting on the third virtual object under the condition that the third virtual object exists in the preset range of the second virtual object.
When the interaction element acts on the third virtual object, an interaction effect between the first virtual object and the third virtual object is formed.
Optionally, the interactive element is configured to be emitted from the second virtual object to the third virtual object when a third virtual object exists within the preset range of the second virtual object and the third virtual object is outside the line of sight of the first virtual object. That is, the third virtual object is currently not within the line of sight of the first virtual object while the second virtual object is, so the first virtual object can only aim at the second virtual object; however, by attaching the interactive element to the second virtual object, interaction with the third virtual object outside the line of sight can be achieved.
In step 1004, when a third virtual object exists within the preset range of the second virtual object and the third virtual object is outside the line of sight of the first virtual object, an emission animation of the interactive element being emitted from the second virtual object to the third virtual object is displayed.
Optionally, the third virtual object may be directly within the preset range of the second virtual object, or indirectly within the preset range of the second virtual object. Being indirectly within the preset range means that an intermediate third virtual object exists within the direct preset distance range of the second virtual object, and another third virtual object exists within the direct preset distance range of that intermediate third virtual object; both third virtual objects are then within the preset range of the second virtual object, where the intermediate third virtual object is within the direct preset range of the second virtual object and the other third virtual object is within the indirect preset range of the second virtual object. When yet another third virtual object exists within the preset distance range of the other third virtual object, that other third virtual object can in turn serve as an intermediate third virtual object.
Optionally, when a plurality of third virtual objects are included within the direct preset range of the second virtual object, the foregoing traversal is performed over the plurality of third virtual objects to obtain the total number of third virtual objects.
Alternatively, the interactive element is used to randomly traverse to a first third virtual object within the specified distance range of the second virtual object, and then to randomly traverse to a second third virtual object within the specified distance range of the first third virtual object, and so on until the plurality of third virtual objects have all been traversed.
In step 1005, quantity prompt information is displayed, where the quantity prompt information is used to indicate the number of third virtual objects traversed by the interactive element based on the preset range.
Optionally, the second virtual object includes a third virtual object A within its preset distance range, the third virtual object A includes a third virtual object B within its preset distance range, the third virtual object B includes a third virtual object C within its preset distance range, and so on; in this way the number of third virtual objects within the preset range of the second virtual object is determined.
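The chained direct/indirect range described above (object A within range of the second object, B within range of A, C within range of B, and so on) can be read as a breadth-first walk over the "within the preset distance" relation. The sketch below follows that reading; the function names and the distance callback are illustrative assumptions.

```python
from collections import deque

def traverse_third_objects(second_object, candidates, preset_distance, distance_fn):
    """Collect every third virtual object reachable from the second virtual object
    through a chain of 'within the preset distance' hops, i.e. objects in its direct
    or indirect preset range, in the order in which they are traversed."""
    visited = {id(second_object)}
    queue = deque([second_object])
    traversed = []
    while queue:
        current = queue.popleft()
        for candidate in candidates:
            if id(candidate) in visited:
                continue
            if distance_fn(current, candidate) <= preset_distance:
                visited.add(id(candidate))
                traversed.append(candidate)  # within the direct or indirect preset range
                queue.append(candidate)      # may in turn serve as an intermediate third object
    return traversed

# The quantity prompt information of step 1005 could then display len(traversed),
# i.e. the number of third virtual objects traversed based on the preset range.
```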
In some embodiments, a specified obstacle exists between the first virtual object and the third virtual object, where the specified obstacle may be a virtual object in the virtual scene or a terrain element in the virtual scene. When the emission animation is displayed, the display of the animation would be hindered by the occlusion of the specified obstacle. In the embodiment of the application, the specified obstacle is therefore displayed with a preset transparency, and the emission animation of the interactive element from the second virtual object to the third virtual object is displayed through the specified obstacle displayed with the preset transparency.
In summary, in the method provided by this embodiment, by triggering the specified virtual prop against the second virtual object, the interactive element is emitted to the second virtual object and attached to it, so that the second virtual object serves as a temporary transfer medium for the interactive element. When a third virtual object exists within the preset range of the second virtual object, the interactive element is further emitted from the second virtual object to the third virtual object, so interaction with the third virtual object can be achieved even if the third virtual object is not within the line of sight of the first virtual object, and human-computer interaction efficiency is improved.
In the method provided by this embodiment, the number of third virtual objects within the preset range of the second virtual object is indicated through the quantity prompt information, which effectively prompts the player about third virtual objects beyond the first virtual object's reach, improves the efficiency of information transmission through the interface, and improves human-computer interaction efficiency.
FIG. 11 is a schematic overall flow chart of virtual object interaction according to an exemplary embodiment of the present application, as shown in FIG. 11, the method includes the following steps:
step 1101, the player equips a skill weapon.
Optionally, the skill weapon is a virtual weapon for firing a bullet that performs a space-apart attack, that is, an attack that can reach targets beyond the shooter's direct line of sight. Optionally, the skill weapon is a virtual weapon the player equips before the game begins, or a virtual weapon the player equips after the game starts; this embodiment does not limit this.
Step 1102, a determination is made as to whether a skills weapon is activated.
Alternatively, the skill weapon is activated progressively as the player releases skills in the virtual game. Illustratively, each time the player releases a specified skill in the virtual game, a specified amount is added to the activation progress of the skill weapon; for example, each release of a specified skill increases the activation value of the skill weapon by 20%, so the skill weapon becomes activated after the player has released 5 specified skills.
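A small illustrative sketch of this activation bookkeeping, assuming, as in the example, a 20% increment per specified skill release and a 100% activation threshold (the class and method names are invented for illustration):

```python
class SkillWeapon:
    """Tracks activation progress of the skill weapon, in whole percentage points."""

    def __init__(self, increment_per_skill=20):  # +20% per specified skill release, as in the example
        self.progress = 0                        # 0..100
        self.increment = increment_per_skill

    @property
    def activated(self):
        return self.progress >= 100              # assumed specified progress threshold

    def on_specified_skill_released(self):
        """Called each time the player releases a specified skill in the virtual game."""
        if not self.activated:
            self.progress = min(100, self.progress + self.increment)


weapon = SkillWeapon()
for _ in range(5):                               # five specified skill releases
    weapon.on_specified_skill_released()
print(weapon.activated)                          # -> True: the skill icon would now be highlighted
```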
At step 1103, the skill icon is highlighted when the skill weapon is activated.
When the skill weapon is activated, the skill icon is displayed in the interface in an activated state, i.e., highlighted. The highlighted skill icon is touch-enabled, and the skill weapon can be triggered through a trigger operation on the highlighted skill icon.
Step 1104, it is determined whether the icon is clicked for use.
It is determined whether a trigger operation on the skill icon is received; receiving the trigger operation on the skill icon indicates that the skill weapon is to be used.
Step 1105, when the click-to-use operation is received, the skill weapon is switched out.
When the click operation on the skill icon is received, a picture of the master virtual object switching to and using the skill weapon is displayed.
Step 1106, a determination is made as to whether to fire.
Optionally, a firing control is displayed in the virtual environment interface; when a trigger operation on the firing control is received, it indicates that the skill weapon fires a virtual bullet.
Step 1107, when firing, fire the bullet.
Optionally, the virtual bullet is fired upon a trigger operation on the firing control, where the firing of the virtual bullet is based on automatic aiming by the skill weapon, or on the player's aiming at the target.
Step 1108, determine if there is a hit.
It is determined whether the virtual bullet hits the aimed target.
In step 1109, the bullet is stuck on the target when hit.
Alternatively, when the virtual bullet hits the target, an effect of the virtual bullet attaching to the target is displayed first, indicating that the target may serve as a transfer medium or as the acting target of the virtual bullet. Whether the target is hit is detected through a collision detection line.
When the player starts shooting, a ray is emitted from the muzzle; its distance is configured by the designers and may differ for each skill weapon. As shown in FIG. 12, the virtual scene includes a target object 1200, and a collision detection ray 1210 is emitted based on the shooting direction of the master virtual object and the configuration information of the skill weapon. If the ray detects a first target (a character or an obstacle), information about the hit target is returned, and it is then checked whether the target is a character. If a character is detected, body-part detection starts. As shown in FIG. 13, each body part of the target object 1300 carries a collision box; many such parts are mounted on the model, each with its own collision box information, and when a part is struck by the ray its information can be obtained from the collision box. The hit body part is then derived from this information, and the bullet is attached directly to the corresponding part, so that the bullet always moves along with the target.
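The hit-resolution logic just described might be sketched as follows; the `RayHit` record and the `resolve_shot` helper are assumptions standing in for the engine's physics query, which would normally return the first collision box struck along the muzzle ray:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RayHit:
    distance: float                   # distance from the muzzle along the firing direction
    is_character: bool                # True for a person, False for terrain or other obstacles
    body_part: Optional[str] = None   # collision box that was struck, e.g. "torso"
    target_id: Optional[int] = None


def resolve_shot(hits, max_range):
    """Return (target_id, body_part) when the first thing the ray meets within the weapon's
    configured range is a character; otherwise return None (the bullet has no effect)."""
    in_range = [h for h in hits if h.distance <= max_range]
    if not in_range:
        return None
    first = min(in_range, key=lambda h: h.distance)  # the ray stops at the first hit
    if first.is_character:
        return first.target_id, first.body_part      # bullet attaches to this collision box
    return None


# usage: an obstacle behind the enemy does not block the shot, one in front would
hits = [RayHit(12.0, False), RayHit(8.0, True, "torso", target_id=7)]
print(resolve_shot(hits, max_range=50.0))  # -> (7, 'torso'); the bullet sticks to the torso and follows it
```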
At step 1110, it is determined whether other targets are detected.
It is determined whether other targets exist within a preset distance range around the hit target.
Optionally, with the first hit target as the center, the positions of other enemies are acquired in real time, and the distances between the hit target and the other enemies are calculated. If a candidate target is detected within the detection range, ray detection is performed. As shown in FIG. 14, when the target object 1400 is detected within the preset distance range of the object 1410, ray detection is performed between the two objects to confirm that no obstacle blocks the path between them. The ray detection emits a detection ray from the target object 1400 to the object 1410; if no obstacle is detected in between, the condition for the space-apart attack is met, the bullet is then emitted from the object 1410 to the target object 1400, and if it hits, the virtual bullet acts on the target object 1400 and explodes.
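A hedged sketch of this relay check (the sampled occlusion test below merely stands in for the engine's ray detection; the detection radius and all helper names are assumptions):

```python
import math

DETECTION_RADIUS = 15.0  # assumed; the real radius is configured per skill weapon


def has_line_of_sight(src_pos, dst_pos, obstacles, step=0.5):
    """Crude occlusion test standing in for ray detection: sample points along the segment
    and reject the pair if any sample falls inside an obstacle sphere (center, radius)."""
    length = math.dist(src_pos, dst_pos)
    steps = max(1, int(length / step))
    for i in range(steps + 1):
        t = i / steps
        p = tuple(s + (d - s) * t for s, d in zip(src_pos, dst_pos))
        for center, radius in obstacles:
            if math.dist(p, center) <= radius:
                return False
    return True


def pick_relay_target(attached_pos, enemies, obstacles, radius=DETECTION_RADIUS):
    """Return the first enemy that is both within the detection radius of the target the bullet
    is attached to and not occluded; None means the bullet acts on the current target."""
    for enemy in enemies:
        if math.dist(attached_pos, enemy["pos"]) <= radius and \
           has_line_of_sight(attached_pos, enemy["pos"], obstacles):
            return enemy
    return None


# usage
enemies = [{"id": 3, "pos": (10.0, 0.0, 0.0)}]
walls = [((5.0, 0.0, 0.0), 1.0)]                             # an obstacle sitting between the two
print(pick_relay_target((0.0, 0.0, 0.0), enemies, walls))    # -> None: blocked, act on current target
print(pick_relay_target((0.0, 0.0, 0.0), enemies, []))       # -> the enemy dict: clear path, relay the bullet
```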
At step 1111, when another target is detected, the bullet is emitted toward that target and acts on it.
Step 1112, when no other target is detected, the bullet acts on the current target.
When no other target is detected within the preset distance range of the hit target, the virtual bullet acts directly on the current target, for example by producing an explosion effect on the current target.
In summary, in the method provided by this embodiment, by triggering the use of the designated virtual prop against the second virtual object, the interactive element is emitted toward the second virtual object and attached to it, so that the second virtual object serves as a temporary transfer medium for the interactive element. When a third virtual object exists within the preset range of the second virtual object, the interactive element is further emitted from the second virtual object to the third virtual object, so that interaction with the third virtual object can be achieved even when the third virtual object is not within the line-of-sight range of the first virtual object, which improves human-machine interaction efficiency.
Fig. 15 is a block diagram of an interaction device for virtual objects according to an exemplary embodiment of the present application, and as shown in fig. 15, the device includes:
A display module 1510 for displaying a second virtual object in the virtual scene within the line of sight of the first virtual object;
a receiving module 1520, configured to receive an interaction triggering operation for the second virtual object in a case where the first virtual object holds a specified virtual prop, where the specified virtual prop is used to transmit an interaction element;
The display module 1510 is further configured to, in response to the interaction triggering operation hitting the second virtual object, display an interaction animation in which the interaction element is emitted to the second virtual object and attached to the second virtual object, where the interaction element is configured to be emitted from the second virtual object to a third virtual object and act on the third virtual object when the third virtual object exists within a preset range of the second virtual object, so as to form an interaction effect between the first virtual object and the third virtual object.
In an optional embodiment, the interactive element is configured to transmit from the second virtual object to the third virtual object when a third virtual object exists within a preset range of the second virtual object and the third virtual object is out of a line of sight of the first virtual object.
In an alternative embodiment, as shown in fig. 16, the apparatus further comprises:
an acquiring module 1530, configured to acquire a first collision detection line between the first virtual object and the third virtual object; acquiring a second collision detection line between the second virtual object and the third virtual object;
a determining module 1540 is configured to determine that the interactive element is emitted from the second virtual object to the third virtual object in response to the first collision detection line indicating that there is an object collision, the second collision detection line indicating that no object collision has occurred.
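Read as code, the determining module's rule is a simple two-ray predicate; a minimal sketch with boolean stand-ins for the engine's collision queries (the function name and parameters are illustrative assumptions):

```python
def should_relay(first_line_blocked: bool, second_line_clear: bool) -> bool:
    """Relay the interactive element from the second to the third virtual object only when the
    first collision detection line (first object -> third object) reports a collision, while the
    second collision detection line (second object -> third object) reports none."""
    return first_line_blocked and second_line_clear


# usage: the third object is hidden from the first object but exposed to the second one
print(should_relay(first_line_blocked=True, second_line_clear=True))   # -> True: emit from the second object
print(should_relay(first_line_blocked=False, second_line_clear=True))  # -> False: the first object can act directly
```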
In an alternative embodiment, the display module 1510 is further configured to display an activation progress element in a cooling state, where the activation progress element is used to indicate the activation progress of the specified virtual prop; display the activation progress element in an activated state in response to the activation progress reaching a specified progress threshold; and display, in response to a trigger operation on the activation progress element in the activated state, that the first virtual object holds the specified virtual prop.
In an alternative embodiment, the display module 1510 is further configured to update the activation progress in the activation progress element based on a specified interaction operation in the virtual scene in response to receiving the specified interaction operation.
In an alternative embodiment, the apparatus further comprises:
An obtaining module 1530, configured to, in response to receiving a specified interaction operation with other virtual objects in the virtual scene, obtain an interaction value, generated by the specified interaction operation, between the virtual object and the other virtual objects;
The display module 1510 is further configured to update the activation progress in the activation progress element according to a preset ratio based on the interaction value.
In an optional embodiment, the display module 1510 is further configured to display an emission animation of the interactive element from the second virtual object to the third virtual object when a third virtual object exists within a preset range of the second virtual object and the third virtual object is outside the line of sight range of the first virtual object.
In an alternative embodiment, a specified obstacle exists between the first virtual object and the third virtual object;
The display module 1510 is further configured to display the specified obstacle with a preset transparency; and displaying the transmitting animation of the interactive element transmitted from the second virtual object to the third virtual object through the specified barrier displayed with preset transparency.
In an optional embodiment, the second virtual object includes a plurality of third virtual objects within a preset range;
The display module 1510 is further configured to display a quantity prompt, where the quantity prompt is configured to indicate a quantity of the third virtual objects traversed by the interactive element based on the preset range, and the interactive element is configured to traverse a plurality of third virtual objects within the preset range of the second virtual object based on the preset range, and act on at least one third virtual object of the plurality of third virtual objects.
In an optional embodiment, the interactive element is configured to randomly traverse the first third virtual object within the specified distance range of the second virtual object, and randomly traverse the second third virtual object within the specified distance range of the first third virtual object until the traversing is completed for the plurality of third virtual objects.
In summary, in the device provided by this embodiment, by triggering the use of the designated virtual prop against the second virtual object, the interactive element is emitted toward the second virtual object and attached to it, so that the second virtual object serves as a temporary transfer medium for the interactive element. When a third virtual object exists within the preset range of the second virtual object, the interactive element is further emitted from the second virtual object to the third virtual object, so that interaction with the third virtual object can be achieved even when the third virtual object is not within the line-of-sight range of the first virtual object, which improves human-machine interaction efficiency.
It should be noted that: in the interaction device for virtual objects provided in the above embodiment, only the division of the above functional modules is used as an example, and in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to perform all or part of the functions described above. In addition, the interaction device of the virtual object and the interaction method embodiment of the virtual object provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the interaction device of the virtual object are detailed in the method embodiment, which is not described herein again.
Fig. 17 shows a block diagram of a computer device 1700 provided by an exemplary embodiment of the application. The computer device 1700 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1700 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
In general, the computer device 1700 includes: a processor 1701 and a memory 1702.
The processor 1701 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1701 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1701 may also include a main processor and a coprocessor: the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1701 may integrate a GPU (Graphics Processing Unit) responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1701 may also include an AI processor for handling computing operations related to machine learning.
Memory 1702 may include one or more computer-readable storage media, which may be non-transitory. Memory 1702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1702 is used to store at least one instruction for execution by processor 1701 to implement the method of interaction of virtual objects provided by the method embodiments of the present application.
In some embodiments, the computer device 1700 also includes other components. Those skilled in the art will appreciate that the structure illustrated in FIG. 17 does not limit the computer device 1700, which may include more or fewer components than illustrated, combine certain components, or adopt a different arrangement of components.
Alternatively, the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disk, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM). The foregoing embodiment numbers of the present application are merely for description and do not represent the advantages or disadvantages of the embodiments.
The embodiment of the application also provides a computer device, which comprises a processor and a memory, wherein at least one instruction, at least one section of program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one section of program, the code set or the instruction set is loaded and executed by the processor to realize the interaction method of the virtual object according to any one of the embodiment of the application.
The embodiment of the application also provides a computer readable storage medium, wherein at least one instruction, at least one section of program, a code set or an instruction set is stored in the storage medium, and the at least one instruction, the at least one section of program, the code set or the instruction set is loaded and executed by a processor to realize the interaction method of the virtual object according to any one of the embodiments of the application.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the interaction method of the virtual object according to any of the above embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing describes preferred embodiments of the present application and is not intended to limit the application; the scope of protection of the application is defined by the appended claims.

Claims (14)

1. A method of interaction of virtual objects, the method comprising:
displaying a second virtual object in the virtual scene within the line of sight of the first virtual object;
Receiving an interaction triggering operation aiming at the second virtual object under the condition that the first virtual object holds a designated virtual prop, wherein the designated virtual prop is used for transmitting an interaction element;
And in response to the interaction triggering operation hitting the second virtual object, displaying an interaction animation in which the interaction element is emitted to the second virtual object and attached to the second virtual object, wherein the interaction element is used for being emitted from the second virtual object to a third virtual object and acting on the third virtual object when the third virtual object exists within a preset range of the second virtual object, so as to form an interaction effect between the first virtual object and the third virtual object.
2. The method according to claim 1, wherein
The interactive element is used for transmitting from the second virtual object to the third virtual object under the condition that the third virtual object exists in the preset range of the second virtual object and is out of the sight range of the first virtual object.
3. The method according to claim 2, wherein the method further comprises:
acquiring a first collision detection line between the first virtual object and the third virtual object;
acquiring a second collision detection line between the second virtual object and the third virtual object;
In response to the first collision detection line indicating that an object collision exists, the second collision detection line indicating that no object collision has occurred, determining that the interactive element is emitted from the second virtual object to the third virtual object.
4. The method according to any one of claims 1 to 3, wherein, in the case where the first virtual object holds the specified virtual prop, before the receiving of the interaction triggering operation for the second virtual object, the method further comprises:
Displaying an activation progress element in a cooling state, wherein the activation progress element is used for indicating the activation progress of the specified virtual prop;
Displaying an activation progress element in an activated state in response to the activation progress reaching a specified progress threshold;
And in response to a trigger operation on the activation progress element in the activated state, displaying that the first virtual object holds the specified virtual prop.
5. The method according to claim 4, wherein the method further comprises:
in response to receiving a specified interactive operation in the virtual scene, an activation progress in the activation progress element is updated based on the specified interactive operation.
6. The method according to claim 5, wherein the updating the activation progress in the activation progress element based on the specified interactive operation in response to receiving the specified interactive operation in the virtual scene comprises:
in response to receiving a specified interaction operation with other virtual objects in the virtual scene, acquiring an interaction value, generated by the specified interaction operation, between the virtual object and the other virtual objects;
and updating the activation progress in the activation progress element according to a preset proportion based on the interaction value.
7. A method according to any one of claims 1 to 3, wherein said displaying an interactive animation of said interactive element being transmitted to and attached to said second virtual object in response to said interactive triggering operation hitting said second virtual object, further comprises:
And displaying a transmitting animation of the interactive element transmitted from the second virtual object to the third virtual object under the condition that the third virtual object exists in the preset range of the second virtual object and is out of the sight range of the first virtual object.
8. The method of claim 7, wherein a specified obstacle exists between the first virtual object and the third virtual object;
The displaying the launching animation of the interactive element from the second virtual object to the third virtual object comprises:
displaying the designated obstacle with a preset transparency;
And displaying the transmitting animation of the interactive element transmitted from the second virtual object to the third virtual object through the specified barrier displayed with preset transparency.
9. The method of claim 7, wherein the second virtual object includes a plurality of third virtual objects within a preset range;
after the display of the launching animation of the interactive element from the second virtual object to the third virtual object, the method further comprises:
Displaying quantity prompt information, wherein the quantity prompt information is used for indicating the quantity of the third virtual objects traversed by the interaction element based on the preset range, and the interaction element is used for traversing a plurality of third virtual objects in the preset range of the second virtual object based on the preset range and acting on at least one third virtual object in the plurality of third virtual objects.
10. The method according to claim 9, wherein
The interactive element is used for randomly traversing the first third virtual object within the specified distance range of the second virtual object, and randomly traversing the second third virtual object within the specified distance range of the first third virtual object until the traversing is completed on the plurality of third virtual objects.
11. An interactive apparatus for virtual objects, the apparatus comprising:
the display module is used for displaying a second virtual object in the virtual scene in the sight range of the first virtual object;
The receiving module is used for receiving interaction triggering operation aiming at the second virtual object under the condition that the first virtual object holds a specified virtual prop, and the specified virtual prop is used for transmitting interaction elements;
The display module is further configured to, in response to the interaction triggering operation hitting the second virtual object, display an interactive animation in which the interactive element is emitted to the second virtual object and attached to the second virtual object, wherein the interactive element is used for being emitted from the second virtual object to a third virtual object and acting on the third virtual object when the third virtual object exists within a preset range of the second virtual object, so as to form an interactive effect between the first virtual object and the third virtual object.
12. A computer device comprising a processor and a memory having stored therein at least one instruction that is loaded and executed by the processor to implement the method of interaction of virtual objects according to any of claims 1 to 10.
13. A computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the method of interaction of virtual objects of any of claims 1 to 10.
14. A computer program product comprising a computer program or instructions which, when executed by a processor, implement the method of interaction of virtual objects as claimed in any one of claims 1 to 10.
CN202211384497.2A 2022-11-07 2022-11-07 Virtual object interaction method, device, equipment, medium and program product Pending CN118022330A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211384497.2A CN118022330A (en) 2022-11-07 2022-11-07 Virtual object interaction method, device, equipment, medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211384497.2A CN118022330A (en) 2022-11-07 2022-11-07 Virtual object interaction method, device, equipment, medium and program product

Publications (1)

Publication Number Publication Date
CN118022330A true CN118022330A (en) 2024-05-14

Family

ID=91002906

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211384497.2A Pending CN118022330A (en) 2022-11-07 2022-11-07 Virtual object interaction method, device, equipment, medium and program product

Country Status (1)

Country Link
CN (1) CN118022330A (en)

Similar Documents

Publication Publication Date Title
JP7476235B2 (en) Method, apparatus, and computer program for controlling virtual objects
WO2021139371A1 (en) Virtual object control method, device, terminal, and storage medium
WO2022017063A1 (en) Method and apparatus for controlling virtual object to recover attribute value, and terminal and storage medium
US20230013014A1 (en) Method and apparatus for using virtual throwing prop, terminal, and storage medium
CN111111171B (en) Operation control method, operation control device, storage medium, and electronic device
WO2021244209A1 (en) Virtual object control method and apparatus, and terminal and storage medium
US20230054065A1 (en) Delivery of virtual effect
CN111084986A (en) Display control method, display control device, storage medium, and electronic device
US20230052088A1 (en) Masking a function of a virtual object using a trap in a virtual environment
WO2022156491A1 (en) Virtual object control method and apparatus, and device, storage medium and program product
US20230330530A1 (en) Prop control method and apparatus in virtual scene, device, and storage medium
US20230078571A1 (en) Information display method, apparatus, electronic device, computer-readable storage medium and computer program product
CN111359206A (en) Virtual object control method, device, terminal and storage medium
JP2023164787A (en) Picture display method and apparatus for virtual environment, and device and computer program
CN111202983A (en) Method, device, equipment and storage medium for using props in virtual environment
WO2024093940A1 (en) Method and apparatus for controlling virtual object group in virtual scene, and product
CN113893542A (en) Object control method and apparatus, storage medium, computer program, and electronic device
CN112121428B (en) Control method and device for virtual character object and storage medium
US20230030619A1 (en) Method and apparatus for displaying aiming mark
CN113694515B (en) Interface display method, device, terminal and storage medium
CN111111165A (en) Control method and device of virtual prop, storage medium and electronic device
CN113599822B (en) Virtual prop control method and device, storage medium and electronic equipment
CN113730908B (en) Picture display method and device, storage medium and electronic equipment
CN118022330A (en) Virtual object interaction method, device, equipment, medium and program product
CN117298580A (en) Virtual object interaction method, device, equipment, medium and program product

Legal Events

Date Code Title Description
PB01 Publication