CN117298580A - Virtual object interaction method, device, equipment, medium and program product - Google Patents
- Publication number
- CN117298580A (Application CN202210726519.2A)
- Authority
- CN
- China
- Prior art keywords
- virtual
- virtual object
- prop
- interaction
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
- A63F13/5372—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8076—Shooting
Abstract
The application discloses a virtual object interaction method, apparatus, device, medium, and program product, relating to the field of interface interaction. The method includes: displaying an aiming range indication element of a designated virtual prop while a first virtual object holds the designated virtual prop; displaying a target virtual object among at least one second virtual object in a highlighted mode; and displaying the result of the interaction between a plurality of virtual bullets and the target virtual object. Because the aiming range indication element is displayed whenever the designated virtual prop is held, the player can see which target virtual object the prop is aimed at, and after the virtual bullets are fired their interaction with the target virtual object can be tracked reliably. This avoids the situation in which multiple virtual bullets scatter blindly in random directions, forcing the player to repeat the operation many times and lowering human-computer interaction efficiency.
Description
Technical Field
The embodiment of the application relates to the field of interface interaction, in particular to an interaction method, device, equipment, medium and program product of a virtual object.
Background
In applications that include virtual scenes, a user can typically control a virtual object to move within the virtual scene or interact with other virtual objects. For example, in a game, a player can control a virtual object to fight Non-Player Characters (NPCs) or virtual objects controlled by other players in the virtual scene.
In the related art, when a player attacks with a virtual shotgun, the virtual bullets it fires are divergent projectiles: they scatter through the virtual scene in multiple directions, attacking hostile virtual objects along those directions.
However, because of this divergence, the player cannot confirm how a hostile virtual object was hit, or even which hostile virtual object was hit, and must therefore perform the attack operation many times to achieve the intended result. The interface conveys little information, and human-computer interaction efficiency is low.
Disclosure of Invention
The embodiments of the application provide a virtual object interaction method, apparatus, device, medium, and program product that can improve the human-computer interaction efficiency of a master virtual object interacting in a virtual scene. The technical scheme is as follows:
In one aspect, a method for interaction of virtual objects is provided, the method comprising:
receiving a control operation on a first virtual object, the first virtual object being in a virtual scene;
displaying an aiming range indication element of a designated virtual prop while the first virtual object holds the designated virtual prop, wherein the designated virtual prop is used for simultaneously firing a plurality of virtual bullets through a plurality of firing channels to interact with other virtual objects, and the aiming range indication element indicates the firing range of the designated virtual prop when firing the virtual bullets;
in response to the firing range indicated by the aiming range indication element including at least one second virtual object, displaying a target virtual object among the at least one second virtual object in a highlighted mode, the highlighted mode indicating that the target virtual object is interactively locked by the designated virtual prop;
and in response to receiving a firing control operation on the designated virtual prop, displaying the result of the interaction between the plurality of virtual bullets fired by the designated virtual prop and the target virtual object.
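As an illustrative sketch only, and not the claimed implementation, the four steps above can be organized as a range test followed by a target lock; all class and function names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    position: tuple  # (x, y) interface coordinates
    highlighted: bool = False

def objects_in_range(center, radius, candidates):
    """Return the second virtual objects inside the circular firing range."""
    cx, cy = center
    return [o for o in candidates
            if (o.position[0] - cx) ** 2 + (o.position[1] - cy) ** 2 <= radius ** 2]

def lock_target(center, radius, candidates):
    """Highlight one in-range object as the locked target, or return None."""
    for o in candidates:
        o.highlighted = False          # clear any previous lock
    in_range = objects_in_range(center, radius, candidates)
    if not in_range:
        return None
    target = in_range[0]               # e.g. the first in-range candidate
    target.highlighted = True          # the 'highlighted mode' marks the lock
    return target

# usage: one enemy inside the range, one far outside it
enemies = [VirtualObject("a", (10, 0)), VirtualObject("b", (300, 300))]
locked = lock_target(center=(0, 0), radius=50, candidates=enemies)
assert locked is enemies[0] and locked.highlighted
```

A firing operation would then resolve all bullets against the returned `target` rather than scattering them.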
In another aspect, an interactive device for a virtual object is provided, the device including:
The receiving module is configured to receive a control operation on a first virtual object, the first virtual object being in a virtual scene;
the display module is configured to display an aiming range indication element of a designated virtual prop while the first virtual object holds the designated virtual prop, wherein the designated virtual prop is used for simultaneously firing a plurality of virtual bullets through a plurality of firing channels to interact with other virtual objects, and the aiming range indication element indicates the firing range of the designated virtual prop when firing the virtual bullets;
the display module is further configured to display, in response to the firing range indicated by the aiming range indication element including at least one second virtual object, a target virtual object among the at least one second virtual object in a highlighted mode, the highlighted mode indicating that the target virtual object is interactively locked by the designated virtual prop;
the display module is further configured to display, in response to receiving a firing control operation on the designated virtual prop, the result of the interaction between the plurality of virtual bullets fired by the designated virtual prop and the target virtual object.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by the processor to implement the method for interaction of virtual objects according to any one of the embodiments of the present application.
In another aspect, a computer readable storage medium is provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by a processor to implement a method for interaction of virtual objects as described in any of the embodiments of the application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the interaction method of the virtual object according to any of the above embodiments.
The technical scheme provided by the embodiments of the application yields at least the following beneficial effects:
Because the aiming range indication element is displayed whenever the designated virtual prop is held, the player can see which target virtual object the prop is aimed at, and after the virtual bullets are fired their interaction with the target virtual object can be tracked reliably. This avoids the unstable interaction that results when multiple virtual bullets fly blindly in random directions, which forces the player to repeat the operation many times and lowers human-computer interaction efficiency; locking the interaction improves the effectiveness of the interaction between the virtual bullets and the target virtual object.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic illustration of interaction of virtual objects provided in an exemplary embodiment of the present application;
FIG. 2 is an interactive schematic view of a virtual object provided in another exemplary embodiment of the present application;
fig. 3 is a block diagram of a terminal according to an exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a method of interaction of virtual objects provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a display of identification points on a target virtual object provided based on the embodiment shown in FIG. 5;
FIG. 7 is a schematic diagram of overlaying display elements on a target virtual object provided based on the embodiment shown in FIG. 5;
FIG. 8 is a flowchart of a method of interaction of virtual objects provided in another exemplary embodiment of the present application;
FIG. 9 is a schematic diagram of switching the target virtual object by a motion control, provided based on the embodiment shown in FIG. 8;
FIG. 10 is a schematic diagram of switching the target virtual object by a sliding operation, provided based on the embodiment shown in FIG. 8;
FIG. 11 is a flowchart of a method of interaction of virtual objects provided in another exemplary embodiment of the present application;
FIG. 12 is a two-stage flight schematic of a virtual bullet provided based on the embodiment shown in FIG. 11;
FIG. 13 is a schematic diagram of equipment for designating virtual props provided based on the embodiment shown in FIG. 11;
FIG. 14 is a schematic overall flow diagram of virtual object interaction provided in one exemplary embodiment of the present application;
FIG. 15 is a schematic diagram of coordinate position conversion provided by an exemplary embodiment of the present application;
FIG. 16 is a schematic illustration of a virtual object display within a firing range provided by an exemplary embodiment of the present application;
FIG. 17 is a block diagram of an interaction device for virtual objects provided in an exemplary embodiment of the present application;
FIG. 18 is a block diagram of an interaction device for virtual objects provided in accordance with another exemplary embodiment of the present application;
fig. 19 is a block diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
In a gaming application, or in some other application based on virtual scenes, a player can typically control a virtual object to perform a variety of actions in the virtual scene, or control it to interact with other virtual objects there.
Schematically, the player can control the master virtual object to perform dance interactions and virtual attack interactions with virtual objects controlled by other players in the virtual scene; the master virtual object can also be controlled to interact with Non-Player Characters (NPCs) in the virtual scene.
The player can also control the master virtual object to use various virtual props to interact with other virtual objects in the virtual scene, for example, using attack props against hostile virtual objects, or using virtual medical props to heal teammate virtual objects.
Taking attacking a hostile virtual object with a virtual prop as an example, the master virtual object can attack using props such as a virtual sniper rifle, a virtual rifle, a virtual shotgun, or a virtual machine gun. Among close-range props, the virtual shotgun has particularly strong stopping power: unlike other props, it fires multiple bullets in a single shot, and each bullet's damage is calculated independently, so a target can easily be eliminated with the virtual shotgun.
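As a hypothetical illustration of this per-bullet damage rule (the patent does not specify a formula, so the numbers and names below are assumptions), a shotgun shot can be modeled as independent per-pellet resolutions whose damage is summed over the pellets that hit:

```python
import random

def shotgun_damage(num_pellets, per_pellet_damage, hit_probability,
                   rng=random.Random(0)):
    """Resolve each pellet independently; total damage is the sum of hits."""
    hits = sum(1 for _ in range(num_pellets) if rng.random() < hit_probability)
    return hits * per_pellet_damage

# At point-blank range every pellet hits, so the damage is deterministic:
assert shotgun_damage(8, 12, hit_probability=1.0) == 96
```

This independence is what makes the shotgun lethal up close and unpredictable at range, which motivates the target locking described below.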
However, in the related art, because of the shotgun's divergent damage pattern, once there is some distance between the hostile virtual object and the master virtual object, the master virtual object cannot control how the hostile virtual object is hit; multiple attack operations are therefore required, and human-computer interaction efficiency is low.
In the embodiments of the present application, when there are multiple hostile virtual objects within the firing range of the virtual shotgun, one target hostile virtual object among them is highlighted, for example with a fluorescent highlight or an indicator mark, and when the virtual shotgun simultaneously fires multiple virtual bullets, the attack is locked onto that target hostile virtual object. This first makes the master virtual object's attack target explicit, and second raises the success rate of attacks on the target virtual object, improving human-computer interaction efficiency.
Illustratively, as shown in FIG. 1, when the master virtual object 100 holds a virtual shotgun, an aiming range indication element 110 is displayed in the interface; in FIG. 1 the element 110 is shown as a circular range. When hostile virtual objects 120 fall within the aiming range of element 110, the target hostile virtual object 121 among them is highlighted, indicating that object 121 is currently the virtual object locked for attack by the virtual shotgun.
The aiming range indication element 110 is displayed automatically when the master virtual object 100 holds the virtual shotgun; that is, element 110 is displayed continuously while the master virtual object 100 aims with the virtual shotgun, and the highlighted target hostile virtual object 121 is adjusted in real time as the hostile virtual objects 120 shown inside element 110 change.
For example, while the master virtual object 100 holds the virtual shotgun, the aiming range indication element 110 is displayed continuously. At time t1, element 110 contains hostile virtual object a and hostile virtual object b, and hostile virtual object a is highlighted at random; at time t2, when hostile virtual object a moves out of element 110 and hostile virtual object c moves into it, hostile virtual object b is highlighted at random.
Illustratively, as shown in FIG. 2, the aiming range indication element 200 contains a hostile virtual object 211 and a hostile virtual object 212, of which hostile virtual object 211 is the highlighted virtual object; when the master virtual object 220 triggers the virtual shotgun's attack operation, an animation of the plurality of virtual bullets 230 being fired toward hostile virtual object 211 is displayed.
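The real-time re-locking behavior described above, with the highlight moving as hostile objects enter and leave the aiming range, can be sketched as follows; the random choice among in-range objects and all names here are illustrative assumptions, not the claimed implementation:

```python
import random

def relock(current_target, in_range, rng=random.Random(42)):
    """Keep the current lock if that object is still in range;
    otherwise randomly highlight one of the objects now in range."""
    if current_target in in_range:
        return current_target
    return rng.choice(in_range) if in_range else None

# t1: objects a and b are in range; one of them is randomly locked
locked = relock(None, ["a", "b"])
# t2: a leaves, c enters; the lock stays on b if b held it, else re-rolls
locked = relock(locked, ["b", "c"])
assert locked in ("b", "c")
```

Keeping the existing lock when its object remains in range avoids the highlight flickering between candidates every frame.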
The terminals in this application may be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal installs and runs an application program that supports virtual scenes, such as an application supporting a three-dimensional virtual scene. The application may be any one of a virtual reality application, a three-dimensional map application, a third-person shooter (TPS) game, a first-person shooter (FPS) game, or a multiplayer online battle arena (MOBA) game. The application may be a stand-alone application, such as a stand-alone three-dimensional game, or a network-connected application.
Fig. 3 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 300 includes: an operating system 320 and application programs 322.
Operating system 320 is the underlying software that provides applications 322 with secure access to computer hardware.
The application 322 is an application supporting virtual scenes. Alternatively, the application 322 is an application that supports three-dimensional virtual scenes. The application 322 may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, and a MOBA game. The application 322 may be a stand-alone application, such as a stand-alone three-dimensional game, or a network-connected application.
FIG. 4 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 400 includes: a first device 420, a server 440, and a second device 460.
The first device 420 installs and runs an application supporting virtual scenes. The application may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, or a MOBA game. The first device 420 is used by a first user to control a first virtual object located in the virtual scene to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated person character or a cartoon character.
The first device 420 is connected to the server 440 via a wireless network or a wired network.
The server 440 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 440 is used to provide background services for applications supporting three-dimensional virtual scenes. Optionally, the server 440 takes on primary computing work, and the first device 420 and the second device 460 take on secondary computing work; alternatively, the server 440 performs the secondary computing job and the first device 420 and the second device 460 perform the primary computing job; alternatively, the server 440, the first device 420 and the second device 460 may perform collaborative computing using a distributed computing architecture.
The second device 460 installs and runs an application supporting virtual scenes. The application may be any one of a virtual reality application, a three-dimensional map program, an FPS game, a MOBA game, or a multiplayer gunfight survival game. The second device 460 is used by a second user to control a second virtual object located in the virtual scene to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated person character or a cartoon character.
Optionally, the first virtual character and the second virtual character are in the same virtual scene. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Alternatively, they may belong to different teams or different organizations, or be two parties in a hostile relationship.
Alternatively, the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 420 may refer broadly to one of a plurality of devices and the second device 460 may refer broadly to one of a plurality of devices, the present embodiment being illustrated with only the first device 420 and the second device 460. The device types of the first device 420 and the second device 460 are the same or different, and the device types include: at least one of a game console, a desktop computer, a smart phone, a tablet computer, an electronic book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of devices may be larger or smaller; for example, there may be only one device, or there may be tens or hundreds of devices, or more. The embodiments of the present application do not limit the number or type of devices.
It should be noted that the server 440 may be implemented as a physical server or as a cloud server. Cloud technology refers to a hosting technology that unifies resources such as hardware, software, and networks within a wide area network or local area network to realize the computation, storage, processing, and sharing of data. It is the general term for the network, information, integration, management-platform, and application technologies employed in the cloud computing business model, and it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing is becoming an important backbone: the background services of networked technical systems, such as video websites, picture websites, and other portal websites, require large amounts of computing and storage resources. As the internet industry develops further, each item may come to carry its own identification mark that must be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data will need strong back-end system support, which can be realized through cloud computing.
Alternatively, the server 440 described above may also be implemented as a node in a blockchain system.
In some embodiments, the method provided by the embodiments of the present application can be applied to a cloud game scenario, in which the computation of data logic during the game is completed by a cloud server while the terminal is responsible for displaying the game interface.
It should be noted that the information (including but not limited to user device information and user personal information), the data (including but not limited to data for analysis, stored data, and displayed data), and the signals involved in this application are all authorized by the user or fully authorized by all parties, and the collection, use, and processing of the relevant data comply with the relevant laws, regulations, and standards of the relevant countries and regions. For example, the game data involved in this application was all acquired with sufficient authorization.
Referring to FIG. 5, a flowchart of a virtual object interaction method provided by an exemplary embodiment of the present application is shown. The method is illustrated as applied to a terminal and, as shown in FIG. 5, includes:
In step 501, a control operation is received for a first virtual object.
Wherein the first virtual object is in a virtual scene.
The first virtual object is a virtual object controlled by the current terminal, that is, the current terminal can control the first virtual object to move, execute actions, change shapes and the like in the virtual scene.
In addition to the first virtual object, the virtual scene further comprises other virtual objects, wherein the other virtual objects comprise first virtual object hostile virtual objects, or teammate virtual objects comprising the first virtual object, or teammate virtual objects and hostile virtual objects comprising the first virtual object. In some embodiments, the other virtual objects further include a split virtual object of the first virtual object, that is, the split virtual object and the first virtual object correspond to the same object parameters but to different scene positions.
The control operation may be used to control the first virtual object to move in the virtual scene, perform actions, or change shape, and may also control the first virtual object to switch the currently held virtual prop, for example: the control operation is used for controlling the first virtual object to switch the held initial virtual prop to the specified virtual prop.
Step 502, when the first virtual object holds the specified virtual prop, displaying the aiming range indication element of the specified virtual prop.
The specified virtual prop is used for simultaneously firing a plurality of virtual bullets through a plurality of firing channels to interact with other virtual objects, and the aiming range indicating element is used for indicating the firing range of the specified virtual prop when the virtual bullets are fired.
In some embodiments, the aiming range indicating element of the specified virtual prop is automatically and continuously displayed while the first virtual object holds the specified virtual prop; that is, the display of the aiming range indicating element is continuous during the aiming and firing process of the first virtual object holding the specified virtual prop, without needing to be triggered by a firing operation or any other operation. Optionally, in response to the first virtual object switching from the initial virtual prop to the specified virtual prop, the aiming range indicating element of the specified virtual prop is automatically displayed.
In some embodiments, the display of the aiming range indicating element includes at least one of the following manners:
First, in the case that the first virtual object holds the specified virtual prop, a circular range with a preset radius, centered at the center position of the terminal interface, is determined as the firing range, and the aiming range indicating element of the specified virtual prop is displayed at the periphery of the firing range.
Optionally, to simulate the effect that the plurality of virtual bullets scatter and fly out after being fired, a circular range is displayed in the terminal interface; the plurality of virtual bullets fly within the circular range and interact with any virtual object they hit, for example: attacking the hit virtual object, or healing the hit virtual object.
Second, in the case that the first virtual object holds the specified virtual prop, a random graphic range is randomly determined in the terminal interface as a shooting range, and aiming range indication elements of the specified virtual prop are displayed on the periphery of the shooting range.
Optionally, since the plurality of virtual bullets launched by the specified virtual prop diverge during flight after launch, a corresponding random graphic range is determined according to the random divergence path of each of the plurality of virtual bullets, and an aiming range indicating element corresponding to the random graphic range is displayed. In an initial stage after firing, the plurality of virtual bullets start flying within the range of the aiming range indicating element.
Third, the specified virtual prop includes a plurality of firing channels arranged transversely, and a plurality of virtual bullets are fired simultaneously from the plurality of transversely arranged channels. When the first virtual object holds the specified virtual prop, a rectangular range is displayed in the terminal interface as the firing range, and the aiming range indicating element of the specified virtual prop is displayed at the periphery of the firing range.
Optionally, when the firing channels are arranged transversely, the virtual bullets are fired from the channels in a line; due to the influence of virtual gravity in the game, the virtual bullets tend to descend after firing, so a rectangular firing range is formed on a plane. The aiming range indicating element corresponding to the rectangular range is displayed, and the virtual bullets start flying in a line within the rectangular range after firing.
It should be noted that the above display of the aiming-range indicating element is only an illustrative example, and the embodiments of the present application are not limited thereto.
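All three display manners above ultimately reduce to testing whether a position in the interface falls inside the firing range. A minimal sketch in Python, where the interface size, coordinates, and radius are illustrative assumptions rather than values taken from the embodiment:

```python
import math

def in_circular_range(point, center, radius):
    """True if a screen point lies inside the circular firing range."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.hypot(dx, dy) <= radius

def in_rect_range(point, rect):
    """True if a screen point lies inside the rectangular firing range.
    rect = (left, top, right, bottom) in screen coordinates."""
    left, top, right, bottom = rect
    return left <= point[0] <= right and top <= point[1] <= bottom

# A circular range with a hypothetical preset radius of 120, centered
# on the center of an assumed 1920x1080 terminal interface.
center = (960, 540)
print(in_circular_range((1000, 560), center, 120))  # True  (inside)
print(in_circular_range((1200, 540), center, 120))  # False (outside)
```

A random graphic range would use the same kind of membership test, with the boundary derived from the random divergence paths of the bullets instead of a fixed shape.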
Step 503, in response to the firing range indicated by the aiming range indicating element including at least one second virtual object, displaying a target virtual object of the at least one second virtual object in a highlighting mode.
The highlighting mode is used to indicate that the target virtual object is interactively locked by the designated virtual prop.
Optionally, the highlighting mode is used to indicate that all or part of the plurality of virtual bullets issued by the specified virtual prop will lock and interact with the target virtual object.
In some embodiments, the target virtual object may be implemented as one of the at least one second virtual object; or, when the at least one second virtual object is implemented as a plurality of second virtual objects, the target virtual object may also be at least two of the plurality of second virtual objects. The number of target virtual objects is not limited in the embodiments of the present application. In this embodiment, the target virtual object is illustrated as one second virtual object.
Optionally, the implementation of the highlighting mode includes at least one of the following manners:
1. In the case that at least one second virtual object is included in the firing range indicated by the aiming range indicating element, a target virtual object of the at least one second virtual object is outlined with specified display parameters.
Illustratively, the specified display parameters include at least one of: display color, display thickness, display light effect, display line type, and the like. For example: outlining the target virtual object with a red line; outlining the target virtual object with a line of 1 pt thickness; outlining the target virtual object with a flashing light effect; outlining the target virtual object with a dashed line type, and so on.
In some embodiments, when the target virtual object is highlighted, identification points are displayed at a plurality of object parts of the target virtual object, where the display size of an identification point is positively correlated with the interaction probability of the corresponding object part receiving interaction, and the interaction probability is positively correlated with the interaction effect produced when that object part receives interaction.
The interaction effect refers to the degree of influence on a specified attribute value of the target virtual object when a virtual bullet interacts with the object part. For example: the specified attribute value is a life value, and the object parts include a head, limbs, and a torso; when the head is hit by a virtual bullet, 20 points of the life value are lost; when a limb is hit by a virtual bullet, 15 points of the life value are lost; when the torso is hit by a virtual bullet, 10 points of the life value are lost. Accordingly, the probability of the head receiving interaction is greater than that of the limbs, and the probability of the limbs receiving interaction is greater than that of the torso, so that the identification point of the head is larger than that of the limbs, and the identification point of the limbs is larger than that of the torso.
The identification points are used to assist in indicating how each body part receives interaction when it receives interaction, for example: the number of virtual bullets flying toward and contacting the identification point of the head, and so on.
As shown in fig. 6, a virtual scene interface includes a target virtual object 600, a head of the target virtual object 600 is marked with an identification point 610, four limbs of the target virtual object are marked with an identification point 620, and a trunk part of the target virtual object is marked with an identification point 630, an area of the identification point 610 is larger than an area of the identification point 620, and an area of the identification point 620 is larger than an area of the identification point 630. According to the condition that the virtual bullets 640 fly to each identification point and collide with the identification points, the condition that the body parts corresponding to each identification point receive interaction of the virtual bullets 640 can be known. Optionally, the virtual bullet 640 forms a flashing special effect on the identification point when colliding with the identification point, thereby assisting in indicating a collision event.
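The positive correlations described above (identification-point size ∝ interaction probability ∝ interaction effect) can be sketched as follows. The normalization of damage into probabilities and the `base`/`scale` sizing constants are illustrative assumptions, not values given by the embodiment:

```python
# Per-part damage to the specified attribute (life) value, as in the
# example above: head 20, limbs 15, torso 10.
DAMAGE = {"head": 20, "limbs": 15, "torso": 10}

def interaction_probabilities(damage):
    """Interaction probability positively correlated with per-part damage:
    here simply each part's damage normalized over all parts."""
    total = sum(damage.values())
    return {part: d / total for part, d in damage.items()}

def marker_radius(prob, base=10, scale=40):
    """Display size of an identification point grows with its probability."""
    return base + scale * prob

probs = interaction_probabilities(DAMAGE)
sizes = {part: marker_radius(p) for part, p in probs.items()}
# head marker > limbs marker > torso marker, matching fig. 6
```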
2. In the case that at least one second virtual object is included in the firing range indicated by the aiming range indicating element, a target virtual object of the at least one second virtual object is displayed with an overlay in specified display parameters.
Illustratively, the specified display parameters include at least one of: display color, display light effect, display transparency, and the like. For example: overlaying the target virtual object with a semitransparent red color.
In some embodiments, when the target virtual object is displayed with an overlay, the coverage area on the target virtual object is positively correlated with the specified attribute value of the target virtual object, where the interaction of a virtual bullet with the target virtual object affects the specified attribute value of the target virtual object.
Illustratively, the specified attribute value is a life value; if the remaining life value of the target virtual object is 80 (out of a total life value of 100), the height of the coverage area accounts for 80% of the total display height of the target virtual object; if the remaining life value of the target virtual object is 60, the coverage area height accounts for 60% of the total display height of the target virtual object.
As shown in fig. 7, a target virtual object 700 is included in the virtual scene interface, and a display element 710 is overlaid on the current target virtual object 700, and the display element 710 is overlaid with 80% of the height of the target virtual object 700, so as to indicate that the remaining life value of the current target virtual object 700 is 80.
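The mapping from remaining attribute value to overlay height described above is a simple proportion; a sketch, where the 200 px display height is an assumed example value:

```python
def overlay_height(remaining, total, display_height):
    """Height of the overlay element, positively correlated with the
    remaining value of the specified attribute (e.g. the life value)."""
    ratio = max(0.0, min(1.0, remaining / total))  # clamp to [0, 1]
    return ratio * display_height

# Fig. 7 example: 80 of 100 life remaining, on an object drawn 200 px tall
print(overlay_height(80, 100, 200))  # 160.0 -> covers 80% of the height
print(overlay_height(60, 100, 200))  # 120.0 -> covers 60% of the height
```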
3. In response to the firing range indicated by the aiming range indicating element including at least one second virtual object, a lock display element is displayed at the periphery of the target virtual object.
Illustratively, the lock display element may be implemented as an arrow display element above the head of the target virtual object; alternatively, the lock display element may be implemented as a three-dimensional annular display element surrounding the body of the target virtual object; alternatively, the lock display element may be implemented as a light-beam element diverging upward from the ground surface the target virtual object stands on. The display manner of the lock display element is not limited in this embodiment.
It should be noted that the implementation of the highlighting mode is only an illustrative example, and the embodiments of the present application are not limited thereto.
In step 504, in response to receiving the launch control operation for the specified virtual prop, an interaction result of the plurality of virtual bullets launched by the specified virtual prop interacting with the target virtual object is displayed.
Optionally, upon receiving the launch control operation on the specified virtual prop, an animation of the plurality of virtual bullets being launched from the specified virtual prop and flying toward the target virtual object is triggered, and after the virtual bullets contact the target virtual object, the interaction result of the plurality of virtual bullets with the target virtual object is displayed. In this case, since the specified virtual prop locks onto and interacts with the target virtual object, even if the plurality of virtual bullets are emitted divergently during launch, all or part of the virtual bullets converge on the target virtual object to interact with it.
In summary, according to the method provided by the embodiments of the present application, the aiming range indicating element is displayed while the specified virtual prop is held, so that the target virtual object aimed at by the specified virtual prop can be indicated to the player, and after the virtual bullets are launched, the interaction between the virtual bullets and the target virtual object can be effectively followed. This solves the problem that, when a plurality of virtual bullets fly blindly and at random, the interaction outcome is unstable and multiple operations are required to complete the interaction, resulting in low human-computer interaction efficiency; through lock-on interaction, the effectiveness of the interaction between the virtual bullets and the target virtual object is improved.
According to the method provided in this embodiment, the aiming range indicating element of the circular range is displayed and the divergence range of the plurality of virtual bullets is controlled, which improves the player's efficiency in controlling the use of the specified virtual prop and improves human-computer interaction efficiency.
According to the method provided in this embodiment, the aiming range indicating element of the random graphic range is displayed, simulating the random flight directions of the plurality of virtual bullets when fired, which improves the diversity of the picture presentation.
According to the method provided in this embodiment, the target virtual object is outlined with the specified display parameters while the identification points corresponding to the object parts are displayed, where the identification points assist in indicating how each body part receives interaction, improving the amount of information conveyed and the effectiveness of the interface presentation.
According to the method provided in this embodiment, the target virtual object is displayed with an overlay in the specified display parameters, and the coverage area is positively correlated with the specified attribute value of the target virtual object, thereby indirectly indicating the current specified attribute value of the target virtual object and improving the amount of information conveyed and the effectiveness of the interface presentation.
In some embodiments, when there are a plurality of second virtual objects, the target virtual object may be switched. Fig. 8 is a flowchart of a virtual object interaction method according to another exemplary embodiment of the present application. The method is described as being applied to a terminal for illustration. As shown in fig. 8, step 503 above may be further implemented as the following steps 5031 to 5034.
In step 5031, in response to the firing range indicated by the aiming range indicating element including a plurality of second virtual objects, marking elements are displayed at indication positions corresponding to the plurality of second virtual objects.
The marking element is used to indicate a virtual object that is within a firing range of a specified virtual prop.
Optionally, when the virtual object is within the line of sight of the first virtual object and meets the line of sight requirement, the virtual object is determined to be a virtual object within the shooting range of the specified virtual prop.
Optionally, the line-of-sight requirement includes a display area requirement within the aiming-range indicating element, such as: when the virtual object is within the line of sight of the first virtual object and the area of the virtual object displayed within the aiming range indicating element is greater than the area threshold, the virtual object is determined to be the second virtual object. The display area can show the distance between the virtual object and the first virtual object, and the closer the distance is, the larger the total display area of the virtual object is; on the other hand, the body proportion of the virtual object in the aiming range indicating element can be embodied.
Optionally, the line-of-sight requirement includes a display proportion requirement within the aiming range indicating element, for example: when the virtual object is within the line of sight of the first virtual object and the proportion of the displayed area of the virtual object within the aiming range indicating element to its total body area reaches a required proportion, the virtual object is determined to be the second virtual object.
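The display-area requirement and the display-proportion requirement described above can both be sketched as predicates over a candidate object; the parameter names and threshold values here are hypothetical:

```python
def is_second_object(in_line_of_sight, shown_area, total_body_area,
                     area_threshold=None, required_ratio=None):
    """Decide whether a virtual object qualifies as a second virtual
    object: it must be in the first virtual object's line of sight, and
    may additionally need to exceed an absolute displayed-area threshold
    and/or a displayed-area-to-total-body-area ratio."""
    if not in_line_of_sight:
        return False
    if area_threshold is not None and shown_area <= area_threshold:
        return False
    if required_ratio is not None and shown_area / total_body_area < required_ratio:
        return False
    return True

# A nearby object shows 500 px^2 of a 1000 px^2 body inside the element.
print(is_second_object(True, 500, 1000, area_threshold=400))   # True
print(is_second_object(True, 300, 1000, required_ratio=0.5))   # False
```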
Optionally, the display mode of the marking element includes at least one of the following modes:
1. Outlining the plurality of second virtual objects, that is, displaying an outline around the display area of each second virtual object to indicate that the second virtual object is marked by the specified virtual prop;
2. Overlaying display elements on the plurality of second virtual objects, that is, overlaying a semitransparent display element on each second virtual object to indicate that the second virtual object is marked by the specified virtual prop;
3. Displaying a ring identification element at the periphery of the plurality of second virtual objects to indicate that the second virtual objects are marked by the specified virtual prop.
Notably, the marking element and the highlighting mode of the target virtual object are implemented in different display forms.
In step 5032, a target virtual object is determined from the plurality of second virtual objects marked by the marking element, and the target virtual object is displayed in a highlighting mode.
When determining the target virtual object from the plurality of second virtual objects, any one of the following manners may be adopted: randomly determining a target virtual object from the plurality of second virtual objects; or, based on the attribute values corresponding to the second virtual objects, taking the second virtual object with the lowest or highest attribute value among the plurality of second virtual objects as the target virtual object; or selecting the second virtual object closest to or farthest from the first virtual object among the plurality of second virtual objects as the target virtual object; or selecting the leftmost or rightmost second virtual object in the interface among the plurality of second virtual objects as the target virtual object; or determining the target virtual object from the plurality of second virtual objects through an object locking model, where the object locking model is a pre-trained machine learning model. Optionally, the object parameters of the plurality of second virtual objects and the object parameters of the first virtual object are input into the object locking model, which outputs the locked target virtual object among the plurality of second virtual objects. The object parameters include at least one of a position parameter in the virtual scene, an attribute parameter, a level parameter, and an interaction preference parameter. The interaction preference parameter is obtained by analyzing historical interaction data of the virtual object, for example: if, in the historical interaction process, the virtual object was the active initiator in more than 60% of interactions, the interaction preference parameter corresponds to an actively-initiating-interaction parameter.
The selection manner of the target virtual object is not limited in the embodiments of the present application.
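The rule-based selection manners listed above (excluding the machine-learning model) can be sketched as a single dispatch function; the candidate fields `attr`, `dist`, and `x` are hypothetical names for the attribute value, the distance to the first virtual object, and the horizontal interface position:

```python
import random

def select_target(candidates, strategy="nearest"):
    """Pick the locked target virtual object from the marked second
    virtual objects, according to one of the rule-based strategies."""
    if strategy == "random":
        return random.choice(candidates)
    if strategy == "lowest_attr":
        return min(candidates, key=lambda c: c["attr"])
    if strategy == "nearest":
        return min(candidates, key=lambda c: c["dist"])
    if strategy == "leftmost":
        return min(candidates, key=lambda c: c["x"])
    raise ValueError(f"unknown strategy: {strategy}")

enemies = [{"attr": 80, "dist": 12.0, "x": 300},
           {"attr": 35, "dist": 20.0, "x": 900}]
print(select_target(enemies, "lowest_attr")["attr"])  # 35
print(select_target(enemies, "nearest")["dist"])      # 12.0
```

The "highest attribute", "farthest", and "rightmost" variants would simply replace `min` with `max`.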
Optionally, during aiming, when the target virtual object leaves the range indicated by the aiming range indicating element, a target virtual object is re-determined from within the range indicated by the aiming range indicating element, where the manner of re-determining the target virtual object is consistent with the manner of initially determining it.
In step 5033, an object switch operation is received.
The object switching operation is used for indicating switching of the interactively locked target virtual object among the plurality of second virtual objects marked by the marking elements. That is, if the plurality of second virtual objects include a second virtual object A1 and a second virtual object A2, and the second virtual object A1 is currently locked as the target virtual object, the second virtual object A2 can be locked as the target virtual object through an object switching operation.
Optionally, the manner of receiving the object switching operation includes at least one of the following:
first, receiving a motion control operation of a terminal as an object switching operation, wherein the motion control operation is detected by a motion sensor in the terminal;
illustratively, a shake operation on the terminal is received as an object switching operation.
In some embodiments, the application of motion control during program execution is first detected, for example: trigger mode data of each operation is acquired, and whether the trigger mode data of any operation includes an application of motion control data is determined; if no application of motion control data exists, switching of the target virtual object controlled by the motion control operation is enabled.
Illustratively, as shown in fig. 9, the target virtual object 910 is indicated in the aiming range indicating element in the virtual environment interface; after a shake operation on the terminal 900 is received, the target virtual object 920 is indicated in the aiming range indicating element, and the original target virtual object 910 is treated as a second virtual object again.
Optionally, when the target virtual object is switched through the motion control operation, any second virtual object is randomly taken as the switched target virtual object; or, according to a preset arrangement order, the second virtual object next after the current target virtual object is taken as the switched target virtual object.
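The preset-order variant above is a round-robin cycle over the marked second virtual objects; a minimal sketch (the list contents are hypothetical):

```python
def switch_target(second_objects, current_index):
    """Take the next second virtual object in the preset order as the
    post-switch target, wrapping around at the end of the list."""
    return (current_index + 1) % len(second_objects)

objs = ["A1", "A2", "A3"]
i = 0                        # A1 currently locked as the target
i = switch_target(objs, i)   # shake / tap / swipe received
print(objs[i])               # A2 is now the locked target
i = switch_target(objs, i)
i = switch_target(objs, i)
print(objs[i])               # wraps back around to A1
```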
Secondly, receiving a knocking operation on a terminal display screen as an object switching operation;
Optionally, the terminal display screen can detect a tapping event, where the tapping operation may be implemented as a single-tap operation or a multi-tap operation, for example: a double-tap operation.
The tapping operation may be performed by the player tapping the terminal display screen with a knuckle, or by tapping the terminal display screen with a dedicated tapping prop; this is not limited in this embodiment.
Thirdly, receiving a sliding operation on the terminal display screen as the object switching operation, where the sliding direction of the sliding operation indicates the direction in which the target virtual object is switched;
Illustratively, when a rightward sliding operation on the terminal display screen is received, a second virtual object to the right of the current target virtual object is found and switched to. As shown in fig. 10, the target virtual object 1010 is indicated in the aiming range indicating element in the virtual environment interface; after the rightward sliding operation on the terminal display screen is received, the target virtual object 1020 is indicated in the aiming range indicating element, and the original target virtual object 1010 is treated as a second virtual object again.
Fourth, a view angle adjustment operation for adjusting a view angle of the first virtual object to a target virtual object out of a range of the aiming range indicating element is received as an object switching operation.
That is, the perspective of the first virtual object is adjusted so that the target virtual object is out of the range of the aiming-range indicating element, and the target virtual object is redetermined within the range of the aiming-range indicating element.
Step 5034, based on the object switching operation, canceling display of the pre-switch target virtual object in the highlighting mode, and displaying the post-switch target virtual object in the highlighting mode.
That is, after the object switching operation is completed, the switched target virtual object is determined, the switched target virtual object is highlighted, and the highlighting of the pre-switch target virtual object is canceled.
In summary, according to the method provided by the embodiments of the present application, the aiming range indicating element is displayed while the specified virtual prop is held, so that the target virtual object aimed at by the specified virtual prop can be indicated to the player, and after the virtual bullets are launched, the interaction between the virtual bullets and the target virtual object can be effectively followed. This solves the problem that, when a plurality of virtual bullets fly blindly and at random, the interaction outcome is unstable and multiple operations are required to complete the interaction, resulting in low human-computer interaction efficiency; through lock-on interaction, the effectiveness of the interaction between the virtual bullets and the target virtual object is improved.
According to the method provided in this embodiment, the target virtual object is switched among the plurality of second virtual objects through the object switching operation, which avoids the cumbersome process in which an automatically selected target virtual object fails to meet the interaction requirement of the first virtual object and the interaction must be performed multiple times, improving human-computer interaction efficiency.
According to the method provided in this embodiment, the object switching operation is implemented through a motion control operation on the terminal, a tapping operation on the terminal display screen, or a sliding operation on the terminal display screen, so that switching of the target virtual object can be achieved simply and conveniently through operations such as shaking the mobile phone, improving human-computer interaction efficiency.
In some embodiments, the plurality of virtual bullets randomly select different body parts to interact with. Fig. 11 is a flowchart of a virtual object interaction method according to another exemplary embodiment of the present application. The method is described as being applied to a terminal for illustration. As shown in fig. 11, the method includes:
step 1101, receiving a control operation for a first virtual object.
Wherein the first virtual object is in a virtual scene.
In step 1102, in the case that the first virtual object holds the specified virtual prop, an aiming range indication element of the specified virtual prop is displayed.
The specified virtual prop is used for simultaneously firing a plurality of virtual bullets through a plurality of firing channels to interact with other virtual objects, and the aiming range indicating element is used for indicating the firing range of the specified virtual prop when the virtual bullets are fired.
In some embodiments, the aiming range indicating element of the specified virtual prop is automatically and continuously displayed while the first virtual object holds the specified virtual prop; that is, the display of the aiming range indicating element is continuous during the aiming and firing process of the first virtual object holding the specified virtual prop, without needing to be triggered by a firing operation or any other operation.
Optionally, in response to the first virtual object switching from the initial virtual prop to the specified virtual prop, an aiming range indication element of the specified virtual prop is automatically displayed.
In some embodiments, the specified virtual prop is implemented as a specified attack prop, such as: virtual shotgun, virtual bow, etc., attack other virtual objects by firing virtual bullets, thereby reducing the life values of other virtual objects. In other embodiments, the specified virtual prop is implemented as a specified medical prop, such as: the virtual medical gun treats other virtual objects by shooting virtual bullets, thereby improving the life values of the other virtual objects.
Step 1103, in response to the firing range indicated by the aiming range indicating element including at least one second virtual object, displaying a target virtual object of the at least one second virtual object in a highlighting mode.
The highlighting mode is used to indicate that the target virtual object is interactively locked by the designated virtual prop.
Optionally, the highlighting mode is used to indicate that all or part of the plurality of virtual bullets issued by the specified virtual prop will lock and interact with the target virtual object.
In step 1104, in response to receiving the launch control operation for the specified virtual prop, an interaction result of the plurality of virtual bullets launched by the specified virtual prop scattering, flying toward the target virtual object, and interacting with respective object parts of the target virtual object is displayed.
Wherein the target virtual object includes a plurality of object parts.
Optionally, interaction probabilities respectively corresponding to the object parts are acquired, where each interaction probability is a preset probability corresponding to an object part; an interaction distribution between the plurality of virtual bullets and the plurality of object parts is determined based on the interaction probabilities, and the interaction result of the plurality of virtual bullets scattering and flying to the plurality of object parts is displayed based on the interaction distribution.
In some embodiments, the interaction probability is related to the interaction influence of the object part. Schematically, the target virtual object includes a specified attribute value, and the interaction probability is positively correlated with the influence on the attribute value produced when the object part is involved in interaction. For example: the specified attribute value is a life value, and the object parts include a head, limbs, and a torso; when the head is hit by a virtual bullet, 20 points of the life value are lost; when a limb is hit by a virtual bullet, 15 points of the life value are lost; when the torso is hit by a virtual bullet, 10 points of the life value are lost. Accordingly, the probability of the head receiving interaction is greater than that of the limbs, and the probability of the limbs receiving interaction is greater than that of the torso.
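Determining the interaction distribution from preset per-part probabilities amounts to a weighted random assignment of each bullet to a part. A sketch, using the damage-derived probabilities from the example above and a fixed seed (an assumption, for reproducibility only):

```python
import random

def distribute_bullets(n_bullets, probabilities, rng=None):
    """Assign each of n_bullets to an object part, where each part's
    chance of being hit is its preset interaction probability."""
    rng = rng or random.Random(0)   # seeded for a reproducible sketch
    parts = list(probabilities)
    weights = [probabilities[p] for p in parts]
    hits = {p: 0 for p in parts}
    for _ in range(n_bullets):
        hits[rng.choices(parts, weights=weights)[0]] += 1
    return hits

# probabilities derived from the damage example: head > limbs > torso
probs = {"head": 20 / 45, "limbs": 15 / 45, "torso": 10 / 45}
hits = distribute_bullets(8, probs)
print(sum(hits.values()))  # 8: every fired bullet lands on some part
```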
Optionally, all or part of the plurality of virtual bullets fly toward the target virtual object to interact with its body parts. When only part of the virtual bullets fly toward the target virtual object, that part may be determined randomly, or may be determined according to a positional relationship.
Illustratively, in response to receiving a launch control operation for the specified virtual prop, a projected area of the target virtual object projected into the firing range is determined. Optionally, a plane perpendicular to the emission direction of the designated virtual prop is taken as the projection plane, and the target virtual object is projected onto the projection plane, thereby obtaining the projected area of the projection result region.
A second number of virtual bullets to interact with the target virtual object is determined based on the first number of virtual bullets fired by the specified virtual prop and the projected area. Optionally, a reference area corresponding to the first number is obtained, a proportion of the projection area to the reference area is determined, and a second number of virtual bullets is determined from the first number according to the proportion, wherein the second number is the number of virtual bullets in the projection result area in the projection plane.
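A hedged sketch of this proportion-based split; the clamping of the ratio to 1 and the rounding rule are assumptions, not specified by the embodiment:

```python
def locked_bullet_count(first_number, projected_area, reference_area):
    """Of the first number of bullets fired by the specified virtual
    prop, determine the second number that falls inside the target's
    projection result area and therefore follows the locked target."""
    # proportion of the projected area to the reference area,
    # clamped so a very large target cannot attract more bullets
    # than were fired (an assumption for robustness)
    ratio = min(projected_area / reference_area, 1.0)
    return round(first_number * ratio)
```

The remaining `first_number - locked_bullet_count(...)` bullets continue along the original firing path, as described below.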
An interaction result of the second number of virtual bullets following and flying to the target virtual object and interacting with it is then displayed. That is, the virtual bullets in the projection result area follow the locked target virtual object and fly to it for interaction.
And displaying the flight results of the other virtual bullets continuously flying along the shooting path, wherein if the other virtual bullets have third virtual objects on the flight path continuously flying, displaying the interaction results of the third virtual objects and the other virtual bullets.
That is, the second number of virtual bullets follows the target virtual object to interact, and the other virtual bullets continue to fly according to the whole flight path and interact with the third virtual object according to the blocking condition of the third virtual object on the flight path.
In some embodiments, in response to the target virtual object having a position change in the virtual scene, a second number of virtual bullets are displayed to follow the target virtual object for the position change and to interact with the interaction result of the target virtual object.
Optionally, the flight of the virtual bullets includes a two-stage flight process. That is, in response to receiving the firing control operation for the specified virtual prop, a first flight phase of the plurality of virtual bullets fired by the specified virtual prop in a first flight direction is displayed, the first flight direction being the aiming direction of the specified virtual prop. In response to completion of the first flight phase, a second flight phase of the plurality of virtual bullets in a second flight direction is displayed, the second flight direction being the direction pointing to the target virtual object; in response to completion of the second flight phase, the interaction result of the plurality of virtual bullets with the target virtual object is displayed.
Illustratively, as shown in fig. 12, after the first virtual object controls the designated virtual prop to launch the virtual bullet, the virtual bullet 1200 flies first in the launching direction, and after the flight reaches the required distance or the required duration, the virtual bullet 1200 flies toward the position where the target virtual object 1210 is located until interaction with the target virtual object 1210 occurs.
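The two-stage flight of fig. 12 can be sketched per tick as follows; the vector representation, the distance threshold for switching phases, and the per-tick speed are illustrative assumptions:

```python
import math

def step_bullet(pos, aim_dir, target_pos, traveled, phase1_distance, speed):
    """Advance a virtual bullet by one tick of the two-stage flight:
    the first flight phase follows the aiming direction; once the
    required distance is covered, the second phase points at the
    target's current position (re-read each tick, so a target that
    changes position is still followed)."""
    if traveled < phase1_distance:
        direction = list(aim_dir)                  # first flight direction
    else:
        delta = [t - p for t, p in zip(target_pos, pos)]
        norm = math.sqrt(sum(d * d for d in delta)) or 1.0
        direction = [d / norm for d in delta]      # second flight direction
    new_pos = [p + d * speed for p, d in zip(pos, direction)]
    return new_pos, traveled + speed
```

Calling this once per frame with the target's latest position reproduces the follow-the-target behavior described for position changes in the virtual scene.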
In some embodiments, the specified virtual prop corresponds to an upper holding time limit, the specified virtual prop being a virtual prop switched to from the initial virtual prop. In response to the holding time of the first virtual object on the specified virtual prop reaching the upper holding time limit, the specified virtual prop is automatically switched back to the initial virtual prop, the initial virtual prop being used for single-firing a single virtual bullet through a single firing channel to generate interaction with other virtual objects; and an aiming indication element of the initial virtual prop is displayed, the aiming indication element being used for indicating the firing target position of the initial virtual prop when a virtual bullet is fired.
In some embodiments, the initial virtual prop is a virtual prop that the first virtual object continuously owns in the virtual scene, and the specified virtual prop is a virtual prop that the first virtual object temporarily owns in the virtual scene. Optionally, the initial virtual prop occupies a prop reserve position of the first virtual object, while the designated virtual prop does not. Illustratively, as shown in FIG. 13, the designated virtual prop is implemented as a temporary virtual shotgun. The first virtual object includes a firearm rail 1310 and a firearm rail 1320, both of which are equipped with a virtual firearm; the first virtual object also owns a virtual shotgun 1330, which does not occupy a firearm rail. Upon receiving a selection operation on the virtual shotgun 1330, the first virtual object is controlled to hold the virtual shotgun 1330 for attack. The virtual shotgun 1330 further has an upper holding time limit: when the time for which the virtual shotgun 1330 is held, or the time since it was acquired, reaches the upper holding time limit, the firearm is automatically switched back to the firearm in firearm rail 1310 or firearm rail 1320. Alternatively, the virtual shotgun 1330 may have an unlimited number of bullets.
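The upper holding time limit and the automatic switch-back can be sketched as follows; the class and method names, and the injectable clock, are illustrative assumptions rather than details from the embodiment:

```python
class PropHolder:
    """Illustrative sketch of the upper holding time limit: once the
    specified virtual prop has been held for the limit duration, the
    holder automatically reverts to the initial virtual prop."""

    def __init__(self, initial_prop, clock):
        self.initial_prop = initial_prop
        self.current = initial_prop
        self._clock = clock          # injectable time source, seconds
        self._acquired_at = None
        self._limit = None

    def equip_special(self, prop, hold_limit):
        """Switch to the temporary specified virtual prop."""
        self.current = prop
        self._acquired_at = self._clock()
        self._limit = hold_limit

    def tick(self):
        """Called once per frame; switches back to the initial prop
        when the holding time reaches the upper limit."""
        if (self._limit is not None
                and self._clock() - self._acquired_at >= self._limit):
            self.current = self.initial_prop
            self._acquired_at = None
            self._limit = None
```

Injecting the clock keeps the sketch testable; a game loop would pass its frame-time source instead.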
In some embodiments, the specified virtual prop is obtained in a different manner from other virtual props of the same type: the player searches for a chip skill vending machine in the virtual scene and then purchases the corresponding skill from the vending machine with virtual resources.
In summary, in the method provided by the embodiments of the present application, the aiming range indication element is displayed while the specified virtual prop is held, so that the target virtual object aimed at by the specified virtual prop can be indicated to the player. After the virtual bullets are launched, they effectively follow the target virtual object, which avoids the problem that, when a plurality of virtual bullets fly blindly and randomly, the interaction outcome is unstable and multiple operations are required to complete an interaction, resulting in low human-computer interaction efficiency. Locking interaction thus improves the effectiveness of the interaction between the virtual bullets and the target virtual object.
In the method provided by this embodiment, part of the virtual bullets are determined, according to the projected area, to interact with the target virtual object, while the other virtual bullets continue to fly along the original path. On the basis of ensuring interaction with the target virtual object, the other virtual bullets can thus interact with other virtual objects, improving both interaction efficiency and interaction effectiveness.
According to the method provided by the embodiment, two-stage flight is provided in the flight process of the virtual bullet, the normal flight path of the virtual bullet after being launched is simulated in the first flight stage, the interaction process of the virtual bullet to the target virtual object is reflected in the second flight stage, and the interface display diversity is improved.
In the method provided by this embodiment, an upper holding time limit is set for the specified virtual prop, which avoids the problem that a player holds the specified virtual prop, with its strong interaction performance, for a long time and thereby creates an interaction-capability gap with other players.
FIG. 14 is a schematic overall flow chart of virtual object interaction according to an exemplary embodiment of the present application, as shown in FIG. 14, the method includes the following steps:
in step 1401, the player equips a virtual shotgun.
The virtual shotgun is a virtual prop that simultaneously launches a plurality of virtual bullets through a plurality of firing channels, wherein the plurality of virtual bullets are used for attacking other virtual objects, thereby affecting the life values of the other virtual objects.
In some embodiments, the virtual shotgun is a virtual prop that the player picks up directly in the virtual scene; or the virtual shotgun is a virtual prop purchased by a player through virtual resources in a virtual scene; alternatively, the virtual shotgun is a virtual prop that the player obtains outside of the game and equips inside the game, which is not limited in this embodiment.
Step 1402, determining whether there is a target in the range.
The range refers to the attack range of the virtual shotgun, and an indicating element is displayed in the terminal interface, wherein the indicating element is used for indicating the shooting range of the virtual shotgun.
Illustratively, a circular area of a predetermined size is displayed at the center of the terminal interface, which is the area corresponding to the firing range of the virtual shotgun.
It is determined whether there is a target within the firing range of the virtual shotgun, i.e., whether there are other virtual objects within the firing range of the virtual shotgun that can be attacked.
In some embodiments, the world coordinate position of a target within the shooting range in the virtual scene is acquired and converted to a screen coordinate position. In the coordinate conversion process, as shown in fig. 15, the aiming direction 1510 of the virtual shotgun is first obtained and a reference straight line 1520 perpendicular to the aiming direction is drawn; a connecting line 1530 is drawn between the target and the current main control virtual object, and a mapping line 1540 is drawn from the target perpendicular to the reference straight line, yielding a right triangle formed by the reference straight line 1520, the connecting line 1530 and the mapping line 1540. The intersection point at the right angle is the point at which the target is mapped onto the screen, and it is then judged whether this point is within the shooting range. If the point is within the firing range, as shown in FIG. 16, the virtual object 1600 is shown within the circular firing range 1610.
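The mapping of fig. 15 can be sketched in 2D: the foot of the perpendicular from the target to the reference line gives the target's lateral screen offset, which is the projection of the player-to-target vector onto the reference line. The function names and the 2D simplification are assumptions:

```python
import math

def screen_offset(player_pos, aim_dir, target_pos):
    """Lateral offset of the target's mapped screen point: project the
    player-to-target vector onto the unit reference line perpendicular
    to the aiming direction (the right-angle construction of fig. 15)."""
    ax, ay = aim_dir
    norm = math.hypot(ax, ay) or 1.0
    ref = (-ay / norm, ax / norm)                # unit reference line
    to_target = (target_pos[0] - player_pos[0],
                 target_pos[1] - player_pos[1])  # connecting line
    return to_target[0] * ref[0] + to_target[1] * ref[1]

def in_firing_range(player_pos, aim_dir, target_pos, range_radius):
    """The target is inside the circular firing range when its mapped
    point lies within the range radius of the screen center."""
    return abs(screen_offset(player_pos, aim_dir, target_pos)) <= range_radius
```

A full 3D implementation would instead use the camera's view-projection transform, but the perpendicular-foot construction above matches the geometry the paragraph describes.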
If there is a target, step 1403, the targets within the range are marked.
If the target object exists in the shooting range, a marking element is displayed for the target object in the shooting range, and the marking element indicates that the target object is in the shooting range.
In step 1404, it is determined whether the targets are multiple within the range.
One or more targets may exist in the shooting range, and the distance between each target and the current main control virtual object may be the same or different; within the judging range, a farther target is displayed at a smaller size.
In step 1405, if there are multiple targets, the target closest to the player is selected and a red highlighting element is overlaid.
Optionally, the target closest to the player is taken as the locked target, and a translucent red highlighting element is overlaid on the locked target to indicate that it is locked for attack by the virtual shotgun. The red highlighting element may be visible only to the current main control virtual object; alternatively, it may be visible to the players corresponding to the respective virtual objects, that is, the red highlighting element is displayed within the field of view of each virtual object that can see the locked target.
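Step 1405 reduces to a nearest-target selection among the marked targets; the dictionary shape of a target record is an illustrative assumption:

```python
import math

def pick_locked_target(player_pos, marked_targets):
    """Among the targets marked as in range (step 1403), lock the one
    closest to the player; the red highlighting element would then be
    overlaid on the returned target."""
    if not marked_targets:
        return None
    return min(marked_targets,
               key=lambda t: math.dist(player_pos, t["position"]))
```

`math.dist` (Python 3.8+) computes the Euclidean distance between the player and each candidate.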
In step 1406, a determination is made as to whether to trigger a transmission.
That is, it is determined whether the player has triggered the firing control to fire the virtual bullets in the virtual shotgun.
Step 1407, if firing is triggered, firing multiple virtual bullets to fly in the first flight phase towards the firing direction.
When the virtual shotgun is triggered to fire, the process of flying the multiple virtual bullets along the firing direction is displayed first, that is, the effect of simulating the normal firing of the virtual bullets is taken as the first flight phase.
Step 1408, a determination is made as to whether the first flight phase is over.
Step 1409, if so, the virtual bullet is flown toward the locked target for a second flight phase.
When the first flight phase is over, the second flight phase in which the virtual bullets fly toward the locking target is displayed because of the need to have multiple virtual bullets interact with the locking target.
Alternatively, the virtual bullets may directly target the whole body, or may target one or several body parts. Optionally, when a plurality of virtual bullets lock body parts of the target, a plurality of bone nodes are mounted on the target's model body, the bone nodes corresponding to a plurality of parts of the target body. In the program, the corresponding position can be found by a bone node's name; only the nodes that need to be followed are stored in a list. When locking, each virtual bullet randomly selects a part name and then finds the target position through that name. Because the nodes are mounted on the locked target, their world coordinate positions change as the locked target moves, so the target can be hit as long as the bullets fly following their parts.
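A minimal sketch of the followed-node list and random part selection described above; the container shapes, 2D positions, and method names are illustrative assumptions:

```python
import random

class LockedTarget:
    """Sketch of bone-node following: only the nodes that need to be
    followed are kept, each bullet randomly selects a part name when
    locking, and positions are looked up by name every frame so they
    track the moving model."""

    def __init__(self, bone_nodes):
        # part name -> world coordinate position of the mounted node
        self.bone_nodes = dict(bone_nodes)

    def random_part(self, rng=None):
        """Each virtual bullet picks one part name at lock time."""
        rng = rng or random.Random()
        return rng.choice(sorted(self.bone_nodes))

    def position_of(self, part_name):
        # re-read each frame: the node moves with the locked target
        return self.bone_nodes[part_name]

    def move(self, offsets):
        """Simulate the locked target moving; node world positions
        follow, so bullets tracking a part name still hit."""
        for name, (dx, dy) in offsets.items():
            x, y = self.bone_nodes[name]
            self.bone_nodes[name] = (x + dx, y + dy)
```

In an engine these positions would come from the skeletal hierarchy rather than a plain dictionary, but the lookup-by-name flow is the same.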
Step 1410, determine if the target is hit.
In some embodiments, when the locked target suddenly dodges behind a virtual obstruction, such as a virtual wall or a virtual pile of stones, the virtual bullets may hit the virtual obstruction, and the locked target is then not attacked by them.
If there is a hit, step 1411, the injury is calculated.
In summary, in the method provided by the embodiments of the present application, the aiming range indication element is displayed while the specified virtual prop is held, so that the target virtual object aimed at by the specified virtual prop can be indicated to the player. After the virtual bullets are launched, they effectively follow the target virtual object, which avoids the problem that, when a plurality of virtual bullets fly blindly and randomly, the interaction outcome is unstable and multiple operations are required to complete an interaction, resulting in low human-computer interaction efficiency. Locking interaction thus improves the effectiveness of the interaction between the virtual bullets and the target virtual object.
Fig. 17 is a block diagram of an interaction device for virtual objects according to an exemplary embodiment of the present application, and as shown in fig. 17, the device includes:
a receiving module 1710, configured to receive a control operation on a first virtual object, where the first virtual object is in a virtual scene;
A display module 1720, configured to display, when the first virtual object holds a specified virtual prop, a targeting range indication element of the specified virtual prop, where the specified virtual prop is used to simultaneously launch a plurality of virtual bullets with a plurality of firing channels to generate interactions with other virtual objects, and the targeting range indication element is used to indicate a shooting range of the specified virtual prop when the virtual bullets are launched;
the display module 1720 is further configured to display a target virtual object of the at least one second virtual object in a highlighting mode in response to the firing range indicated by the targeting range indicating element including the at least one second virtual object, the highlighting mode being configured to indicate that the target virtual object is interactively locked by the specified virtual prop;
the display module 1720 is further configured to display an interaction result of the interaction of the plurality of virtual bullets launched by the specified virtual prop with the target virtual object in response to receiving the launch control operation on the specified virtual prop.
In an optional embodiment, the display module 1720 is further configured to, in response to the shooting range indicated by the aiming range indication element including a plurality of second virtual objects, display a marking element at a designated location corresponding to the plurality of second virtual objects, where the marking element is configured to indicate a virtual object within the shooting range of the designated virtual prop;
The display module 1720 is further configured to determine the target virtual object from the plurality of second virtual objects marked by the marking element, and display the target virtual object in the highlighting mode.
In an optional embodiment, the receiving module 1710 is further configured to receive an object switching operation, where the object switching operation is used to instruct switching the target virtual object that is interactively locked among the plurality of second virtual objects marked by the marking element;
the display module 1720 is further configured to cancel displaying the target virtual object before the switching in the highlighting mode and display the target virtual object after the switching in the highlighting mode based on the object switching operation.
In an alternative embodiment, the receiving module 1710 is further configured to receive a motion control operation on a terminal as the object switching operation, where the motion control operation is detected by a motion sensor in the terminal; or,
the receiving module 1710 is further configured to receive a tapping operation on a display screen of the terminal as the object switching operation; or,
the receiving module 1710 is further configured to receive a sliding operation on a terminal display screen as the object switching operation, where a sliding direction of the sliding operation is used to indicate a switching direction of switching the target virtual object.
In an alternative embodiment, the display module 1720 is further configured to display a first flight phase of the plurality of virtual bullets launched by the specified virtual prop in a first flight direction in response to receiving a launch control operation for the specified virtual prop, the first flight direction being a targeting direction of the specified virtual prop;
the display module 1720 is further configured to display a second flight phase of the plurality of virtual bullets in a second flight direction in response to the first flight phase being completed, the second flight direction being a direction pointing toward the target virtual object;
the display module 1720 is further configured to display an interaction result of the plurality of virtual bullets with the target virtual object in response to the second flight phase being completed.
In an alternative embodiment, the target virtual object includes a plurality of object parts;
the display module 1720 is further configured to display an interaction result of the plurality of virtual bullets emitted by the specified virtual prop being emitted to the target virtual object and interacting with each object portion of the target virtual object in response to receiving the emission control operation on the specified virtual prop.
In an alternative embodiment, as shown in fig. 18, the apparatus further includes:
the obtaining module 1730 is configured to obtain interaction probabilities corresponding to the multiple object locations, where the interaction probabilities are preset probabilities corresponding to the object locations;
a determining module 1740, configured to determine an interaction distribution situation between the plurality of virtual bullets and the plurality of object parts based on the interaction probability;
the display module 1720 is further configured to display an interaction result of the plurality of virtual bullets that diverges and flies to the plurality of object locations based on the interaction distribution.
In an alternative embodiment, the apparatus further comprises:
the determining module 1740 is configured to determine, when the first virtual object holds the specified virtual prop, a circular range corresponding to a preset radius as the shooting range with a center position of the terminal interface as a center of a circle;
the display module 1720 is further configured to display an aiming range indicator element of the specified virtual prop around the shooting range.
In an alternative embodiment, the apparatus further comprises:
a determining module 1740 for determining a projected area of the target virtual object projected to the firing range in response to receiving a launch control operation for the specified virtual prop; determining a second number of virtual bullets to interact with the target virtual object based on the first number of virtual bullets launched by the specified virtual prop and the projected area;
The display module 1720 is further configured to display an interaction result of the second number of virtual bullets following and flying to the target virtual object and interacting with the target virtual object;
the display module 1720 is further configured to display a flight result of the other virtual bullets continuing to fly along the firing path, where if the other virtual bullets have a third virtual object on the flight path continuing to fly, an interaction result of the third virtual object and the other virtual bullets is displayed.
In an alternative embodiment, the display module 1720 is further configured to display, in response to the target virtual object having a position change in the virtual scene, an interaction result of the second number of virtual bullets following the target virtual object for the position change and interacting with the target virtual object.
In an alternative embodiment, the display module 1720 is further configured to automatically display an aiming range indicator element for the specified virtual prop in response to the first virtual object switching from the initial virtual prop to the specified virtual prop.
In an optional embodiment, the specified virtual prop corresponds to an upper limit of the holding time period;
the display module 1720 is further configured to automatically switch from the specified virtual prop to the initial virtual prop in response to the holding time of the first virtual object for the specified virtual prop reaching the upper limit of the holding time, where the initial virtual prop is configured to generate an interaction with other virtual objects by single firing of a single virtual bullet in a single firing channel; and displaying an aiming indication element of the initial virtual prop, wherein the aiming indication element is used for indicating the shooting target position of the initial virtual prop when the virtual bullet is shot.
In an alternative embodiment, the display module 1720 is further configured to, in response to the firing range indicated by the targeting range indicating element including at least one second virtual object, display a target virtual object of the at least one second virtual object with specified display parameters;
the display module 1720 is further configured to display an identification point on the plurality of object locations of the target virtual object, where a display size of the identification point and an interaction probability of the object location receiving interaction are in a positive correlation, and the interaction probability and an interaction effect of the object location receiving interaction are in a positive correlation.
In an alternative embodiment, the display module 1720 is further configured to, in response to the firing range indicated by the targeting range indicating element including at least one second virtual object, overlay display a target virtual object of the second virtual objects with specified display parameters;
and the interaction of the virtual bullet and the target virtual object is used for influencing the appointed attribute value of the target virtual object.
To sum up, the device provided by the embodiments of the present application displays the aiming range indication element while the specified virtual prop is held, so that the target virtual object aimed at by the specified virtual prop can be indicated to the player. After the virtual bullets are launched, they effectively follow the target virtual object, which avoids the problem that, when a plurality of virtual bullets fly blindly and randomly, the interaction outcome is unstable and multiple operations are required to complete an interaction, resulting in low human-computer interaction efficiency. Locking interaction thus improves the effectiveness of the interaction between the plurality of virtual bullets and the target virtual object.
It should be noted that: in the interaction device for virtual objects provided in the above embodiment, only the division of the above functional modules is used as an example, and in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to perform all or part of the functions described above. In addition, the interaction device of the virtual object and the interaction method embodiment of the virtual object provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the interaction device of the virtual object are detailed in the method embodiment, which is not described herein again.
Fig. 19 shows a block diagram of a computer device 1900 according to an exemplary embodiment of the present application. The computer device 1900 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1900 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, or the like.
Generally, the computer device 1900 includes: a processor 1901 and a memory 1902.
The processor 1901 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1901 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1901 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), is a processor for processing data in the awake state; the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1901 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering the content that the display screen needs to display. In some embodiments, the processor 1901 may also include an AI processor for processing computing operations related to machine learning.
Memory 1902 may include one or more computer-readable storage media, which may be non-transitory. Memory 1902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1902 is configured to store at least one instruction for execution by processor 1901 to implement the method of interaction of virtual objects provided by the method embodiments herein.
In some embodiments, the computer device 1900 also includes other components. Those skilled in the art will appreciate that the structure illustrated in FIG. 19 is not limiting of the computer device 1900, which may include more or fewer components than shown, combine certain components, or employ a different arrangement of components.
Alternatively, the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disk, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM), among others. The serial numbers of the foregoing embodiments of the present application are merely for description and do not represent the superiority or inferiority of the embodiments.
The embodiment of the application also provides a computer device, which comprises a processor and a memory, wherein at least one instruction, at least one section of program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one section of program, the code set or the instruction set is loaded and executed by the processor to realize the interaction method of the virtual object in any one of the embodiments of the application.
The embodiment of the application further provides a computer readable storage medium, where at least one instruction, at least one section of program, a code set, or an instruction set is stored, where the at least one instruction, the at least one section of program, the code set, or the instruction set is loaded and executed by a processor to implement the interaction method of the virtual object according to any one of the embodiments of the application.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the interaction method of the virtual object according to any of the above embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing descriptions are merely preferred embodiments of the present application and are not intended to limit the present application; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.
Claims (18)
1. A method of interaction of virtual objects, the method comprising:
receiving a control operation of a first virtual object, wherein the first virtual object is in a virtual scene;
displaying an aiming range indicating element of a designated virtual prop under the condition that the first virtual object holds the designated virtual prop, wherein the designated virtual prop is used for simultaneously launching a plurality of virtual bullets through a plurality of firing channels to generate interaction with other virtual objects, and the aiming range indicating element is used for indicating the shooting range of the designated virtual prop when the virtual bullets are launched;
In response to the firing range indicated by the targeting range indicating element including at least one second virtual object, displaying a target virtual object of the at least one second virtual object in a highlighting mode, the highlighting mode being used to indicate that the target virtual object is interactively locked by the designated virtual prop;
and responding to receiving the emission control operation of the appointed virtual prop, and displaying the interaction result of the interaction of the plurality of virtual bullets emitted by the appointed virtual prop and the target virtual object.
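Claim 1 leaves open how the interaction-locked target is chosen from the second virtual objects in the firing range. A minimal sketch of one plausible selection rule (nearest candidate to the range center; the `pos` field and dictionary shape are assumptions for illustration):

```python
import math

def lock_target(range_center, candidates):
    """Pick which second virtual object to interaction-lock.

    The claim does not fix a selection rule; as one plausible
    default, lock the candidate nearest the firing-range center.
    """
    if not candidates:
        return None
    return min(candidates,
               key=lambda obj: math.dist(range_center, obj["pos"]))
```

For example, with two candidates at distances 0.5 and 5.0 from the range center, the nearer one would be locked and shown in the highlighting mode.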
2. The method of claim 1, wherein the displaying the target virtual object of the at least one second virtual object in a highlighting mode in response to the firing range indicated by the aiming range indicating element including at least one second virtual object comprises:
in response to the firing range indicated by the aiming range indicating element including a plurality of second virtual objects, displaying a marking element at a designated position corresponding to each of the plurality of second virtual objects, the marking element being used to indicate a virtual object within the firing range of the designated virtual prop;
and determining the target virtual object from the plurality of second virtual objects marked by the marking element, and displaying the target virtual object in the highlighting mode.
3. The method of claim 2, wherein the method further comprises:
receiving an object switching operation, the object switching operation being used to indicate switching of the interaction-locked target virtual object among the plurality of second virtual objects marked by the marking element;
and based on the object switching operation, canceling display of the pre-switch target virtual object in the highlighting mode and displaying the post-switch target virtual object in the highlighting mode.
4. The method of claim 3, wherein the receiving an object switching operation comprises:
receiving a motion control operation on a terminal as the object switching operation, the motion control operation being detected by a motion sensor in the terminal; or,
receiving a tapping operation on a terminal display screen as the object switching operation; or,
receiving a sliding operation on a terminal display screen as the object switching operation, a sliding direction of the sliding operation being used to indicate a switching direction for switching the target virtual object.
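The switching behavior recited in claims 3 and 4 can be sketched as cycling an index through the marked candidates; mapping a rightward swipe to +1 and a leftward swipe to -1 is an assumption, since the claims only require that the sliding direction indicate the switching direction:

```python
def switch_target(candidates, current_index, direction):
    """Cycle the interaction-locked target among marked candidates.

    direction: +1 (e.g. a rightward swipe) or -1 (leftward);
    the gesture-to-direction mapping is an assumption.
    """
    if not candidates:
        return None
    return (current_index + direction) % len(candidates)
```

The modulo wrap-around means switching past the last candidate returns to the first, so the highlighting mode always lands on exactly one marked object.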
5. The method of any one of claims 1 to 4, wherein the displaying, in response to receiving the launch control operation on the designated virtual prop, the interaction result of the plurality of virtual bullets launched by the designated virtual prop interacting with the target virtual object comprises:
in response to receiving the launch control operation on the designated virtual prop, displaying a first flight phase in which the plurality of virtual bullets launched by the designated virtual prop fly in a first flight direction, the first flight direction being an aiming direction of the designated virtual prop;
in response to completion of the first flight phase, displaying a second flight phase in which the plurality of virtual bullets fly in a second flight direction, the second flight direction being a direction pointing toward the target virtual object;
and in response to completion of the second flight phase, displaying interaction results of the plurality of virtual bullets with the target virtual object.
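Geometrically, the two-phase flight of claim 5 amounts to a straight segment along the aiming direction followed by a straight segment toward the target. A minimal 2D sketch (the fixed phase-1 length and 2D coordinates are assumptions for illustration):

```python
import math

def two_phase_path(origin, aim_dir, target_pos, phase1_len):
    """Compute the turning point and end point of a bullet's path.

    Phase 1: fly phase1_len along the (normalized) aiming direction.
    Phase 2: from the turning point, head straight for the target.
    """
    norm = math.hypot(aim_dir[0], aim_dir[1])
    ux, uy = aim_dir[0] / norm, aim_dir[1] / norm
    turn_point = (origin[0] + ux * phase1_len,
                  origin[1] + uy * phase1_len)
    return turn_point, target_pos
```

For rendering, the client would interpolate bullet positions along each segment; only the turning point distinguishes this from an ordinary straight shot.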
6. The method of any one of claims 1 to 4, wherein the target virtual object comprises a plurality of object parts;
the displaying, in response to receiving the launch control operation on the designated virtual prop, the interaction result of the plurality of virtual bullets launched by the designated virtual prop interacting with the target virtual object comprises:
in response to receiving the launch control operation on the designated virtual prop, displaying an interaction result of the plurality of virtual bullets launched by the designated virtual prop flying divergently toward the target virtual object and interacting with respective object parts of the target virtual object.
7. The method of claim 6, wherein the displaying the interaction result of the plurality of virtual bullets launched by the designated virtual prop flying toward the target virtual object and interacting with the respective object parts of the target virtual object comprises:
obtaining interaction probabilities respectively corresponding to the plurality of object parts, the interaction probabilities being preset probabilities corresponding to the object parts;
determining an interaction distribution between the plurality of virtual bullets and the plurality of object parts based on the interaction probabilities;
and displaying interaction results of the plurality of virtual bullets flying toward the plurality of object parts based on the interaction distribution.
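The distribution step of claim 7 can be sketched as a weighted random assignment of bullets to parts, using the preset per-part interaction probabilities as weights (the part names and probability values below are illustrative assumptions):

```python
import random

def distribute_bullets(num_bullets, part_probs, rng=None):
    """Assign each virtual bullet to an object part, weighted by
    the preset per-part interaction probabilities (claim 7).

    Returns a dict mapping part name -> number of bullets hitting it.
    """
    rng = rng or random.Random()
    parts = list(part_probs)
    weights = [part_probs[p] for p in parts]
    hits = {p: 0 for p in parts}
    for _ in range(num_bullets):
        hits[rng.choices(parts, weights=weights)[0]] += 1
    return hits
```

Passing a seeded `random.Random` makes the distribution reproducible, which is useful when client and server must agree on the displayed result.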
8. The method of any one of claims 1 to 4, wherein the displaying the aiming range indicating element of the designated virtual prop under the condition that the first virtual object holds the designated virtual prop comprises:
under the condition that the first virtual object holds the designated virtual prop, determining a circular range with a preset radius, centered on the center position of the terminal interface, as the firing range;
and displaying the aiming range indicating element of the designated virtual prop at the periphery of the firing range.
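The geometry of claim 8 is a circle centered on the terminal interface with a preset radius; a point-in-range test then decides which objects fall inside the firing range (pixel coordinates and the example screen size are assumptions):

```python
def firing_range(screen_w, screen_h, preset_radius):
    """Claim 8: circular firing range centered on the
    terminal-interface center, with a preset radius."""
    return (screen_w / 2.0, screen_h / 2.0), preset_radius

def in_firing_range(point, screen_w, screen_h, preset_radius):
    """True if a screen-space point lies within the firing range."""
    (cx, cy), r = firing_range(screen_w, screen_h, preset_radius)
    return (point[0] - cx) ** 2 + (point[1] - cy) ** 2 <= r ** 2
```

The aiming range indicating element would then be drawn on the circle's periphery, i.e. at distance `preset_radius` from the screen center.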
9. The method of any one of claims 1 to 4, wherein the displaying, in response to receiving the launch control operation on the designated virtual prop, the interaction result of the plurality of virtual bullets launched by the designated virtual prop interacting with the target virtual object comprises:
in response to receiving the launch control operation on the designated virtual prop, determining a projection area of the target virtual object projected onto the firing range;
determining, based on a first number of virtual bullets launched by the designated virtual prop and the projection area, a second number of virtual bullets that interact with the target virtual object;
displaying an interaction result of the second number of virtual bullets flying toward the target virtual object and interacting with the target virtual object;
and displaying flight results of remaining virtual bullets continuing to fly along the firing path, wherein, if a third virtual object exists on the flight path along which the remaining virtual bullets continue to fly, an interaction result of the third virtual object with the remaining virtual bullets is displayed.
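One way to realize the split in claim 9 is to scale the first number of bullets by the target's projected share of the firing-range area; the proportional rule is an assumption, since the claim only requires that the second number depend on the first number and the projection area:

```python
def split_bullets(first_number, projected_area, range_area):
    """Split launched bullets into those that interact with the
    target (second number) and those that keep flying.

    Proportional-to-area scaling is an illustrative assumption.
    """
    ratio = min(projected_area / range_area, 1.0)
    second_number = round(first_number * ratio)
    return second_number, first_number - second_number
```

The leftover bullets are the ones that continue along the firing path and may interact with a third virtual object further along it.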
10. The method of claim 9, wherein the displaying the interaction result of the second number of virtual bullets flying toward the target virtual object and interacting with the target virtual object comprises:
in response to a position change of the target virtual object in the virtual scene, displaying an interaction result of the second number of virtual bullets following the position change of the target virtual object and interacting with the target virtual object.
11. The method of any one of claims 1 to 4, wherein the displaying the aiming range indicating element of the designated virtual prop under the condition that the first virtual object holds the designated virtual prop comprises:
in response to the first virtual object switching from an initial virtual prop to the designated virtual prop, automatically displaying the aiming range indicating element of the designated virtual prop.
12. The method of claim 11, wherein the designated virtual prop corresponds to an upper limit of holding time;
the method further comprises:
in response to the holding time of the first virtual object for the designated virtual prop reaching the upper limit of holding time, automatically switching from the designated virtual prop back to the initial virtual prop, the initial virtual prop being used to interact with other virtual objects by firing a single virtual bullet per shot;
and displaying an aiming indication element of the initial virtual prop, the aiming indication element being used to indicate a firing target position of the initial virtual prop when firing the virtual bullet.
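The time-limited holding rule of claim 12 reduces to a simple comparison of elapsed holding time against the upper limit (the prop names below are illustrative assumptions):

```python
def held_prop(initial_prop, specified_prop, hold_time, hold_limit):
    """Claim 12: once the holding time of the designated prop
    reaches its upper limit, automatically revert to the initial
    (single-shot) prop; otherwise keep the designated prop."""
    return initial_prop if hold_time >= hold_limit else specified_prop
```

Switching the held prop also determines which indicator is shown: the aiming range indicating element for the designated prop, or the single-point aiming indication element for the initial prop.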
13. The method of any one of claims 1 to 4, wherein the displaying the target virtual object of the at least one second virtual object in a highlighting mode in response to the firing range indicated by the aiming range indicating element including at least one second virtual object comprises:
in response to the firing range indicated by the aiming range indicating element including at least one second virtual object, performing outline display on a target virtual object among the second virtual objects with specified display parameters;
and displaying identification points on the plurality of object parts of the target virtual object, wherein a display size of each identification point is positively correlated with an interaction probability of the corresponding object part receiving interaction, and the interaction probability is positively correlated with an interaction effect of the object part receiving interaction.
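Claim 13 only requires the identification-point size to be positively correlated with each part's interaction probability; a linear mapping between a minimum and maximum pixel size is one assumed realization:

```python
def marker_sizes(part_probs, min_px=4.0, max_px=16.0):
    """Map per-part interaction probabilities to identification-point
    display sizes. Linear scaling between min_px and max_px is an
    assumption; the claim requires only a positive correlation."""
    lo, hi = min(part_probs.values()), max(part_probs.values())
    span = (hi - lo) or 1.0  # avoid division by zero if all equal
    return {part: min_px + (p - lo) / span * (max_px - min_px)
            for part, p in part_probs.items()}
```

Because higher interaction probability also correlates with a stronger interaction effect, the larger identification points visually mark the parts most worth aiming at.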
14. The method of any one of claims 1 to 4, wherein the displaying the target virtual object of the at least one second virtual object in a highlighting mode in response to the firing range indicated by the aiming range indicating element including at least one second virtual object comprises:
in response to the firing range indicated by the aiming range indicating element including at least one second virtual object, performing overlay display on a target virtual object among the second virtual objects with specified display parameters;
wherein interaction of the virtual bullets with the target virtual object is used to affect a specified attribute value of the target virtual object.
15. An interaction apparatus for virtual objects, the apparatus comprising:
a receiving module, configured to receive a control operation on a first virtual object, the first virtual object being in a virtual scene;
a display module, configured to display an aiming range indicating element of a designated virtual prop under the condition that the first virtual object holds the designated virtual prop, the designated virtual prop being used for simultaneously launching a plurality of virtual bullets through a plurality of launch channels to interact with other virtual objects, and the aiming range indicating element being used for indicating a firing range of the designated virtual prop when launching the virtual bullets;
the display module being further configured to display, in response to the firing range indicated by the aiming range indicating element including at least one second virtual object, a target virtual object of the at least one second virtual object in a highlighting mode, the highlighting mode being used to indicate that the target virtual object is interaction-locked by the designated virtual prop;
the display module being further configured to display, in response to receiving a launch control operation on the designated virtual prop, an interaction result of the plurality of virtual bullets launched by the designated virtual prop interacting with the target virtual object.
16. A computer device comprising a processor and a memory having stored therein at least one instruction that is loaded and executed by the processor to implement the method of interaction of virtual objects of any of claims 1 to 14.
17. A computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the method of interaction of virtual objects of any of claims 1 to 14.
18. A computer program product comprising a computer program or instructions which, when executed by a processor, implement the method of interaction of virtual objects as claimed in any one of claims 1 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210726519.2A CN117298580A (en) | 2022-06-23 | 2022-06-23 | Virtual object interaction method, device, equipment, medium and program product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117298580A true CN117298580A (en) | 2023-12-29 |
Family
ID=89287197
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210726519.2A Pending CN117298580A (en) | 2022-06-23 | 2022-06-23 | Virtual object interaction method, device, equipment, medium and program product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117298580A (en) |
2022-06-23: CN202210726519.2A patent application filed (status: Pending)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7476235B2 (en) | Method, apparatus, and computer program for controlling virtual objects | |
CN113181650B (en) | Control method, device, equipment and storage medium for calling object in virtual scene | |
WO2021139371A1 (en) | Virtual object control method, device, terminal, and storage medium | |
CN110732135B (en) | Virtual scene display method and device, electronic equipment and storage medium | |
JP2022539289A (en) | VIRTUAL OBJECT AIMING METHOD, APPARATUS AND PROGRAM | |
US20230068653A1 (en) | Method and apparatus for controlling virtual object to use virtual prop, terminal, and medium | |
US20230013014A1 (en) | Method and apparatus for using virtual throwing prop, terminal, and storage medium | |
CN110465087B (en) | Virtual article control method, device, terminal and storage medium | |
JP7534539B2 (en) | Method, device, terminal, and program for inserting virtual items | |
US20230330530A1 (en) | Prop control method and apparatus in virtual scene, device, and storage medium | |
WO2022156491A1 (en) | Virtual object control method and apparatus, and device, storage medium and program product | |
WO2022095672A1 (en) | Screen display method and apparatus, device and storage medium | |
WO2022007567A1 (en) | Virtual resource display method and related device | |
US20230030619A1 (en) | Method and apparatus for displaying aiming mark | |
JP2023164787A (en) | Picture display method and apparatus for virtual environment, and device and computer program | |
WO2022105480A1 (en) | Virtual object control method, device, terminal, storage medium, and program product | |
CN111202983A (en) | Method, device, equipment and storage medium for using props in virtual environment | |
WO2024098628A9 (en) | Game interaction method and apparatus, terminal device, and computer-readable storage medium | |
CN113694515B (en) | Interface display method, device, terminal and storage medium | |
CN113680061A (en) | Control method, device, terminal and storage medium of virtual prop | |
CN117298580A (en) | Virtual object interaction method, device, equipment, medium and program product | |
CN112121433A (en) | Method, device and equipment for processing virtual prop and computer readable storage medium | |
CN113663329B (en) | Shooting control method and device for virtual character, electronic equipment and storage medium | |
CN118022330A (en) | Virtual object interaction method, device, equipment, medium and program product | |
CN114210062A (en) | Using method, device, terminal, storage medium and program product of virtual prop |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||