CN114247140A - Information display method, device, equipment and medium - Google Patents


Info

Publication number: CN114247140A
Application number: CN202111662007.6A
Authority: CN (China)
Prior art keywords: shooting, virtual, virtual object, prop, remote
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 刘智洪
Current Assignee: Tencent Technology (Shenzhen) Co., Ltd.
Original Assignee: Tencent Technology (Shenzhen) Co., Ltd.
Application filed by Tencent Technology (Shenzhen) Co., Ltd.
Publication of CN114247140A

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: Shooting of targets
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30: Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/303: Features characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F 2300/80: Features specially adapted for executing a specific type of game
    • A63F 2300/8076: Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an information display method, apparatus, device, and medium, relating to the field of human-computer interaction. The method includes: displaying a virtual environment picture of a first virtual object and a shooting indicator, where the first virtual object is equipped with a remote shooting prop and the shooting indicator is used to indicate at least one of a shooting direction and a shooting coverage range of the remote shooting prop; in response to a shooting operation on the remote shooting prop, controlling the first virtual object to shoot a second virtual object using the remote shooting prop; and displaying, on the virtual environment picture, a hit feedback control based on the shooting indicator, where the hit feedback control is used to indicate shooting feedback information for the second virtual object. The method can display shooting feedback information based on the shooting coverage indicator on the virtual environment interface, improving the efficiency with which the user views hit feedback.

Description

Information display method, device, equipment and medium
The present application claims priority to Chinese Patent Application No. 202111305192.3, entitled "Information display method, apparatus, device, and medium", filed on 11/05/2021, which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of human-computer interaction, and in particular, to an information display method, apparatus, device, and medium.
Background
In shooting games, a user can control a virtual object to shoot using a virtual firearm, and the types of virtual firearms are modeled on types of real firearms.
In the related art, a user controls a first virtual object to shoot using a remote shooting prop. The remote shooting prop fires virtual ammunition, and after the virtual ammunition hits a second virtual object in the virtual environment, at least one hit special effect is displayed on the second virtual object.
However, when the distance between the first virtual object and the second virtual object is large, it is difficult for the user to see the hit special effect on the second virtual object. Likewise, when the second virtual object moves behind a virtual obstacle, the user cannot see the second virtual object and therefore cannot see the hit special effect on it. The related art thus makes it inconvenient for the user to view hit feedback information.
Disclosure of Invention
The embodiments of the application provide an information display method, apparatus, device, and medium. The technical solution is as follows:
according to an aspect of the present application, there is provided an information display method including:
displaying a virtual environment picture of a first virtual object and a shooting indicator, where the first virtual object is equipped with a remote shooting prop, and the shooting indicator is used to indicate at least one of a shooting direction and a shooting coverage range of the remote shooting prop;
in response to a shooting operation on the remote shooting prop, controlling the first virtual object to shoot a second virtual object using the remote shooting prop;
displaying, on the virtual environment picture, a hit feedback control based on the shooting indicator, where the hit feedback control is used to indicate shooting feedback information for the second virtual object.
According to another aspect of the present application, there is provided an information display apparatus including:
a display module, configured to display a virtual environment picture of a first virtual object and a shooting indicator, where the first virtual object is equipped with a remote shooting prop, and the shooting indicator is used to indicate at least one of a shooting direction and a shooting coverage range of the remote shooting prop;
a control module, configured to control, in response to a shooting operation on the remote shooting prop, the first virtual object to shoot a second virtual object using the remote shooting prop;
the display module is further configured to display, on the virtual environment picture, a hit feedback control based on the shooting indicator of the remote shooting prop, where the hit feedback control is used to indicate shooting feedback information for the second virtual object.
According to another aspect of the present application, there is provided a computer device including: a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, the at least one instruction, the at least one program, set of codes, or set of instructions being loaded and executed by the processor to implement the information display method as described above.
According to another aspect of the present application, there is provided a computer storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the information display method as described above.
According to another aspect of the application, a computer program product or a computer program is provided, comprising computer instructions, which are stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and executes the computer instructions, so that the computer device executes the information display method.
The beneficial effects of the technical solutions provided in the embodiments of the application include at least the following:
according to the embodiment of the application, after the first virtual object is controlled to shoot the second virtual object by using the remote shooting prop, the shooting feedback information is displayed based on the shooting coverage area indicator on the virtual environment interface. The method for viewing the shooting feedback information is provided for the user, the shooting feedback information is displayed directly on the basis of the shooting coverage area indicator, the hit special effect is not displayed on the second virtual object in the virtual environment, and even if the second virtual object is far away from the first virtual object or the second virtual object avoids obstacles, the user controlling the first virtual object can still view the hit feedback information in the hit feedback control, so that the user can directly view the hit feedback information conveniently.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings required for the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic flow chart diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a camera model provided by an exemplary embodiment of the present application;
FIG. 4 is an interface schematic diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 5 is an interface schematic diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic illustration of a related art interface provided by an exemplary embodiment of the present application;
FIG. 7 is a flowchart illustrating an information display method according to an exemplary embodiment of the present application;
FIG. 8 is an interface schematic diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 9 is a schematic illustration of a related art interface provided by an exemplary embodiment of the present application;
FIG. 10 is a schematic flow chart diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 11 is an interface schematic diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 12 is a schematic flow chart diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 13 is a schematic flow chart diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 14 is a schematic flow chart diagram of a method for determining the direction of discharge of a remotely fired prop according to an exemplary embodiment of the present application;
FIG. 15 is a schematic illustration of a method of determining the direction of discharge of a remotely fired prop according to an exemplary embodiment of the present application;
FIG. 16 is a schematic illustration of a method of determining the direction of discharge of a remotely fired prop according to an exemplary embodiment of the present application;
FIG. 17 is a schematic flow chart diagram of an information display method provided by an exemplary embodiment of the present application;
FIG. 18 is a schematic structural diagram of an information display apparatus provided in an exemplary embodiment of the present application;
FIG. 19 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are described:
virtual environment: is a virtual environment that is displayed (or provided) when an application is run on the terminal. The virtual environment may be a three-dimensional virtual environment or a two-dimensional virtual environment. The three-dimensional virtual environment may be a simulation environment of a real world, a semi-simulation semi-fictional environment, or a pure fictional environment.
Virtual object: refers to a movable object in a virtual environment. The movable object may be a virtual character, a virtual animal, a cartoon character, and the like, for example: characters, animals, plants, oil drums, walls, and stones displayed in the virtual environment. Optionally, when the virtual environment is a three-dimensional virtual environment, virtual objects are three-dimensional models created using skeletal animation technology; each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of its space. Optionally, when the virtual environment is a two-dimensional virtual environment, virtual objects are two-dimensional plane models created using animation technology; each virtual object has its own shape and area in the two-dimensional virtual environment and occupies part of its area.
FPS (First-Person Shooting game): a game that provides several base points in a virtual world, in which users in different camps control virtual characters to fight, seize base points, destroy the enemy camp's base points, or kill some or all of the enemy camp's characters. In general, the user plays an FPS game from a first-person perspective, but can also choose a third-person perspective. For example, an FPS game may divide users into two opposing camps and scatter the user-controlled virtual characters across the virtual world to compete with each other, with killing all users of the enemy camp as the winning condition. An FPS game is played in rounds; one round lasts from the moment the game starts until the winning condition is achieved.
FIG. 1 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server cluster 140, and a second terminal 160.
The first terminal 120 has installed and runs an application program supporting a virtual environment. The application program may be any one of a racing game, a MOBA (Multiplayer Online Battle Arena) game, a virtual reality application, a three-dimensional map program, an FPS game, and a multiplayer gunfight survival game. The first terminal 120 is the terminal used by a first user, who uses it to control a first virtual object located in the three-dimensional virtual environment to perform activities including, but not limited to: attacking, releasing skills, purchasing props, healing, adjusting body posture, crawling, walking, riding, flying, jumping, driving, picking up, shooting, and throwing. Illustratively, the first virtual object is a first virtual character.
The first terminal 120 is connected to the server cluster 140 through a wireless network or a wired network.
The server cluster 140 includes at least one of a single server, multiple servers, a cloud computing platform, and a virtualization center. The server cluster 140 provides background services for applications that support the virtual environment. Optionally, the server cluster 140 undertakes the primary computing work while the first terminal 120 and the second terminal 160 undertake secondary computing work; alternatively, the server cluster 140 undertakes secondary computing work while the first terminal 120 and the second terminal 160 undertake the primary computing work; or the server cluster 140, the first terminal 120, and the second terminal 160 cooperate using a distributed computing architecture.
The second terminal 160 has installed and runs an application program supporting a virtual environment. The application program may be any one of a racing game, a MOBA game, a virtual reality application, a three-dimensional map program, an FPS game, and a multiplayer gunfight survival game. The second terminal 160 is the terminal used by a second user, who uses it to control a second virtual object located in the three-dimensional virtual environment to perform activities including, but not limited to: attacking, releasing skills, purchasing props, healing, adjusting body posture, crawling, walking, riding, flying, jumping, driving, picking up, shooting, and throwing. Illustratively, the second virtual object is a second virtual character. The first virtual object and the second virtual object may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights; alternatively, they may belong to different teams, organizations, or camps.
Optionally, the applications installed on the first terminal 120 and the second terminal 160 are the same, or are the same type of application on different platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to one of a plurality of terminals; this embodiment is illustrated using only the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include at least one of a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer.
FIG. 2 is a flowchart of an information display method according to an embodiment of the present application. The method may be performed by the first terminal 120 or the second terminal 160 shown in FIG. 1, and includes the following steps:
step 202: and displaying a virtual environment picture of a first virtual object and a shooting indicator, wherein the first virtual object is provided with a remote shooting prop, and the shooting indicator is used for indicating at least one of a shooting direction and a shooting coverage range of the remote shooting prop.
The first virtual object is the object controlled by the user in the virtual environment. Illustratively, the first virtual object is at least one of a virtual character, a virtual animal, a cartoon character, and a virtual vehicle.
The remote shooting prop is a prop used for long-range strikes. Optionally, the remote shooting prop includes at least one of a firearm, a bow and arrow, a crossbow, a rocket launcher, and a missile.
In an alternative embodiment, the shooting indicator is a shooting coverage indicator of the remote shooting prop. Illustratively, when the remote shooting prop is a scattering-type remote shooting prop, the shooting coverage indicator is a control with a two-dimensional shape on the virtual environment picture. For example, when the scattering-type remote shooting prop is a shotgun, the shooting coverage indicator is a circular control on the virtual environment picture.
In an alternative embodiment, the shooting indicator is the sight of the remote shooting prop. Illustratively, the sight of the remote shooting prop is a cross-shaped control on the virtual environment picture.
The virtual environment is the three-dimensional environment in which the first virtual object is located while an application program runs on the terminal. Optionally, in the embodiments of the present application, the virtual environment is observed through a camera model.
Optionally, the camera model automatically follows the first virtual object in the virtual world: when the position of the first virtual object changes, the camera model moves with it and always stays within a preset distance of the first virtual object. Optionally, during automatic following, the relative position of the camera model and the first virtual object does not change.
The camera model is a three-dimensional model located around the first virtual object in the virtual world. When the first-person perspective is adopted, the camera model is located near or at the head of the first virtual object. When a third-person perspective is adopted, the camera model may be located behind and bound to the first virtual object, or at any position a preset distance away from it; through the camera model, the first virtual object in the virtual world can be observed from different angles. Optionally, when the third-person perspective is an over-the-shoulder perspective, the camera model is located behind the first virtual object (for example, at the head and shoulders of the first virtual object). Optionally, in addition to the first-person and third-person perspectives, other perspectives such as a top-down perspective are possible; when a top-down perspective is adopted, the camera model may be located above the head of the first virtual object, looking down on the virtual world from the air. Optionally, the camera model is not actually displayed in the virtual world, i.e. it does not appear in the virtual world displayed on the user interface. Illustratively, as shown in FIG. 4, the virtual world is viewed from the first-person perspective of a first virtual character 401.
To illustrate the case where the camera model is located at an arbitrary position a preset distance away from the first virtual object: optionally, one first virtual object corresponds to one camera model, and the camera model can rotate around the first virtual object as a rotation center. For example, the camera model rotates around an arbitrary point of the first virtual object as the rotation center; during rotation, the camera model not only turns but also moves, while the distance between the camera model and the rotation center stays constant. In other words, the camera model rotates on the surface of a sphere whose center is the rotation center. The arbitrary point of the first virtual object may be its head, its torso, or any point around it, which is not limited in the embodiments of the present application. Optionally, when the camera model observes the first virtual object, the center of its viewing angle points from its position on the sphere toward the sphere center.
Referring to FIG. 3, schematically, a point in the first virtual object 31 is determined as the rotation center 32, and the camera model rotates around it. Optionally, the camera model is configured with an initial position above and behind the first virtual object (for example, behind the head). Illustratively, as shown in FIG. 3, the initial position is position 33; when the camera model rotates to position 34 or position 35, its viewing direction changes with the rotation.
Optionally, the camera model can also observe the first virtual object at preset angles from different directions.
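The orbit geometry described above (a camera constrained to a sphere of fixed radius around a rotation center, with its view direction pointing back at the sphere center) can be sketched in a few lines. The function name and the yaw/pitch parameterization below are illustrative assumptions, not taken from the patent:

```python
import math

def orbit_camera(center, radius, yaw, pitch):
    """Place a third-person camera on a sphere of the given radius around
    the rotation center and aim it back at that center. The distance to
    the center stays constant no matter how yaw/pitch change, matching
    the constant-radius rotation described above. (Illustrative sketch.)"""
    x = center[0] + radius * math.cos(pitch) * math.cos(yaw)
    y = center[1] + radius * math.sin(pitch)  # vertical axis
    z = center[2] + radius * math.cos(pitch) * math.sin(yaw)
    position = (x, y, z)
    # The view direction points from the camera's point on the sphere
    # toward the sphere center, as the passage above describes.
    view_dir = tuple((c - p) / radius for c, p in zip(center, position))
    return position, view_dir
```

For any yaw and pitch, the camera stays exactly `radius` away from the rotation center, so only the viewing angle changes as it orbits.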
Illustratively, as shown in FIG. 4, at least one of a movement joystick 402, a prop use control 403, a jump control 404, a lie-down control 405, a squat control 406, a map 407, and a shooting indicator 408 is displayed on the user interface. The movement joystick 402 controls how the first virtual object moves in the virtual world. The prop use control 403 controls the first virtual object to use the currently held prop: for example, if the first virtual object currently holds a remote shooting prop, clicking the prop use control 403 lets the user control the first virtual object to shoot with it; as another example, clicking the prop use control 403 can control the first virtual object to throw a throwable remote shooting prop. The jump control 404 controls the first virtual object to jump in the virtual world. The lie-down control 405 controls the first virtual object to lie prone in the virtual world. The squat control 406 controls the first virtual object to squat in the virtual world. The map 407 displays a thumbnail of the virtual world. The shooting indicator 408 indicates at least one of the shooting direction and the coverage range of the prop held by the first virtual object. Note that more or fewer controls may be displayed on the user interface, which is not specifically limited in the embodiments of the application.
Step 204: and responding to the shooting operation of the remote shooting prop, and controlling the first virtual object to shoot the second virtual object by using the remote shooting prop.
The shooting operation is used to control the first virtual object to shoot. Optionally, the shooting operation may be a press of one or more preset physical keys, or a signal generated by long-pressing, clicking, double-clicking, and/or sliding on a designated area of the touch screen. In a VR (Virtual Reality) game, the shooting operation may be triggered by the user performing a specific action or inputting a specific voice command. Illustratively, as shown in FIG. 4, the user clicks the prop use control 403 to control the first virtual object 401 to shoot with the remote shooting prop 409.
The second virtual object is a virtual object in the virtual environment other than the first virtual object. The second virtual object may be in the same camp as the first virtual object or in a different camp. Illustratively, as shown in FIG. 4, the second virtual object 410 is in a different camp from the first virtual object 401.
In an alternative embodiment, the second virtual object is at least one of a neutral virtual object, a virtual barrier, a virtual prop, a virtual article. For example, when the second virtual object is a neutral treasure box in the virtual environment, the user may control the first virtual object to shoot at the neutral treasure box.
Step 206: and displaying a hit feedback control based on the shooting indicator of the remote shooting prop on the virtual environment picture, wherein the hit feedback control is used for indicating shooting feedback information of the second virtual object.
The hit feedback control may be a transparent or non-transparent control on the virtual environment picture. The relationship between the hit feedback control and the shooting indicator of the remote shooting prop includes at least one of the following: the hit feedback control, which has a two-dimensional shape, is displayed on the virtual environment picture centered on the shooting indicator; or the hit feedback control is displayed surrounding the shooting indicator; or the hit feedback control follows the shooting indicator on the virtual environment picture. Illustratively, as shown in FIG. 5, a transparent hit feedback control 502 with a two-dimensional shape is displayed on the virtual environment picture, centered on the shooting indicator 501 of the remote shooting prop.
The two-dimensional shape of the hit feedback control includes, but is not limited to, at least one of a circle, a rectangle, a triangle, a parallelogram, and an irregular figure. A user or technician can adjust the two-dimensional shape of the hit feedback control according to actual requirements.
Optionally, the position of the hit feedback control on the virtual environment picture moves with the shooting direction of the remote shooting prop. For example, when the shooting direction of the remote shooting prop is direction A, the hit feedback control is at position B on the virtual environment picture; when the shooting direction is direction C, the hit feedback control is at position D.

Optionally, the position of the hit feedback control on the virtual environment picture moves with the gaze direction of the first virtual object. Illustratively, when the gaze direction of the first virtual object is direction W, the hit feedback control is at position X on the virtual environment picture; when the gaze direction is direction Y, the hit feedback control is at position Z.
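As an illustrative sketch only (the function name and screen-coordinate convention are assumptions, not part of this application), anchoring the hit feedback control on the shooting indicator amounts to recomputing the control's screen-space bounding box from the indicator's current position each frame, so the control follows the shooting direction or gaze direction automatically:

```python
def feedback_control_rect(indicator_x, indicator_y, radius):
    """Return the (left, top, width, height) bounding box of a circular
    hit feedback control centered on the shooting indicator's screen
    position. Re-calling this each frame makes the control track the
    indicator as the shooting or gaze direction changes."""
    return (indicator_x - radius, indicator_y - radius, 2 * radius, 2 * radius)
```

When the shooting direction moves the indicator from one screen position to another, the control is simply re-anchored at the new position.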
The shooting feedback information represents feedback after the remote shooting prop hits the second virtual object, and comprises at least one of the hit position of the remote shooting prop, the hit time of the remote shooting prop, the type of virtual ammunition hitting the second virtual object, and the type of the second virtual object.
Optionally, in response to the remote shooting prop hitting the second virtual object, feedback information is displayed within the hit feedback control. For example, as shown in fig. 5, after the first virtual object 504 hits the second virtual object 503 using the remote shooting prop, the transparent hit feedback control 502 is displayed on the virtual environment picture with the shooting indicator 501 as a reference, and feedback points are displayed within the hit feedback control 502, where the feedback points correspond one-to-one to the virtual ammunition ejected by the remote shooting prop.
In an optional embodiment, the feedback point is at least one of a circle, a triangle, a parallelogram, and a hexagon. The user can adjust the shape of the feedback point according to actual requirements.
Optionally, on the virtual environment picture, a hit feedback control is displayed based on the shooting indicator of the remote shooting prop, and feedback text is displayed within the hit feedback control, the feedback text being used for representing the shooting feedback information on the second virtual object.
To facilitate the user's viewing of the shooting feedback information provided by the hit feedback control, the hit feedback control is displayed with the shooting indicator of the remote shooting prop as a reference. The shooting indicator comprises a shooting coverage area indicator or a crosshair. The shooting coverage area indicator indicates the coverage area, in the virtual environment, of the virtual ammunition ejected by the remote shooting prop. For example, when the scattering type remote shooting prop is a shotgun, the shotgun fires a plurality of ammunitions at a time, and the ammunitions are randomly distributed within the shooting coverage area, which the shooting coverage area indicator indicates. Optionally, a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture with the shooting coverage area indicator of the remote shooting prop as a reference; or, a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture with the crosshair of the remote shooting prop as a reference.
In an optional implementation of the embodiment of the present application, when the remote shooting prop hits a virtual obstacle in the virtual environment, a bullet hole special effect is displayed on the virtual obstacle. Optionally, a detection ray is emitted with the muzzle position of the remote shooting prop as the ray origin and the launch direction of the virtual ammunition launched by the remote shooting prop as the ray direction; in response to the detection ray intersecting a collision box, the bullet hole special effect is displayed at the intersection point on the virtual obstacle.
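The detection-ray procedure above can be sketched with a standard slab test against an axis-aligned collision box. This is a minimal illustration under assumed data structures (tuples of coordinates), not the application's actual implementation:

```python
def ray_aabb_intersection(origin, direction, box_min, box_max):
    """Slab test: return the first intersection point of a detection ray
    (origin + t * direction, t >= 0) with an axis-aligned collision box,
    or None if the ray misses the box. The returned point is where the
    bullet hole special effect would be displayed."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        o, d = origin[axis], direction[axis]
        lo, hi = box_min[axis], box_max[axis]
        if abs(d) < 1e-9:
            # Ray is parallel to this slab; miss if origin is outside it.
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:
                return None
    return tuple(origin[i] + direction[i] * t_near for i in range(3))
```

The muzzle position plays the role of `origin` and the ammunition's launch direction plays the role of `direction`.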
In summary, in this embodiment, the first virtual object is controlled to shoot the second virtual object using the remote shooting prop, and the shooting feedback information is then displayed on the virtual environment interface based on the shooting coverage area indicator. This provides the user with a way to view shooting feedback information: because the feedback is displayed directly on the shooting coverage area indicator rather than as a hit special effect on the second virtual object in the virtual environment, the user controlling the first virtual object can still view the hit feedback information within the hit feedback control even when the second virtual object is far away or hiding behind obstacles, which makes the hit feedback information convenient to view directly.
In the related art, when the first virtual object shoots with a scattering type remote shooting prop and the prop hits a virtual obstacle in the virtual environment, a bullet hole special effect is displayed on the virtual obstacle; as shown in fig. 6, the first virtual object shoots the virtual obstacle using the scattering type remote shooting prop, and the bullet hole special effect 601 is displayed on the virtual obstacle. However, when the first virtual object shoots the second virtual object using a scattering type virtual firearm, the user cannot see how many virtual ammunitions of the firearm hit the second virtual object, and when the first virtual object is far from the second virtual object, it is difficult for the user to observe the hit position on the virtual environment interface.
Fig. 7 is a flowchart illustrating an information display method according to an embodiment of the present application. The method may be performed by the first terminal 120 or the second terminal 160 shown in fig. 1, the method comprising the steps of:
Step 701: a virtual environment picture of the first virtual object and a shooting indicator are displayed.
The first virtual object refers to the master object controlled by the user in the virtual environment. Illustratively, the first virtual object is at least one of a virtual character, a virtual animal, an animation character, and a virtual vehicle.
The first virtual object is provided with a scattering type remote shooting prop which shoots out at least two virtual ammunitions at a time. Illustratively, the scattering-type remote shooting prop includes at least one of a shotgun and a cluster bomb.
The virtual environment is a three-dimensional environment in which the first virtual object is located in the virtual world during the running process of an application program in the terminal. Optionally, in an embodiment of the present application, the virtual environment is observed through a camera model.
The shooting indicator is used to indicate at least one of the shooting direction and the shooting coverage of the remote shooting prop.

In the case where the remote shooting prop is a firearm, the shooting direction is indicated by the crosshair of the firearm.

In the case where the remote shooting prop is a shotgun, the shooting coverage refers to the coverage area of the ammunition fired by the shotgun.
Step 702: in response to a shooting operation on the scattering type remote shooting prop, control the first virtual object to shoot the second virtual object using the remote shooting prop.
The shooting operation is used for controlling the first virtual object to shoot. Optionally, the shooting operation may be pressing one or more preset physical keys, or a signal generated by long-pressing, clicking, double-clicking, and/or sliding on a designated area of the touch screen. In a VR game, the shooting operation may be the user performing a particular action or inputting a particular voice command to trigger shooting.
The second virtual object is a virtual object in the virtual environment other than the first virtual object. The second virtual object may be in the same camp as the first virtual object, or in a different camp from the first virtual object.
Step 703: on the virtual environment picture, display a hit feedback control having a two-dimensional shape with the shooting coverage area indicator of the scattering type remote shooting prop as a reference, and display at least one of a feedback point of a first form and a feedback point of a second form within the hit feedback control.
The hit feedback control can be a transparent control on the virtual environment picture, or a non-transparent control on the virtual environment picture. The positional relationship between the hit feedback control and the shooting indicator of the remote shooting prop includes at least one of the following: a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture centered on the shooting indicator of the remote shooting prop; or, a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture surrounding the shooting indicator of the remote shooting prop; or, a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture following the shooting indicator of the remote shooting prop.
The feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop.
The feedback point of the first form represents virtual ammunition, ejected by the scattering type remote shooting prop, that hits the second virtual object; the feedback point of the second form represents virtual ammunition that misses the second virtual object; and the first form and the second form are different. The form of a feedback point includes at least one of color, shape, and transparency. Illustratively, the first form and the second form differ in transparency. As shown in fig. 5, when virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object 503, a feedback point of the first form is displayed within the transparent hit feedback control 502; when the virtual ammunition misses the second virtual object 503, a feedback point of the second form is displayed within the transparent hit feedback control 502. The transparency of the first-form feedback point is 0% and that of the second-form feedback point is 100%, so the first-form feedback point is visible within the hit feedback control 502 while the second-form feedback point is invisible.
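A minimal illustrative sketch of this transparency rule, assuming a simple per-pellet hit list (the function name and list representation are assumptions, not part of this application):

```python
def feedback_point_alpha(pellet_hits):
    """Map each pellet's hit result to a feedback-point transparency:
    0.0 (0% transparent, fully visible) for a hit and 1.0 (100%
    transparent, invisible) for a miss, preserving the one-to-one
    correspondence between pellets and feedback points."""
    return [0.0 if hit else 1.0 for hit in pellet_hits]
```

With this rule, only the pellets that actually hit the second virtual object produce visible feedback points in the hit feedback control.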
Step 704: in response to the scattering type remote shooting prop hitting the second virtual object, display a hit feedback control based on the shooting indicator of the scattering type remote shooting prop on the virtual environment picture, where at least one of a feedback point of a third form and a feedback point of a fourth form is displayed within the hit feedback control.
The feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop.
The feedback point of the third form is used for indicating that virtual ammunition ejected by the scattering type remote shooting prop hits a first part of the second virtual object, the feedback point of the fourth form is used for indicating that virtual ammunition ejected by the scattering type remote shooting prop hits a second part of the second virtual object, the third form and the fourth form are different, and the first part and the second part are different parts of the second virtual object. Wherein the form of the feedback point includes at least one of color, shape and transparency.
Illustratively, as shown in fig. 8, when virtual ammunition ejected by the scattering-type remote shooting prop hits the torso part of the second virtual object, a feedback point 801 of a third form is displayed in the shooting coverage area indicator 803. When the virtual ammunition fired by the scattering type remote shooting prop hits the head of the second virtual object, a feedback point 802 of a fourth modality is displayed within the shooting coverage area indicator 803. The feedback point 801 in the third form is white, and the feedback point 802 in the fourth form is black.
Step 705: in response to the scattering type remote shooting prop hitting the second virtual object, display, on the virtual environment picture, a hit feedback control having a two-dimensional shape with the shooting coverage area indicator of the scattering type remote shooting prop as a reference, and display at least one of a feedback point of a fifth form and a feedback point of a sixth form within the hit feedback control.
The feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop.
The feedback point of the fifth form is used for indicating that virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object at the first time, the feedback point of the sixth form is used for indicating that virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object at the second time, the fifth form and the sixth form are different, and the first time and the second time are different. Wherein the form of the feedback point includes at least one of color, shape and transparency.
Illustratively, a fifth modality of feedback point is displayed within the firing coverage area indicator when virtual ammunition fired by the scattering-type remote-firing prop hits the second virtual object at the first time. And when the virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object at the second time, displaying a feedback point of a sixth form in the shooting coverage area indicator. Wherein the feedback points of the fifth form are star-shaped, and the feedback points of the sixth form are circular.
Step 706: in response to the scattering type remote shooting prop hitting the second virtual object, display, on the virtual environment picture, a hit feedback control having a two-dimensional shape with the shooting coverage area indicator of the scattering type remote shooting prop as a reference, and display at least one of a feedback point of a seventh form and a feedback point of an eighth form within the hit feedback control.
The feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop. The form of a feedback point includes at least one of color, shape, and transparency.
The feedback point of the seventh form indicates that virtual ammunition of a first type ejected by the scattering type remote shooting prop hits the second virtual object, and the feedback point of the eighth form indicates that virtual ammunition of a second type ejected by the scattering type remote shooting prop hits the second virtual object; the seventh form and the eighth form are different, and the first type and the second type are different types of virtual ammunition.
Illustratively, when a common type of virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object, a feedback point of the seventh form is displayed within the shooting coverage area indicator. When an armor-piercing type of virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object, a feedback point of the eighth form is displayed within the shooting coverage area indicator. The feedback point of the seventh form is star-shaped, and the feedback point of the eighth form is circular.
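Steps 704 to 706 all select a feedback point's form from one attribute of the hit. A consolidated illustrative sketch follows, using the concrete mappings given in the examples above (white/black for hit part, star/circle for hit time and ammunition type); the function and dictionary names are assumptions, not part of this application:

```python
# Example form mappings taken from the text; any per-attribute mapping
# of color, shape, or transparency could be substituted.
PART_FORM = {"torso": "white", "head": "black"}              # third / fourth form
TIME_FORM = {"first": "star", "second": "circle"}            # fifth / sixth form
AMMO_FORM = {"common": "star", "armor_piercing": "circle"}   # seventh / eighth form

def feedback_point_form(hit_part=None, hit_time=None, ammo_type=None):
    """Choose the feedback point's form from whichever hit attribute
    the embodiment keys on: hit part, hit time, or ammunition type."""
    if hit_part is not None:
        return PART_FORM[hit_part]
    if hit_time is not None:
        return TIME_FORM[hit_time]
    return AMMO_FORM[ammo_type]
```

One feedback point is rendered per pellet, with its form chosen by this rule, so the user can read off where, when, or with what each pellet landed.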
In an optional implementation, after receiving the shooting operation, the client sends the shooting information to the server; the server receives the shooting information and calculates the shooting result from it; the server sends the shooting result back to the client; and the client displays the hit feedback control on the virtual environment interface according to the shooting result.
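A minimal sketch of this client-server exchange, with illustrative message structures that are assumptions rather than the application's actual protocol:

```python
def server_resolve_shot(shot_info, target_hitbox):
    """Server side: given the reported pellet impact points, decide per
    pellet whether it lands inside the target's 2D hitbox and return the
    shooting result to send back to the client."""
    (x0, y0, x1, y1) = target_hitbox
    return {
        "hits": [x0 <= px <= x1 and y0 <= py <= y1
                 for (px, py) in shot_info["pellet_points"]]
    }

def client_render_result(result):
    """Client side: one feedback point per pellet in the hit feedback
    control, visible only for pellets that hit."""
    return ["visible" if h else "hidden" for h in result["hits"]]
```

The server remains authoritative over hit determination, while the client only renders the returned result in the hit feedback control.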
In summary, in this embodiment, the first virtual object is controlled to shoot the second virtual object using the remote shooting prop, and the shooting feedback information is then displayed on the virtual environment interface based on the shooting coverage area indicator. This provides the user with a way to view shooting feedback information: because the feedback is displayed directly on the shooting coverage area indicator rather than as a hit special effect on the second virtual object in the virtual environment, the user controlling the first virtual object can still view the hit feedback information within the hit feedback control even when the second virtual object is far away or hiding behind obstacles, which makes the hit feedback information convenient to view directly.
In addition, in this embodiment, feedback points of corresponding forms are displayed within the shooting coverage area indicator according to at least one of whether the virtual ammunition hits the second virtual object, the type of the virtual ammunition, the hit position of the virtual ammunition, and the hit time of the virtual ammunition, so that the user can determine the specific hit feedback information from the form of the feedback points, improving the user's viewing efficiency.
For an embodiment in which the second virtual object is in an invincible state:
in the related art, when a virtual object revives, it is granted an invincible state for several seconds, during which the virtual object takes no damage. However, as shown in fig. 9, the virtual object 901 and the virtual object 902 are in the invincible state while the virtual object 903 and the virtual object 904 are not, and when these virtual objects are mixed together, it is difficult for the user to recognize that the virtual object 901 and the virtual object 902 are invincible. Therefore, the embodiment of the present application may display hit feedback information to prompt the user that a virtual object in the invincible state has been hit.
Fig. 10 is a flowchart illustrating an information display method according to an embodiment of the present application. The method may be performed by the first terminal 120 or the second terminal 160 shown in fig. 1, the method comprising the steps of:
Step 1001: a virtual environment picture of the first virtual object and a shooting indicator are displayed.
The first virtual object refers to the master object controlled by the user in the virtual environment. Illustratively, the first virtual object is at least one of a virtual character, a virtual animal, an animation character, and a virtual vehicle.
The first virtual object is provided with a scattering type remote shooting prop which shoots out at least two virtual ammunitions at a time. Illustratively, the scattering-type remote shooting prop includes at least one of a shotgun and a cluster bomb.
The virtual environment is a three-dimensional environment in which the first virtual object is located in the virtual world during the running process of an application program in the terminal. Optionally, in an embodiment of the present application, the virtual environment is observed through a camera model.
The shooting indicator is used to indicate at least one of the shooting direction and the shooting coverage of the remote shooting prop.
Step 1002: in response to a shooting operation on the remote shooting prop, control the first virtual object to shoot the second virtual object using the remote shooting prop.
The shooting operation is used for controlling the first virtual object to shoot. Optionally, the shooting operation may be pressing one or more preset physical keys, or a signal generated by long-pressing, clicking, double-clicking, and/or sliding on a designated area of the touch screen. In a VR game, the shooting operation may be the user performing a particular action or inputting a particular voice command to trigger shooting.
The second virtual object is a virtual object in the virtual environment other than the first virtual object. The second virtual object may be in the same camp as the first virtual object, or in a different camp from the first virtual object.
Step 1003: display a hit feedback control based on the shooting indicator of the remote shooting prop on the virtual environment picture.
The hit feedback control can be a transparent control on the virtual environment picture, or a non-transparent control on the virtual environment picture. The positional relationship between the hit feedback control and the shooting indicator of the remote shooting prop includes at least one of the following: a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture centered on the shooting indicator of the remote shooting prop; or, a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture surrounding the shooting indicator of the remote shooting prop; or, a hit feedback control having a two-dimensional shape is displayed on the virtual environment picture following the shooting indicator of the remote shooting prop.
Step 1004: in response to the remote shooting prop hitting the second virtual object while the second virtual object is in an invincible state, display a feedback identifier within the hit feedback control.
The invincible state indicates that the second virtual object is immune to damage. Optionally, the second virtual object is in the invincible state after revival; or, the second virtual object is in the invincible state after using an invincibility prop; or, the second virtual object is in the invincible state after using an invincibility skill.
Illustratively, as shown in fig. 11, the second virtual object 1101 is in the invincible state, and after the remote shooting prop hits the second virtual object 1101, a feedback identifier 1102 is displayed within the hit feedback control.
In another optional implementation of the present application, in response to the remote shooting prop hitting the second virtual object while the second virtual object is in the invincible state, a first feedback identifier is displayed within the hit feedback control; in response to the remote shooting prop hitting the second virtual object while the second virtual object is not in the invincible state, a second feedback identifier is displayed within the hit feedback control. The first feedback identifier and the second feedback identifier have different forms. Illustratively, the first feedback identifier is red and the second feedback identifier is green.
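A minimal sketch of this identifier selection, assuming the red/green identifiers of the example (the function name and return values are illustrative assumptions):

```python
def feedback_identifier(hit, invincible):
    """Return the identifier to show in the hit feedback control:
    a red identifier when the shot lands on an invincible target,
    a green identifier when the target is not invincible, and
    nothing when the shot misses."""
    if not hit:
        return None
    return "red" if invincible else "green"
```

This lets the user distinguish, purely from the hit feedback control, whether the target they hit is currently invincible.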
In summary, in this embodiment, the first virtual object is controlled to shoot the second virtual object using the remote shooting prop, and the shooting feedback information is then displayed on the virtual environment interface based on the shooting coverage area indicator. This provides the user with a way to view shooting feedback information: the feedback is displayed directly on the shooting coverage area indicator, which makes the shooting feedback information convenient to view and improves the user's viewing efficiency.
In addition, in this embodiment, a feedback identifier is displayed after a second virtual object in the invincible state is hit, so that the user can determine the state of the second virtual object from the feedback identifier, improving the user's viewing efficiency.
For an embodiment of determining whether the scattering type remote shooting prop hits the second virtual object:
fig. 12 is a flowchart illustrating an information display method according to an embodiment of the present application. The method may be performed by the first terminal 120 or the second terminal 160 shown in fig. 1, the method comprising the steps of:
Step 1201: a virtual environment picture of the first virtual object and a shooting indicator are displayed.
The first virtual object refers to the master object controlled by the user in the virtual environment. Illustratively, the first virtual object is at least one of a virtual character, a virtual animal, an animation character, and a virtual vehicle.
The first virtual object is provided with a scattering type remote shooting prop which shoots out at least two virtual ammunitions at a time. Illustratively, the scattering-type remote shooting prop includes at least one of a shotgun and a cluster bomb.
The virtual environment is a three-dimensional environment in which the first virtual object is located in the virtual world during the running process of an application program in the terminal. Optionally, in an embodiment of the present application, the virtual environment is observed through a camera model.
The shooting indicator is used to indicate at least one of the shooting direction and the shooting coverage of the remote shooting prop.
Step 1202: in response to a shooting operation on the scattering type remote shooting prop, control the first virtual object to shoot the second virtual object using the scattering type remote shooting prop.
The shooting operation is used for controlling the first virtual object to shoot. Optionally, the shooting operation may be pressing one or more preset physical keys, or a signal generated by long-pressing, clicking, double-clicking, and/or sliding on a designated area of the touch screen. In a VR game, the shooting operation may be the user performing a particular action or inputting a particular voice command to trigger shooting.
The second virtual object is a virtual object in the virtual environment other than the first virtual object. The second virtual object may be in the same camp as the first virtual object, or in a different camp from the first virtual object.
Step 1203: determine whether the virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object.

If the virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object, execute step 1204;

if the virtual ammunition ejected by the scattering type remote shooting prop misses the second virtual object, execute step 1205.
Step 1204: display the feedback point of the first form within the hit feedback control.
The feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop.
The form of the feedback point includes at least one of color, shape, and transparency.
Step 1205: display the feedback point of the second form within the hit feedback control.
The second form is different from the first form. Illustratively, after virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object, a circular feedback point is displayed within the hit feedback control; after the virtual ammunition misses the second virtual object, a square feedback point is displayed within the hit feedback control.
In summary, in this embodiment, a feedback point of the first form is displayed when the scattering type remote shooting prop hits the second virtual object, and a feedback point of the second form is displayed when it misses, so that the user can conveniently determine from the form of the feedback point whether the scattering type remote shooting prop hit the second virtual object.
For an embodiment of determining whether the scattering type remote shooting prop hits the first part of the second virtual object:
fig. 13 is a flowchart illustrating an information display method according to an embodiment of the present application. The method may be performed by the first terminal 120 or the second terminal 160 shown in fig. 1, the method comprising the steps of:
Step 1301: a virtual environment picture of the first virtual object and a shooting indicator are displayed.
The first virtual object refers to the master object controlled by the user in the virtual environment. Illustratively, the first virtual object is at least one of a virtual character, a virtual animal, an animation character, and a virtual vehicle.
The first virtual object is provided with a scattering type remote shooting prop which shoots out at least two virtual ammunitions at a time. Illustratively, the scattering-type remote shooting prop includes at least one of a shotgun and a cluster bomb.
The virtual environment is a three-dimensional environment in which the first virtual object is located in the virtual world during the running process of an application program in the terminal. Optionally, in an embodiment of the present application, the virtual environment is observed through a camera model.
The shooting indicator is used to indicate at least one of the shooting direction and the shooting coverage of the remote shooting prop.
Step 1302: in response to a shooting operation on the scattering type remote shooting prop, control the first virtual object to shoot the second virtual object using the scattering type remote shooting prop.
The shooting operation is used for controlling the first virtual object to shoot. Optionally, the shooting operation may be pressing one or more preset physical keys, or a signal generated by long-pressing, clicking, double-clicking, and/or sliding on a designated area of the touch screen. In a VR game, the shooting operation may be the user performing a particular action or inputting a particular voice command to trigger shooting.
The second virtual object is a virtual object in the virtual environment other than the first virtual object. The second virtual object may be in the same camp as the first virtual object, or in a different camp from the first virtual object.
Step 1303: determine whether the virtual ammunition ejected by the scattering type remote shooting prop hits the first part of the second virtual object.

If the virtual ammunition ejected by the scattering type remote shooting prop hits the first part of the second virtual object, execute step 1304;

if the virtual ammunition ejected by the scattering type remote shooting prop misses the first part of the second virtual object, execute step 1305.
Illustratively, the first location is a head of the second virtual object, and the second location is a location of the second virtual object other than the head.
Step 1304: and displaying a feedback point of a third form in the hit feedback control.
And the feedback points displayed in the hit feedback control correspond to the virtual ammunition ejected by the scattering type remote shooting prop one by one.
The form of the feedback point includes at least one of color, shape, and transparency.
Step 1305: and displaying a feedback point of a fourth form in the hit feedback control.
The third form is different from the fourth form. Illustratively, after virtual ammunition ejected by the scattering type remote shooting prop hits the head of the second virtual object, a red feedback point is displayed in the hit feedback control. After virtual ammunition ejected by the scattering type remote shooting prop hits a second part of the second virtual object, a black feedback point is displayed in the hit feedback control.
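The part-to-form mapping described above can be sketched as follows; this is a hypothetical Python illustration, and the function name and dictionary encoding are assumptions that merely mirror the red/black example in the text.

```python
# Hypothetical sketch of choosing a feedback-point form per virtual ammunition.
# "head" is the first part; any other part is treated as a second part.
def feedback_form(hit_part):
    """Return the display form of a feedback point for one virtual ammunition."""
    if hit_part == "head":
        return {"color": "red"}    # third form: hit the first part (head)
    return {"color": "black"}      # fourth form: hit a second part
```

For example, a pellet that hits the head yields a red feedback point, while a pellet that hits the torso yields a black one.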
In summary, in the embodiment, when the scattering-type remote shooting prop hits the first portion of the second virtual object, the feedback point of the third form is displayed, and when the scattering-type remote shooting prop misses the first portion of the second virtual object, the feedback point of the fourth form is displayed. And the user can conveniently determine the position of the scattering type remote shooting prop hitting the second virtual object according to the form of the feedback point.
In the following embodiments, a method of implementing a scattering-type remote shooting prop is provided. The method enables the scattering type remote shooting prop to shoot at least two virtual ammunitions in a single shooting.
Fig. 14 is a flow chart illustrating a method for determining the shooting direction of a remote shooting prop according to an embodiment of the present disclosure. The method may be performed by the first terminal 120 or the second terminal 160 or the server cluster 140 shown in fig. 1, the method comprising the steps of:
step 1401: and determining the inner circle and the outer circle corresponding to the remote shooting prop.
The inner circle and the outer circle are located in a plane perpendicular to the shooting direction, and an ejection point of the remote shooting prop is located on the plane. Illustratively, as shown in FIG. 15, the inner and outer circles lie in a plane 1502 perpendicular to the shooting direction 1504 of the remote shooting prop 1501, and an ejection point 1503 of the remote shooting prop 1501 is located on the plane 1502.
Step 1402: a first shot location is randomly generated within the inner circle.
Illustratively, a position is randomly generated within the inner circle as follows: a random angle is generated, for example by Random.Range(0, 360), and a random distance not exceeding the inner circle radius is generated; the first ejection position within the inner circle is then obtained from the random angle and the random distance.
For example, as shown in fig. 16, a first ejection position 1603, a first ejection position 1604, and a first ejection position 1605 are randomly generated within an inner circle 1601.
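The random generation of a first ejection position within the inner circle can be sketched in Python as follows; the function name is an assumption, and the sketch follows the "random angle plus random distance" step above.

```python
import math
import random

def random_point_in_inner_circle(radius):
    """First ejection position: a random angle plus a random distance from the
    ejection point, as in the step above. Drawing the distance uniformly
    clusters points toward the center; scaling the radius by the square root
    of a uniform draw would instead be uniform over the disc's area."""
    angle = math.radians(random.uniform(0.0, 360.0))
    r = random.uniform(0.0, radius)
    return (r * math.cos(angle), r * math.sin(angle))
```

Every position returned lies within the inner circle, so pellets aimed through it stay inside the tighter spread region.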
Step 1403: a second ejection position is randomly generated in a circular ring formed by the inner circle and the outer circle.
Illustratively, the second ejection position is randomly generated in the ring formed by the inner circle and the outer circle, and the algorithm is as follows. First, a random distance is calculated, where the random distance refers to the distance from the ejection position to the ejection point, specifically: distance = Random.Range(m_MinShotSpreadSize, m_MaxShotSpreadSize), where Random.Range(m_MinShotSpreadSize, m_MaxShotSpreadSize) represents taking a random value between m_MinShotSpreadSize and m_MaxShotSpreadSize, m_MinShotSpreadSize is the inner circle radius, which means that the random distance is greater than the inner circle radius, and m_MaxShotSpreadSize is the outer circle radius, which means that the random distance does not exceed the outer circle radius. Second, a random angle is calculated, specifically: a random start angle is set, StartAngle = Random.Range(0, 360), where Random.Range(0, 360) indicates that the start angle is a random value between 0 and 360 degrees; therefore, the start angle ranges from 0 to 360 degrees. Then, a specific random angle is calculated: angle = Random.Range(0, m_AngleIncrement) + StartAngle, where m_AngleIncrement is a pre-configured angle value. Therefore, the distance from the second ejection position to the ejection point is random.
In an alternative embodiment, for convenience of calculation, an ejection base angle a is randomly determined, and the angle of the second ejection position is set to a + k × 360/n, where n is a positive integer and k is an integer greater than 0 and less than n + 1. For example, if n is 6 and the ejection base angle is a random value 78, the angle of one second ejection position is 78 + 60 = 138, and the angle of another second ejection position is 78 + 120 = 198.
Illustratively, as shown in fig. 16, a second ejection position 1606, a second ejection position 1607, a second ejection position 1608, and a second ejection position 1609 are randomly generated within the ring formed by the inner circle 1601 and the outer circle 1602.
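The two-step annulus sampling above (a random distance between the two radii, then a random angle), together with the evenly-spread variant using a + k × 360/n, can be sketched in Python; the function and parameter names are assumptions.

```python
import math
import random

def random_point_in_ring(inner_radius, outer_radius):
    """Second ejection position: a random distance between the inner and outer
    radii (the Random.Range(m_MinShotSpreadSize, m_MaxShotSpreadSize) step),
    then a random angle, converted to Cartesian coordinates."""
    r = random.uniform(inner_radius, outer_radius)
    angle = math.radians(random.uniform(0.0, 360.0))
    return (r * math.cos(angle), r * math.sin(angle))

def evenly_spread_angles(base_angle, n):
    """Alternative variant: from a random base angle a, place positions at
    a + k * 360 / n for k = 1 .. n (angles wrapped into [0, 360))."""
    return [(base_angle + k * 360.0 / n) % 360.0 for k in range(1, n + 1)]
```

With base angle 78 and n = 6, the first two spread angles are 138 and 198, matching the worked example in the text.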
Step 1404: and taking a first direction from the preset point of the remote shooting prop to the first ejection position and a second direction from the preset point of the remote shooting prop to the second ejection position as the ejection directions of the virtual ammunition ejected by the remote shooting prop.
Illustratively, as shown in FIG. 15, a first direction 1506 from a preset point 1505 of the remote shooting prop 1501 to the first ejection position is taken, and a second direction 1507 from the preset point 1505 of the remote shooting prop 1501 to the second ejection position is taken.
For example, the procedure for determining the ejection direction is as follows:

float rad = angle * (Mathf.PI * 2 / 360); // convert the angle of the ejection position to radians
float x = r * Mathf.Cos(rad); // coordinate of the ejection position in the x direction
float y = r * Mathf.Sin(rad); // coordinate of the ejection position in the y direction
m_ShotSpeard = new Vector2(x, y); // calculate the ejection direction
In summary, the present embodiment provides a method for calculating an ejection direction of a remote-shooting prop, which enables the remote-shooting prop to eject at least two virtual ammunitions at a single shooting.
Fig. 17 is a flowchart illustrating an information display method according to an embodiment of the present application. The method may be performed by the first terminal 120 or the second terminal 160 shown in fig. 1, the method comprising the steps of:
step 1701: the process begins.
Step 1702: the first virtual object is controlled to equip the shotgun.
A shotgun ejects at least two virtual ammunitions in a single shot.
Step 1703: it is determined whether the second virtual object is hit.
If yes, go to step 1704;
if the second virtual object is not hit, go back to step 1702.
Step 1704: and judging whether the second virtual object is in an invincible state.
If the second virtual object is in the invincibility state, go to step 1706;
if the second virtual object is not in the invincibility state, step 1707 is executed.
The invincible state is used to indicate that damage to the second virtual object is invalid. Illustratively, when the second virtual object is attacked and the second virtual object is not in the invincible state, the second virtual object is damaged. When the second virtual object is attacked and the second virtual object is in the invincible state, the damage to the second virtual object is reduced to zero.
Optionally, the second virtual object is in the invincible state after revival. Or, after the second virtual object uses an invincibility prop, the second virtual object is in the invincible state. Alternatively, after the second virtual object uses an invincibility skill, the second virtual object is in the invincible state.
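The invincible-state rule above can be sketched minimally as follows; the dictionary-based target representation and field names are assumptions, not from the embodiment.

```python
# Hypothetical sketch: damage dealt to a target in the invincible state is
# reduced to zero; otherwise it is subtracted from the target's life value.
def apply_damage(target, damage):
    """Apply damage to a target unless it is invincible."""
    if target.get("invincible", False):
        return target  # damage is invalid while invincible
    target["hp"] = max(0, target["hp"] - damage)
    return target
```

An invincible target keeps its full life value after an attack, while a normal target loses the damage amount, never dropping below zero.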
Step 1705: a hit location for the second virtual object is determined.
The hit location of the second virtual object includes at least one of a head, an arm, a leg, and a torso.
Step 1706: and displaying the feedback identification in the hit feedback control.
In another optional implementation of the present application, in response to the remote shooting prop hitting the second virtual object and the second virtual object being in the invincible state, a first feedback identifier is displayed within the hit feedback control; and in response to the remote shooting prop hitting the second virtual object and the second virtual object being in a non-invincible state, a second feedback identifier is displayed within the hit feedback control. The first feedback identifier and the second feedback identifier are identifiers with different forms. Illustratively, the first feedback identifier is a red identifier and the second feedback identifier is a green identifier.
Step 1707: it is determined whether the hit position is a head position of the second virtual object.
If the hit position is the head position of the second virtual object, go to step 1708;
if the hit position is not the head position of the second virtual object, step 1709 is executed.
Step 1708: and displaying the feedback point of the third form in the hit feedback control.
The feedback point of the third form is used for representing that the virtual ammunition ejected by the scattering type remote shooting prop hits the head position of the second virtual object. Illustratively, the feedback point of the third modality is represented by red.
Step 1709: and displaying the feedback point of the fourth form in the hit feedback control.
The feedback point of the fourth form is used for representing that the virtual ammunition ejected by the scattering type remote shooting prop hits a position of the second virtual object other than the head position.
Step 1710: the damage caused by the shotgun is calculated.
Optionally, the damage caused by the shotgun is related to the number of virtual ammunitions hitting the second virtual object. For example, if one virtual ammunition hits the second virtual object, 80 points of damage are caused. If two virtual ammunitions hit the second virtual object, 160 points of damage are caused.
Optionally, the damage caused by the shotgun is related to the location hit on the second virtual object. For example, if a virtual ammunition hits the head location of the second virtual object, 160 points of damage are caused. If the virtual ammunition hits a body location of the second virtual object, 80 points of damage are caused.
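The two damage rules above can be combined into a small sketch: damage scales with the number of virtual ammunitions that hit, and head hits deal double damage. The 80/160 values mirror the examples in the text; the multiplier form and the names are assumptions.

```python
# Hypothetical sketch: total shot damage from per-pellet base damage, with a
# head-hit multiplier consistent with the 80/160 examples in the text.
def shotgun_damage(hit_parts, base_damage=80, head_multiplier=2.0):
    """Total damage of one shot, given the part hit by each pellet."""
    total = 0.0
    for part in hit_parts:
        total += base_damage * (head_multiplier if part == "head" else 1.0)
    return total
```

One head hit and two body hits therefore both yield 160 points, while a head hit plus a body hit yields 240.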
Step 1711: and judging whether the damage is larger than the life value of the second virtual object.
If the damage is greater than the life value of the second virtual object, step 1712 is executed.
Step 1712: and the second virtual object is paroxysmal.
Optionally, after the second virtual object dies, the second virtual object is revived after a preset time period. Or, after the second virtual object dies, the second virtual object is revived in response to the second virtual object using a revival prop. Alternatively, after the second virtual object dies, the second virtual object is revived in response to the second virtual object using a revival skill.
Step 1713: the flow ends.
In summary, this embodiment may control the first virtual object to use the remote shooting prop to shoot the second virtual object, and then display the shooting feedback information based on the shooting coverage area indicator on the virtual environment interface. This provides the user with a method of viewing the shooting feedback information: the shooting feedback information is displayed directly on the basis of the shooting coverage area indicator, which makes the shooting feedback information convenient for the user to view and improves the user's viewing efficiency.
The following are embodiments of the apparatus of the present application, and for details that are not described in detail in the embodiments of the apparatus, reference may be made to corresponding descriptions in the embodiments of the method described above, and details are not described herein again.
Fig. 18 is a schematic structural diagram of an information display device according to an exemplary embodiment of the present application. The apparatus 1800 may be implemented as all or part of a computer device in software, hardware, or a combination of both, and includes:
a display module 1801, configured to display a virtual environment screen of a first virtual object and a shooting indicator, where the first virtual object has a remote shooting prop, and the shooting indicator is configured to indicate at least one of a shooting direction and a shooting coverage of the remote shooting prop;
a control module 1802, configured to control, in response to a shooting operation of the remote shooting prop, the first virtual object to use the remote shooting prop to shoot a second virtual object;
the display module 1801 is further configured to display a hit feedback control based on the shooting indicator of the remote shooting prop on the virtual environment screen, where the hit feedback control is configured to indicate shooting feedback information for the second virtual object.
In an alternative embodiment of the present application, the firing indicator comprises a firing coverage area indicator or a sight bead; the display module 1801 is further configured to display the hit feedback control with a two-dimensional shape on the virtual environment screen by using a shooting coverage area indicator of the remote shooting prop as a reference, where the shooting coverage area indicator indicates a coverage area of virtual ammunition ejected by the remote shooting prop in the virtual environment; or, on the virtual environment picture, the hit feedback control with a two-dimensional shape is displayed with the sight of the remote shooting prop as a reference.
In an optional embodiment of the present application, the remote shooting props are scattering type remote shooting props; the display module 1801 is further configured to display the hit feedback control with a two-dimensional shape on the virtual environment screen by using a shooting coverage area indicator of the scattering-type remote shooting prop as a reference, where at least one of a first-form feedback point and a second-form feedback point is displayed in the hit feedback control; the feedback points displayed in the hit feedback control correspond to virtual ammunition ejected by the scattering type remote shooting prop one by one, the feedback points in the first form are used for representing the virtual ammunition ejected by the scattering type remote shooting prop and hitting the second virtual object, the feedback points in the second form are used for representing the virtual ammunition ejected by the scattering type remote shooting prop and not hitting the second virtual object, and the first form and the second form are different.
In an optional embodiment of the present application, the remote shooting props are scattering type remote shooting props; the display module 1801 is further configured to, in response to the scattering-type remote shooting prop hitting the second virtual object, display, on the virtual environment screen, a hitting feedback control based on a shooting indicator of the scattering-type remote shooting prop, where at least one of a feedback point in a third form and a feedback point in a fourth form is displayed in the hitting feedback control; the feedback points displayed in the hit feedback control correspond to virtual ammunition ejected by the scattering type remote shooting prop one by one, the feedback point in the third form is used for showing that the virtual ammunition ejected by the scattering type remote shooting prop hits a first part of the second virtual object, the feedback point in the fourth form is used for showing that the virtual ammunition ejected by the scattering type remote shooting prop hits a second part of the second virtual object, the third form and the fourth form are different, and the first part and the second part are different parts of the second virtual object.
In an optional embodiment of the present application, the remote shooting props are scattering type remote shooting props; the display module 1801 is further configured to, in response to that the scattering-type remote shooting prop hits the second virtual object, display, on the virtual environment screen, the hit feedback control having a two-dimensional shape with reference to a shooting coverage area indicator of the scattering-type remote shooting prop, where at least one of a feedback point in a fifth form and a feedback point in a sixth form is displayed in the hit feedback control; the feedback points displayed in the hit feedback control correspond to virtual ammunition ejected by the scattering type remote shooting prop one by one, the feedback point in the fifth form is used for indicating that the virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object at a first time, the feedback point in the sixth form is used for indicating that the virtual ammunition ejected by the scattering type remote shooting prop hits the second virtual object at a second time, the fifth form and the sixth form are different, and the first time and the second time are different.
In an optional embodiment of the present application, the remote shooting props are scattering type remote shooting props; the display module 1801 is further configured to, in response to the scattering-type remote shooting prop hitting the second virtual object, display, on the virtual environment screen, the hitting feedback control having a two-dimensional shape with a shooting coverage area indicator of the scattering-type remote shooting prop as a reference, and display, in the hitting feedback control, at least one of a feedback point in a seventh form and a feedback point in an eighth form; the feedback points displayed in the hit feedback control correspond to virtual ammunition ejected by the scattering type remote shooting prop one by one, the feedback point in the seventh form is used for indicating that the virtual ammunition of the first type ejected by the scattering type remote shooting prop hits the second virtual object, the feedback point in the eighth form is used for indicating that the virtual ammunition of the second type ejected by the scattering type remote shooting prop hits the second virtual object, the seventh form and the eighth form are different, and the first type and the second type are different types.
In an optional implementation manner of the present application, the display module 1801 is further configured to display a feedback identifier in the hit feedback control in response to the remote shooting prop hitting the second virtual object and the second virtual object being in an invincible state, where the invincible state is used to indicate that the damage suffered by the second virtual object is invalid.
In an optional implementation manner of the present application, the display module 1801 is further configured to display, on the virtual environment screen, the hit feedback control with a two-dimensional shape by taking a shooting indicator of the remote shooting prop as a center; or, on the virtual environment picture, surrounding a shooting indicator of the remote shooting prop, displaying the hit feedback control with a two-dimensional shape; or, displaying the hit feedback control with a two-dimensional shape on the virtual environment picture along with a shooting indicator of the remote shooting prop.
In an optional embodiment of the present application, the apparatus further comprises a calculating module 1803;
the calculating module 1803 is configured to determine an inner circle and an outer circle corresponding to the remote shooting prop, where the inner circle and the outer circle are located in a plane perpendicular to the shooting direction, and an ejection point of the remote shooting prop is located on the plane; randomly generating a first ejection position in the inner circle; generating a second ejection position in a circular ring formed by the inner circle and the outer circle at any time; and taking a first direction from the ejection point of the remote shooting prop to the first ejection position and a second direction from the ejection point of the remote shooting prop to the second ejection position as ejection directions of virtual ammunition ejected by the remote shooting prop.
In an optional embodiment of the present application, the virtual environment comprises a virtual barrier, on which a crash box is arranged; the calculating module 1803 is further configured to launch a detection ray by using the muzzle position of the remote shooting prop as a ray starting point and using the launching direction of a virtual ammunition launched by the remote shooting prop as a ray direction; and responding to the intersection point generated by the detection ray and the collision box, and displaying a bullet hole special effect corresponding to the intersection point on the virtual barrier.
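The detection-ray test against a crash box can be illustrated with a standard ray versus axis-aligned box ("slab") intersection; this is a generic sketch under assumed names, not the embodiment's implementation.

```python
# Generic slab-method sketch: does a detection ray launched from the muzzle
# position, along the ejection direction, intersect an axis-aligned crash box?
# Returns the distance along the ray to the entry point, or None on a miss.
def ray_hits_aabb(origin, direction, box_min, box_max):
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return None          # parallel to this slab and outside it
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far or t_far < 0:
                return None          # slabs do not overlap, or box is behind
    return max(t_near, 0.0)
```

When a distance t is returned, the intersection point origin + t × direction is where the bullet hole special effect would be displayed on the virtual barrier.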
In summary, this embodiment controls the first virtual object to use the remote shooting prop to shoot the second virtual object, and then displays the shooting feedback information based on the shooting coverage area indicator on the virtual environment interface. The method for viewing the shooting feedback information is provided for the user, the shooting feedback information is displayed directly on the basis of the shooting coverage area indicator, the hit special effect is not displayed on the second virtual object in the virtual environment, and even if the second virtual object is far away from the first virtual object or the second virtual object avoids obstacles, the user controlling the first virtual object can still view the hit feedback information in the hit feedback control, so that the user can directly view the hit feedback information conveniently.
FIG. 19 is a block diagram illustrating a computer device, according to an example embodiment. The computer device 1900 includes a Central Processing Unit (CPU) 1901, a system Memory 1904 including a Random Access Memory (RAM) 1902 and a Read-Only Memory (ROM) 1903, and a system bus 1905 connecting the system Memory 1904 and the CPU 1901. The computer device 1900 also includes a basic Input/Output system (I/O system) 1906 for facilitating information transfer between devices within the computer device, and a mass storage device 1907 for storing an operating system 1913, application programs 1914, and other program modules 1915.
The basic input/output system 1906 includes a display 1908 for displaying information and an input device 1909, such as a mouse, keyboard, etc., for user input of information. Wherein the display 1908 and input device 1909 are coupled to the central processing unit 1901 through an input-output controller 1910 coupled to the system bus 1905. The basic input/output system 1906 may also include an input/output controller 1910 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, input-output controller 1910 also provides output to a display screen, a printer, or other type of output device.
The mass storage device 1907 is connected to the central processing unit 1901 through a mass storage controller (not shown) connected to the system bus 1905. The mass storage device 1907 and its associated computer device-readable media provide non-volatile storage for the computer device 1900. That is, the mass storage device 1907 may include a computer device-readable medium (not shown) such as a hard disk or Compact Disc-Only Memory (CD-ROM) drive.
Without loss of generality, the computer device readable media may comprise computer device storage media and communication media. Computer device storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer device readable instructions, data structures, program modules or other data. Computer device storage media includes RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), CD-ROM, Digital Versatile Disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer device storage media is not limited to the foregoing. The system memory 1904 and mass storage device 1907 described above may be collectively referred to as memory.
The computer device 1900 may also operate through a remote computer device connected via a network, such as the Internet, according to various embodiments of the present disclosure. That is, the computer device 1900 may connect to the network 1911 through the network interface unit 1912 connected to the system bus 1905, or may connect to other types of networks or remote computer device systems (not shown) using the network interface unit 1912.
The memory further includes one or more programs, the one or more programs are stored in the memory, and the central processor 1901 implements all or part of the steps of the information display method by executing the one or more programs.
In an exemplary embodiment, a computer readable storage medium is further provided, in which at least one instruction, at least one program, code set, or instruction set is stored, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the information display method provided by the above-described respective method embodiments.
The present application further provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the information display method provided by the above method embodiment.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the information display method provided in the above embodiment.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. An information display method, characterized in that the method comprises:
displaying a virtual environment picture of a first virtual object and a shooting indicator, wherein the first virtual object is provided with a remote shooting prop, and the shooting indicator is used for indicating at least one of a shooting direction and a shooting coverage range of the remote shooting prop;
responding to a shooting operation of the remote shooting prop, and controlling the first virtual object to shoot a second virtual object by using the remote shooting prop;
displaying a hit feedback control based on the shooting indicator on the virtual environment picture, wherein the hit feedback control is used for indicating shooting feedback information of the second virtual object.
2. The method of claim 1, wherein the firing indicator comprises a firing coverage area indicator or a sight bead;
the virtual environment picture, based on the shooting indicator display hit feedback control of the remote shooting prop, includes:
displaying the hit feedback control with a two-dimensional shape on the virtual environment picture by taking a shooting coverage area indicator of the remote shooting prop as a reference, wherein the shooting coverage area indicator is used for indicating the coverage area of virtual ammunition ejected by the remote shooting prop in the virtual environment;
or,
and displaying the hit feedback control with a two-dimensional shape on the virtual environment picture by taking the sight of the remote shooting prop as a reference.
3. The method of claim 2, wherein the remote-shooting props are scattering-type remote-shooting props;
the displaying the hit feedback control with a two-dimensional shape on the virtual environment picture by taking a shooting coverage area indicator of the remote shooting prop as a reference comprises:
displaying the hit feedback control with a two-dimensional shape on the virtual environment picture by taking a shooting coverage area indicator of the scattering type remote shooting prop as a reference, wherein at least one of a first form of feedback point and a second form of feedback point is displayed in the hit feedback control;
the feedback points displayed in the hit feedback control correspond to virtual ammunition ejected by the scattering type remote shooting prop one by one, the feedback points in the first form are used for representing the virtual ammunition ejected by the scattering type remote shooting prop and hitting the second virtual object, the feedback points in the second form are used for representing the virtual ammunition ejected by the scattering type remote shooting prop and not hitting the second virtual object, and the first form and the second form are different.
4. The method of claim 2, wherein the remote shooting prop is a scattering type remote shooting prop;
the displaying, on the virtual environment picture, a hit feedback control based on a shooting indicator of the remote shooting prop comprises:
in response to the scattering type remote shooting prop hitting the second virtual object, displaying the hit feedback control based on a shooting coverage area indicator of the scattering type remote shooting prop on the virtual environment picture, wherein at least one of a feedback point in a third form and a feedback point in a fourth form is displayed in the hit feedback control;
wherein the feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop, a feedback point in the third form indicates virtual ammunition ejected by the scattering type remote shooting prop that hit a first part of the second virtual object, a feedback point in the fourth form indicates virtual ammunition ejected by the scattering type remote shooting prop that hit a second part of the second virtual object, the third form and the fourth form are different, and the first part and the second part are different parts of the second virtual object.
5. The method of claim 2, wherein the remote shooting prop is a scattering type remote shooting prop;
the displaying the hit feedback control with a two-dimensional shape on the virtual environment picture with a shooting coverage area indicator of the remote shooting prop as a reference comprises:
in response to the scattering type remote shooting prop hitting the second virtual object, displaying the hit feedback control with a two-dimensional shape on the virtual environment picture with a shooting coverage area indicator of the scattering type remote shooting prop as a reference, wherein at least one of a feedback point in a fifth form and a feedback point in a sixth form is displayed in the hit feedback control;
wherein the feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop, a feedback point in the fifth form indicates virtual ammunition ejected by the scattering type remote shooting prop that hit the second virtual object at a first time, a feedback point in the sixth form indicates virtual ammunition ejected by the scattering type remote shooting prop that hit the second virtual object at a second time, the fifth form and the sixth form are different, and the first time and the second time are different.
6. The method of claim 2, wherein the remote shooting prop is a scattering type remote shooting prop;
the displaying the hit feedback control with a two-dimensional shape on the virtual environment picture with a shooting coverage area indicator of the remote shooting prop as a reference comprises:
in response to the scattering type remote shooting prop hitting the second virtual object, displaying the hit feedback control with a two-dimensional shape on the virtual environment picture with a shooting coverage area indicator of the scattering type remote shooting prop as a reference, wherein at least one of a feedback point in a seventh form and a feedback point in an eighth form is displayed in the hit feedback control;
wherein the feedback points displayed in the hit feedback control correspond one-to-one to the virtual ammunition ejected by the scattering type remote shooting prop, a feedback point in the seventh form indicates virtual ammunition of a first type ejected by the scattering type remote shooting prop that hit the second virtual object, a feedback point in the eighth form indicates virtual ammunition of a second type ejected by the scattering type remote shooting prop that hit the second virtual object, the seventh form and the eighth form are different, and the first type and the second type are different types.
7. The method of claim 2, further comprising:
and in response to the remote shooting prop hitting the second virtual object while the second virtual object is in an invincible state, displaying a feedback identifier in the hit feedback control, wherein the invincible state indicates that the second virtual object cannot be effectively damaged.
8. The method of any one of claims 2 to 7, wherein the displaying the hit feedback control with a two-dimensional shape on the virtual environment picture with a shooting indicator of the remote shooting prop as a reference comprises:
displaying the hit feedback control with a two-dimensional shape on the virtual environment picture with a shooting indicator of the remote shooting prop as the center;
or, displaying the hit feedback control with a two-dimensional shape on the virtual environment picture surrounding a shooting indicator of the remote shooting prop;
or, displaying the hit feedback control with a two-dimensional shape on the virtual environment picture following a shooting indicator of the remote shooting prop.
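The three placement modes in claim 8 (centered on, surrounding, or following the shooting indicator) can be sketched as a simple screen-position computation. This is a hypothetical illustration, not the patent's implementation; all names, sizes, and offsets are assumed.

```python
# Hypothetical sketch of claim 8's three placement modes for the 2D hit
# feedback control relative to the shooting indicator's screen position.
def place_feedback_control(indicator_xy, mode, control_size=(64, 64), offset=(40, -40)):
    """Return the top-left corner of the control's bounding box."""
    x, y = indicator_xy
    w, h = control_size
    if mode == "centered":      # control's center coincides with the indicator
        return (x - w / 2, y - h / 2)
    if mode == "surrounding":   # a larger box (2w x 2h) enclosing the indicator
        return (x - w, y - h)
    if mode == "following":     # control trails the indicator at a fixed offset
        return (x + offset[0], y + offset[1])
    raise ValueError(f"unknown mode: {mode}")

print(place_feedback_control((100, 100), "centered"))  # (68.0, 68.0)
```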
9. The method according to any one of claims 1 to 7, further comprising:
determining an inner circle and an outer circle corresponding to the remote shooting prop, wherein the inner circle and the outer circle lie in a plane perpendicular to the shooting direction, and an ejection point of the remote shooting prop lies on the plane;
randomly generating a first ejection position within the inner circle, and randomly generating a second ejection position within the annulus formed between the inner circle and the outer circle;
and taking a first direction from a preset point of the remote shooting prop to the first ejection position and a second direction from the preset point of the remote shooting prop to the second ejection position as ejection directions of the virtual ammunition ejected by the remote shooting prop.
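The sampling step in claim 9 can be sketched as follows: draw one point uniformly inside the inner circle and one inside the ring between the two circles, both in a plane perpendicular to the shooting direction. The uniform-area sampling method and all names here are assumptions for illustration, not taken from the patent.

```python
# Sketch of claim 9's scatter sampling: a first ejection position inside the
# inner circle, a second inside the annulus between inner and outer circles.
import math
import random

def sample_in_annulus(r_min: float, r_max: float) -> tuple[float, float]:
    """Uniform-area sample with radius in [r_min, r_max] (a disc if r_min == 0)."""
    # Taking sqrt of a uniform variable over r^2 keeps the density uniform in area.
    r = math.sqrt(random.uniform(r_min ** 2, r_max ** 2))
    theta = random.uniform(0.0, 2.0 * math.pi)
    return (r * math.cos(theta), r * math.sin(theta))

random.seed(7)
inner_radius, outer_radius = 1.0, 3.0
first_pos = sample_in_annulus(0.0, inner_radius)            # inside the inner circle
second_pos = sample_in_annulus(inner_radius, outer_radius)  # inside the ring

# Both radii fall in their claimed regions by construction.
print(math.hypot(*first_pos) <= inner_radius,
      inner_radius <= math.hypot(*second_pos) <= outer_radius)  # True True
```

The direction for each pellet would then be the vector from the prop's preset point to the sampled position, as recited in the claim.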
10. The method of any one of claims 1 to 7, wherein the virtual environment comprises a virtual barrier on which a collision box is disposed;
the method further comprises the following steps:
emitting a detection ray with the muzzle position of the remote shooting prop as the ray origin and the ejection direction of the virtual ammunition ejected by the remote shooting prop as the ray direction;
and in response to the detection ray intersecting the collision box at an intersection point, displaying a bullet hole special effect corresponding to the intersection point on the virtual barrier.
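The detection described in claim 10 is a standard ray-versus-box query: cast a ray from the muzzle along the ejection direction and, if it intersects the barrier's collision box, place the bullet-hole effect at the hit point. The slab-method test below is one common way to implement such a query; it is a sketch under that assumption, not necessarily the patent's implementation, and all names are illustrative.

```python
# Sketch of claim 10: ray from the muzzle along the ejection direction,
# tested against an axis-aligned collision box using the slab method.
def ray_vs_aabb(origin, direction, box_min, box_max):
    """Return the first hit point, or None if the ray misses the collision box."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:          # ray parallel to this slab's axis
            if o < lo or o > hi:   # origin outside the slab -> no intersection
                return None
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
        if t_near > t_far:         # slab intervals do not overlap -> miss
            return None
    return tuple(o + d * t_near for o, d in zip(origin, direction))

muzzle = (0.0, 1.5, 0.0)       # ray origin: muzzle position
shoot_dir = (0.0, 0.0, 1.0)    # ray direction: ejection direction
hit = ray_vs_aabb(muzzle, shoot_dir, (-1.0, 0.0, 5.0), (1.0, 3.0, 6.0))
print(hit)  # (0.0, 1.5, 5.0) -> draw the bullet-hole effect here
```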
11. An information display apparatus, characterized in that the apparatus comprises:
a display module, configured to display a virtual environment picture and a shooting indicator of a first virtual object, wherein the first virtual object is provided with a remote shooting prop, and the shooting indicator is used for indicating at least one of a shooting direction and a shooting coverage range of the remote shooting prop;
a control module, configured to control, in response to a shooting operation on the remote shooting prop, the first virtual object to shoot at a second virtual object using the remote shooting prop;
the display module being further configured to display, on the virtual environment picture, a hit feedback control based on a shooting indicator of the remote shooting prop, wherein the hit feedback control is used to indicate shooting feedback information for the second virtual object.
12. The apparatus of claim 11, wherein the firing indicator comprises a firing coverage area indicator or a sight;
the display module is further configured to display the hit feedback control with a two-dimensional shape on the virtual environment screen by using a shooting coverage area indicator of the remote shooting prop as a reference, where the shooting coverage area indicator is used to indicate a coverage area of virtual ammunition ejected by the remote shooting prop in the virtual environment; or, on the virtual environment picture, the hit feedback control with a two-dimensional shape is displayed with the sight of the remote shooting prop as a reference.
13. A computer device, characterized in that the computer device comprises: a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the information display method of any one of claims 1 to 10.
14. A computer-readable storage medium, in which at least one program code is stored, the program code being loaded and executed by a processor to implement the information display method according to any one of claims 1 to 10.
15. A computer program product comprising a computer program or instructions, characterized in that the computer program or instructions, when executed by a processor, implement the information display method of any one of claims 1 to 10.
CN202111662007.6A 2021-11-05 2021-12-31 Information display method, device, equipment and medium Pending CN114247140A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111305192 2021-11-05
CN2021113051923 2021-11-05

Publications (1)

Publication Number Publication Date
CN114247140A true CN114247140A (en) 2022-03-29

Family

ID=80798959

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111662007.6A Pending CN114247140A (en) 2021-11-05 2021-12-31 Information display method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN114247140A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024045025A1 (en) * 2022-08-31 2024-03-07 上海莉莉丝科技股份有限公司 Virtual weapon control method, electronic device, medium, and product


Similar Documents

Publication Publication Date Title
CN108654086B (en) Method, device and equipment for obtaining attack damage in virtual environment
WO2021213026A1 (en) Virtual object control method and apparatus, and device and storage medium
US20220379219A1 (en) Method and apparatus for controlling virtual object to restore attribute value, terminal, and storage medium
WO2022057624A1 (en) Method and apparatus for controlling virtual object to use virtual prop, and terminal and medium
CN112138384B (en) Using method, device, terminal and storage medium of virtual throwing prop
KR20230130080A (en) Methods, devices, devices, storage media, and program products for controlling summoned objects during virtual scenarios
CN110841290A (en) Processing method and device of virtual prop, storage medium and electronic device
CN111921198B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN110585716A (en) Virtual item control method, device, equipment and storage medium
CN110975283A (en) Processing method and device of virtual shooting prop, storage medium and electronic device
US20230330530A1 (en) Prop control method and apparatus in virtual scene, device, and storage medium
CN112891932A (en) Method, device, equipment and medium for controlling virtual character to move
CN111202983A (en) Method, device, equipment and storage medium for using props in virtual environment
JP2023541697A (en) Position acquisition method, device, electronic device, storage medium and computer program in virtual scene
WO2022105480A1 (en) Virtual object control method, device, terminal, storage medium, and program product
CN113769391B (en) Method, device, equipment and medium for acquiring skills in virtual environment
CN114247140A (en) Information display method, device, equipment and medium
CN114225393A (en) Game resource acquisition method, device, medium, device and program product
CN112121428B (en) Control method and device for virtual character object and storage medium
CN112107859A (en) Prop control method and device, storage medium and electronic equipment
US20230030619A1 (en) Method and apparatus for displaying aiming mark
CN111111165A (en) Control method and device of virtual prop, storage medium and electronic device
CN112774189B (en) Picture display method, device, terminal and storage medium
CN115645916A (en) Control method, device and product of virtual object group in virtual scene
CN112138392B (en) Virtual object control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination