CN113663329A - Shooting control method and device for virtual character, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113663329A
CN113663329A (application CN202111019626.3A)
Authority
CN
China
Prior art keywords
virtual
shooting
bullet
drop
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111019626.3A
Other languages
Chinese (zh)
Other versions
CN113663329B (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111019626.3A priority Critical patent/CN113663329B/en
Publication of CN113663329A publication Critical patent/CN113663329A/en
Application granted granted Critical
Publication of CN113663329B publication Critical patent/CN113663329B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5372 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a shooting control method and apparatus for a virtual character, an electronic device, a computer-readable storage medium, and a computer program product. The method includes: displaying, in a virtual scene, a virtual character and a virtual shooting prop held by the virtual character, where the virtual shooting prop can launch a plurality of virtual bullets simultaneously with each shot; in response to a first shooting trigger operation for the virtual shooting prop while the virtual character is in a stationary state, controlling the virtual shooting prop to launch a first number of virtual bullets to a first bullet drop area; and in response to a second shooting trigger operation for the virtual shooting prop while the virtual character is in a motion state, controlling the virtual shooting prop to launch a second number of virtual bullets to a second bullet drop area, where the second bullet drop area is smaller than the first bullet drop area. Through this application, diversified launch forms of virtual bullets can be realized when controlling the virtual shooting prop to shoot, improving the precision and efficiency of shooting control.

Description

Shooting control method and device for virtual character, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer human-computer interaction technologies, and in particular, to a method and an apparatus for controlling virtual character shooting, an electronic device, a computer-readable storage medium, and a computer program product.
Background
The human-computer interaction technology for virtual scenes based on graphics processing hardware can realize diversified interaction between virtual characters controlled by users or by artificial intelligence according to actual application requirements, and has wide practical value. For example, in virtual scenes such as military exercise simulations and games, a real battle process between virtual characters can be simulated.
Taking a game scene as an example, shooting games are competitive games deeply loved by users; they can help users release pressure and relax, and playing them can improve users' reaction speed and acuity.
However, in the solutions provided in the related art, the shooting control of the virtual shooting prop is relatively uniform and cannot adapt to the diversified states of the virtual character in the virtual scene. For example, when the virtual shooting prop is a shotgun, the related art uses the same launch form for the shotgun bullets in different operation modes (for example, whether the shot is fired while jumping or while standing), which affects the accuracy and efficiency of shooting control.
Disclosure of Invention
The embodiment of the application provides a virtual character shooting control method and device, electronic equipment, a computer readable storage medium and a computer program product, which can realize diversified launching forms of virtual bullets when controlling a virtual shooting prop to shoot, thereby improving the precision and efficiency of shooting control.
The technical scheme of the embodiment of the application is realized as follows:
An embodiment of the present application provides a shooting control method for a virtual character, including the following steps:
displaying a virtual character and a virtual shooting prop held by the virtual character in a virtual scene, wherein the virtual shooting prop can simultaneously launch a plurality of virtual bullets at each shooting;
in response to a first firing trigger operation for the virtual firing prop while the virtual character is in a stationary state, controlling the virtual firing prop to fire a first number of virtual bullets to a first bullet drop zone;
in response to a second firing trigger operation for the virtual firing prop when the virtual character is in a motion state, controlling the virtual firing prop to fire a second number of virtual bullets to a second drop zone, wherein the second drop zone is smaller than the first drop zone.
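The state-dependent launch form described in the three steps above can be sketched in a few lines. All names and the two concrete spread values below are illustrative assumptions, not taken from the patent; the only property the sketch preserves is that the drop area used in the motion state is smaller than the one used in the stationary state, with drop points scattered uniformly inside it.

```python
import math
import random

# Assumed scattering distances; the patent only requires the second to be smaller.
FIRST_SCATTERING_DISTANCE = 2.0    # stationary state: larger bullet drop area
SECOND_SCATTERING_DISTANCE = 1.0   # motion state: smaller bullet drop area

def launch_virtual_bullets(collision_point, in_motion, bullet_count, rng=random):
    """Scatter bullet_count drop points uniformly over a disc centred on the
    shooting collision position; the disc radius depends on whether the
    virtual character is in a motion state or a stationary state."""
    radius = SECOND_SCATTERING_DISTANCE if in_motion else FIRST_SCATTERING_DISTANCE
    points = []
    for _ in range(bullet_count):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        r = radius * math.sqrt(rng.uniform(0.0, 1.0))  # uniform over the disc
        points.append((collision_point[0] + r * math.cos(angle),
                       collision_point[1] + r * math.sin(angle)))
    return points
```

A usage note: a game loop would call this once per trigger operation, passing the character's current movement flag; the uniform-disc sampling is one simple way to realize "randomly distributed in the drop area".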
An embodiment of the present application provides a shooting control device for a virtual character, including:
a display module, configured to display a virtual character and a virtual shooting prop held by the virtual character in a virtual scene, where the virtual shooting prop can launch a plurality of virtual bullets simultaneously with each shot;
the control module is used for responding to a first shooting trigger operation aiming at the virtual shooting prop when the virtual character is in a static state, and controlling the virtual shooting prop to launch a first number of virtual bullets to a first bullet drop area;
the control module is further configured to control the virtual shooting prop to launch a second number of virtual bullets to a second bullet drop area in response to a second shooting trigger operation for the virtual shooting prop when the virtual character is in a motion state, where the second bullet drop area is smaller than the first bullet drop area.
In the above scheme, the device further includes a determining module, configured to determine a first bullet drop region based on a first shooting collision position and a first scattering distance, where the first scattering distance is a scattering distance corresponding to the virtual character in a stationary state; the control module is further configured to control the virtual shooting prop to perform at least one first shooting to the first bullet drop area to launch a first number of virtual bullets, and control the first number of virtual bullets to be randomly distributed in the first number of bullet drop points in the first bullet drop area.
In the above scheme, the device further includes a generating module, configured to generate a first detection ray extending in the shooting direction of the at least one first shot, with the launch opening of the virtual shooting prop as a starting point; the determining module is further configured to determine the first bullet drop area based on the first scattering distance, using the first shooting collision position between the first detection ray and a first virtual obstacle in the virtual scene as a reference point.
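The detection-ray step above can be sketched with a minimal ray-plane intersection. A real engine would query its physics scene for the first virtual obstacle hit by the ray; the single vertical wall plane below is an assumption made only to keep the example self-contained.

```python
def raycast_to_plane(origin, direction, plane_x):
    """Intersect a ray (from the launch opening, along the shooting direction)
    with a vertical obstacle plane x = plane_x.

    Returns the shooting collision position used as the drop-area reference
    point, or None when the ray is parallel to the plane or the plane lies
    behind the launch opening.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dx) < 1e-12:
        return None                  # ray parallel to the obstacle plane
    t = (plane_x - ox) / dx
    if t < 0:
        return None                  # obstacle is behind the launch opening
    return (plane_x, oy + t * dy, oz + t * dz)
```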
In the foregoing solution, the control module is further configured to execute the following processing for each dummy bullet: and randomly generating a bullet drop point in the first bullet drop area to serve as the corresponding bullet drop point of the virtual bullet, and controlling the virtual bullet to hit the bullet drop point.
In the above scheme, the determining module is further configured to determine a second bullet drop area based on a second shooting collision position and a second scattering distance, where the second scattering distance is smaller than a first scattering distance corresponding to the virtual character in a stationary state; the control module is further configured to control the virtual shooting prop to perform at least one second shooting to the second bullet drop area to launch a second number of virtual bullets, and control the second number of virtual bullets to be randomly distributed in a second number of bullet drop points in the second bullet drop area.
In the above scheme, the device further includes a detection module, configured to detect the action included in the motion state; the determining module is further configured to determine to switch to the operation of controlling the virtual shooting prop to perform at least one second shooting to the second bullet drop area when the motion state includes a single action of one type, or a plurality of single actions of different types performed in succession.
In the above scheme, the generating module is further configured to generate a second detection ray extending along the shooting direction of the at least one second shot, with the launch opening of the virtual shooting prop as a starting point; the determining module is further configured to determine the second bullet drop area based on the second scattering distance, using the second shooting collision position between the second detection ray and a second virtual obstacle in the virtual scene as a reference point.
In the foregoing solution, the determining module is further configured to determine the second scattering distance by any one of the following manners: subtracting a fixed adjustment amplitude from the first scattering distance to obtain a second scattering distance; and dividing the first scattering distance by a fixed adjustment multiple to obtain the second scattering distance, wherein the adjustment multiple is greater than 1.
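The two fixed-adjustment alternatives just listed are plain arithmetic. The concrete delta and factor values below are assumed for illustration; the patent only requires that the adjustment multiple be greater than 1.

```python
FIXED_DELTA = 0.5    # fixed adjustment amplitude (assumed value)
FIXED_FACTOR = 2.0   # fixed adjustment multiple; must be greater than 1

def second_spread_by_delta(first_spread):
    # Second scattering distance = first scattering distance - fixed amplitude
    return max(first_spread - FIXED_DELTA, 0.0)

def second_spread_by_factor(first_spread):
    # Second scattering distance = first scattering distance / fixed multiple
    return first_spread / FIXED_FACTOR
```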
In the foregoing solution, the determining module is further configured to determine the second scattering distance by: detecting motion parameters of a motion included in the motion state; determining an adjustment coefficient negatively correlated with the action parameter, and correspondingly obtaining a dynamic adjustment amplitude or a dynamic adjustment multiple according to the product of the adjustment coefficient and the fixed adjustment amplitude or the fixed adjustment multiple; and adjusting the first scattering distance according to the dynamic adjustment amplitude or the dynamic adjustment multiple to obtain the second scattering distance.
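One possible reading of the dynamic variant above, using an assumed negative-correlation function coeff = 1 / (1 + action_param); the patent does not fix a specific functional form, only that the coefficient is negatively correlated with the action parameter.

```python
def dynamic_second_spread(first_spread, action_param, fixed_delta=0.5):
    """Second scattering distance with a dynamically scaled adjustment.

    The adjustment coefficient is negatively correlated with the action
    parameter (e.g. movement speed): a larger parameter yields a smaller
    coefficient and hence a smaller dynamic adjustment amplitude.
    The 1 / (1 + x) form and the default fixed_delta are assumptions.
    """
    coeff = 1.0 / (1.0 + action_param)     # negatively correlated coefficient
    dynamic_delta = coeff * fixed_delta    # dynamic adjustment amplitude
    return max(first_spread - dynamic_delta, 0.0)
```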
In the foregoing solution, when the virtual shooting prop is in a repeating mode, the determining module is further configured to execute the following processing for each shooting stage in the repeating mode: updating a second scattering distance corresponding to the previous shooting stage or an average distance between adjacent bullet drop points; taking the updated second scattering distance as a second scattering distance corresponding to the current shooting stage, or taking the updated average distance between adjacent bullet drop points as the average distance between the adjacent bullet drop points corresponding to the current shooting stage; wherein each of the shot stages comprises at least one second shot.
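A per-stage update in the repeating-fire mode could look like the following; the multiplicative shrink factor is an assumed value, since the patent only requires that each shooting stage update the previous stage's second scattering distance (or the average distance between adjacent drop points).

```python
def spread_schedule(first_spread, stage_count, shrink=0.8):
    """Second scattering distance for each shooting stage in repeating mode.

    Each stage takes the previous stage's distance and updates it (here by
    an assumed multiplicative shrink); each stage comprises at least one
    second shot.
    """
    schedule = []
    spread = first_spread
    for _ in range(stage_count):
        spread *= shrink                  # update the previous stage's distance
        schedule.append(spread)
    return schedule
```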
In the above scheme, the control module is further configured to control the virtual shooting prop to perform at least one second shooting to a second bullet drop area to launch a second number of virtual bullets; and controlling the second number of virtual bullets to be randomly distributed in the second number of bullet-falling points in the second bullet-falling area, wherein the average distance between the adjacent bullet-falling points in the second bullet-falling area is smaller than the average distance between the adjacent bullet-falling points in the first bullet-falling area.
In the above scheme, the detection module is further configured to detect the action included in the motion state; the determining module is further configured to determine to switch to the operation of controlling the virtual shooting prop to perform at least one second shooting to the second bullet drop area when the motion state includes a plurality of single actions of different types performed simultaneously.
In the above scheme, the apparatus further includes a dividing module, configured to evenly divide the second bullet drop area into a second number of candidate regions; the generating module is further configured to randomly generate one drop point in each candidate region to obtain a second number of drop points; the control module is further configured to allocate the second number of drop points to the second number of virtual bullets respectively, and to control the second number of virtual bullets to hit their respectively allocated drop points in the second bullet drop area.
In the above scheme, the dividing module is further configured to divide the second bullet drop area into a first sub-area and a second sub-area, where the centers of the first sub-area and the second sub-area coincide and the first sub-area is smaller than the second sub-area; the generating module is further configured to randomly generate a third number of drop points in the first sub-area and allocate them to a third number of virtual bullets, where the third number is smaller than the second number; the control module is further configured to control the third number of virtual bullets to hit their allocated drop points in the second bullet drop area; the dividing module is further configured to evenly divide the second sub-area into a fourth number of candidate regions, where the fourth number is the difference between the second number and the third number; the generating module is further configured to randomly generate one drop point in each candidate region to obtain a fourth number of drop points; and the control module is further configured to allocate the fourth number of drop points to a fourth number of virtual bullets respectively, and to control the fourth number of virtual bullets to hit their respectively allocated drop points in the second bullet drop area.
In the above scheme, when the second bullet drop area is a circular area with a radius of R, the dividing module is further configured to divide the second bullet drop area, with its center as the common center, into a first sub-area with a radius of P and a second sub-area with an inner ring radius of S and an outer ring radius of R, where P is less than or equal to S, and S is less than R.
In the above scheme, the dividing module is further configured to evenly divide the second sub-area with the inner ring radius of S and the outer ring radius of R into a fourth number of candidate regions, where the angle range corresponding to each candidate region is 360°/(the fourth number); the generating module is further configured to perform the following processing for each candidate region: randomly generating an angle from the angle range corresponding to the candidate region and a radius between S and R, and taking the point in the candidate region corresponding to that angle and radius as the drop point corresponding to the candidate region.
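The annulus sampling just described (equal angular sectors of 360°/count, one random drop point per sector with a radius between S and R) can be sketched as follows; names are illustrative.

```python
import math
import random

def annulus_sector_drop_points(center, inner_s, outer_r, count, rng=random):
    """One random drop point per equal angular sector of the second sub-area
    (inner ring radius S, outer ring radius R).

    Each sector spans 360°/count; within its sector a point gets a random
    angle and a random radius between S and R.
    """
    cx, cy = center
    span = 2.0 * math.pi / count            # angle range of each candidate region
    points = []
    for i in range(count):
        angle = rng.uniform(i * span, (i + 1) * span)
        radius = rng.uniform(inner_s, outer_r)
        points.append((cx + radius * math.cos(angle),
                       cy + radius * math.sin(angle)))
    return points
```

Sampling one point per sector guarantees the drop points are spread around the whole ring rather than clustering, which matches the even-division requirement.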
An embodiment of the present application provides an electronic device, which includes:
a memory for storing executable instructions;
a processor, configured to implement the shooting control method for a virtual character provided in the embodiments of the present application when executing the executable instructions stored in the memory.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the shooting control method for a virtual character provided in the embodiments of the present application.
The embodiment of the present application provides a computer program product, which includes a computer program or instructions, and the computer program or instructions, when executed by a processor, implement the shooting control method for virtual characters provided in the embodiment of the present application.
The embodiment of the application has the following beneficial effects:
When the virtual character is in a stationary state and a shooting trigger operation for the virtual shooting prop is received, the virtual shooting prop is controlled to launch virtual bullets to a first bullet drop area with a larger range; when the virtual character is in a motion state and a shooting trigger operation for the virtual shooting prop is received, the virtual shooting prop is controlled to launch virtual bullets to a second bullet drop area with a smaller range. In this way, accurate shooting can adapt to the diversified states of the virtual character in the virtual scene, diversified launch forms are realized according to the different states of the virtual character, and shooting control efficiency is improved.
Drawings
FIG. 1A is a schematic diagram of an application mode of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 1B is a schematic diagram of an application mode of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a terminal device 400 according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 4A is a schematic flowchart of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 4B is a schematic flowchart of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 5A is a schematic diagram of a launch form of virtual bullets in a motion state according to an embodiment of the present application;
FIG. 5B is a schematic diagram of a launch form of virtual bullets in a motion state according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a launch form of shotgun bullets provided by the related art;
FIG. 7 is a schematic diagram of a launch form of shotgun bullets hitting a target, provided by the related art;
FIG. 8A is a schematic diagram of an application scenario of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 8B is a schematic diagram of an application scenario of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an application scenario of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an application scenario of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 11 is a schematic diagram of the principle of dividing a bullet drop area according to an embodiment of the present application;
FIG. 12 is a schematic diagram of the launch form of shotgun bullets when fired during a sliding action, according to an embodiment of the present application;
FIG. 13 is a schematic flowchart of a shooting control method for a virtual character according to an embodiment of the present application;
FIG. 14 is a schematic diagram of the principle of determining the launch direction of a virtual bullet according to an embodiment of the present application;
FIG. 15 is a schematic diagram of the principle of determining a bullet-hole special effect according to an embodiment of the present application;
FIG. 16 is a schematic diagram of a material candidate interface according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, the terms "first", "second", and the like are used only to distinguish similar objects and do not denote a particular order or importance. Where permissible, "first", "second", and the like may be interchanged in a specific order or sequence, so that the embodiments of the present application described herein can be implemented in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) In response to: indicates the condition or state on which a performed operation depends. When the dependent condition or state is satisfied, the one or more operations may be performed in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which the operations are performed.
2) Client: an application program running on a terminal device to provide various services, such as a video playback client or a game client.
3) The virtual scene is a virtual scene displayed (or provided) when an application program runs on the terminal device. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, the virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as desert, city, etc., and the user may control the virtual character to move in the virtual scene.
4) Virtual character: the image of any person or object that can interact in the virtual scene, or a movable object in the virtual scene. The movable object may be a virtual person, a virtual animal, or an animated character, such as a person or animal displayed in the virtual scene. A virtual character may be a virtual avatar representing the user in the virtual scene. A virtual scene may include a plurality of virtual characters, each of which has its own shape and volume in the virtual scene and occupies part of the space of the virtual scene.
For example, virtual characters may be rendered by 3D graphics modeling rendering techniques through a 3D game engine or Digital Content Creation (DCC) software, wherein the virtual character data may include character model data and character skeleton data.
5) Scene data representing characteristic data of the virtual scene, such as the area of a building area in the virtual scene, the current architectural style of the virtual scene, and the like; the position of the virtual building in the virtual scene, the floor space of the virtual building, and the like may also be included.
The embodiments of the present application provide a shooting control method and apparatus for a virtual character, an electronic device, a computer-readable storage medium, and a computer program product, which can realize diversified launch forms of virtual bullets when controlling a virtual shooting prop to shoot, thereby improving the precision and efficiency of shooting control. To make the method easier to understand, exemplary implementation scenarios are described first. The virtual scene in the shooting control method provided in the embodiments of the present application may be output entirely by a terminal device, or output cooperatively by a terminal device and a server.
In some embodiments, the virtual scene may be a picture presented in a military exercise simulation, and a user may simulate a tactic, a strategy or a tactics through virtual objects belonging to different teams in the virtual scene, so that the virtual scene has a great guiding effect on the command of military operations.
In other embodiments, the virtual scene may also be an environment for game characters to interact with, for example, game characters to play against in the virtual scene, and the two parties may interact with each other in the virtual scene by controlling actions of the game characters, so that the user may relieve life stress during the game.
In one implementation scenario, referring to FIG. 1A, FIG. 1A is a schematic diagram of an application mode of the shooting control method for a virtual character provided in an embodiment of the present application. This mode is applicable to applications in which the computation of data related to the virtual scene 100 can be completed entirely by the graphics processing hardware of the terminal device 400, such as a game in standalone/offline mode, with output of the virtual scene completed through various types of terminal devices 400 such as a smartphone, a tablet computer, or a virtual reality/augmented reality device.
As an example, types of Graphics Processing hardware include a Central Processing Unit (CPU) and a Graphics Processing Unit (GPU).
When forming the visual perception of the virtual scene 100, the terminal device 400 calculates the data required for display through the graphics computing hardware, completes loading, parsing, and rendering of the display data, and outputs, on the graphics output hardware, video frames capable of forming visual perception of the virtual scene; for example, two-dimensional video frames are displayed on the screen of a smartphone, or video frames realizing a three-dimensional display effect are projected onto the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perception effect, the terminal device 400 may also produce one or more of auditory, tactile, motion, and taste perception by means of different hardware.
As an example, the terminal device 400 runs a client 410 (e.g., a standalone version of a game application) and outputs a virtual scene including role playing while the client 410 runs. The virtual scene may be an environment for game character interaction, such as a plain, a street, or a valley for game character battle. Taking the display of the virtual scene 100 from the first-person perspective as an example, a virtual character 101 and a virtual shooting prop 102 (e.g., a virtual shotgun) held by the virtual character 101 through a holding part (e.g., a hand) are displayed in the virtual scene 100. The virtual character 101 may be a game character controlled by a user, that is, the virtual character 101 is controlled by a real user and moves in the virtual scene 100 in response to operations of the real user on a controller (e.g., a touch screen, a voice-controlled switch, a keyboard, a mouse, a joystick, etc.); for example, when the real user moves the joystick to the right, the virtual character 101 moves to the right in the virtual scene 100. The user may also keep the virtual character 101 stationary in place, make it jump, control it to perform a shooting operation, and the like.
For example, when the virtual character 101 is in a stationary state and a first shooting trigger operation for the virtual shooting prop 102 is received (e.g., when the virtual character 101 is in a standing state and a user's click operation on a shooting control displayed in the virtual scene 100 is received), the virtual shooting prop 102 is controlled to fire a first number of virtual bullets to the first bullet drop region 103; when the virtual character 101 is in a motion state and a second shooting trigger operation for the virtual shooting prop 102 is received (e.g., when the virtual character 101 is in a jumping state and a user's click operation on the shooting control is received), the virtual shooting prop 102 is controlled to fire a second number of virtual bullets to the second bullet drop region 104. As can be seen from fig. 1A, the second bullet drop region 104 is smaller than the first bullet drop region 103; that is, when the virtual character 101 is in a motion state and the virtual shooting prop 102 is controlled to shoot, the virtual bullets fall within a smaller range than in the stationary state, so that a target displayed in the virtual scene 100 (e.g., a virtual character in a formation opposing the virtual character 101) is easier to hit. Thus, when the virtual shooting prop 102 is controlled to shoot under different operation modes, the corresponding firing patterns of the virtual bullets differ, which adapts to the diversified states of virtual characters in the virtual scene, improves the accuracy and efficiency of shooting control, and improves the game experience of users of different skill levels.
In another implementation scenario, referring to fig. 1B, fig. 1B is a schematic diagram of an application mode of the virtual character shooting control method provided in this embodiment, which is applied to the terminal device 400 and the server 200. This mode relies on the computing capability of the server 200 to complete the virtual scene calculation, and outputs the virtual scene at the terminal device 400.
Taking the formation of the visual perception of the virtual scene 100 as an example, the server 200 calculates display data (e.g., scene data) related to the virtual scene and sends the calculated display data to the terminal device 400 through the network 300; the terminal device 400 relies on graphics computing hardware to complete the loading, parsing and rendering of the calculated display data, and relies on graphics output hardware to output the virtual scene to form the visual perception; for example, two-dimensional video frames may be presented on the display screen of a smartphone, or video frames realizing a three-dimensional display effect may be projected onto the lenses of augmented reality/virtual reality glasses. For other forms of perception of the virtual scene, it is understood that corresponding output hardware of the terminal device 400 may be used; for example, an auditory perception may be formed using a speaker, a tactile perception using a vibrator, and so on.
As an example, the terminal device 400 runs a client 410 (e.g., a network version of a game application) and performs game interaction with other users by connecting to the server 200 (e.g., a game server). The terminal device 400 outputs the virtual scene 100 of the client 410, displays the virtual scene 100 from the first-person perspective, and displays in it a virtual character 101 and a virtual shooting prop 102 (e.g., a virtual shotgun) held by the virtual character 101 through a holding part (e.g., a hand). The virtual character 101 may be a game character controlled by a user, that is, the virtual character 101 is controlled by a real user and moves in the virtual scene 100 in response to operations of the real user on a controller (e.g., a touch screen, a voice-controlled switch, a keyboard, a mouse, a joystick, etc.); for example, when the real user moves the joystick to the right, the virtual character 101 moves to the right in the virtual scene 100. The user may also keep the virtual character 101 stationary, make it jump, control it to perform shooting operations, and the like.
For example, when the virtual character 101 is in a stationary state and a first shooting trigger operation for the virtual shooting prop 102 is received (e.g., when the virtual character 101 is in a standing state and a user's click operation on a shooting control displayed in the virtual scene 100 is received), the virtual shooting prop 102 is controlled to fire a first number of virtual bullets to the first bullet drop region 103; when the virtual character 101 is in a motion state and a second shooting trigger operation for the virtual shooting prop 102 is received (e.g., when the virtual character 101 is in a jumping state and a user's click operation on the shooting control is received), the virtual shooting prop 102 is controlled to fire a second number of virtual bullets to the second bullet drop region 104. As can be seen from fig. 1A, the second bullet drop region 104 is smaller than the first bullet drop region 103; that is, when the virtual character 101 is in a motion state and the virtual shooting prop 102 is controlled to shoot, the virtual bullets fall within a smaller range than in the stationary state, so that a target displayed in the virtual scene 100 (e.g., a virtual character in a formation opposing the virtual character 101) is easier to hit. Thus, when the virtual shooting prop 102 is controlled to shoot under different operation modes, the corresponding firing patterns of the virtual bullets differ, which adapts to the diversified states of virtual characters in the virtual scene, improves the accuracy and efficiency of shooting control, and improves the game experience of users of different skill levels.
In some embodiments, the terminal device 400 may implement the shooting control method for virtual characters provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; a Native APPlication (APP), i.e., a program that needs to be installed in the operating system to run, such as a shooting game APP (i.e., the client 410 described above); an applet, i.e., a program that runs after merely being downloaded into a browser environment; or a game applet that can be embedded in any APP. In general, the computer program described above may be any form of application, module or plug-in.
Taking the computer program being an application program as an example, in actual implementation, an application program supporting virtual scenes is installed and runs on the terminal device 400. The application program may be any one of a First-Person Shooting game (FPS), a third-person shooting game, a virtual reality application program, a three-dimensional map program, a military simulation program, or a multi-player gun-battle survival game. The user uses the terminal device 400 to operate the virtual character located in the virtual scene to perform activities, which include but are not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up items, shooting, attacking, throwing, and building a virtual building. Illustratively, the virtual character may be a virtual figure, such as a simulated human figure or an animated figure.
In other embodiments, the embodiments of the present application may also be implemented by Cloud Technology, which refers to a hosting technology for unifying resources of hardware, software, network, and the like in a wide area network or a local area network to implement the calculation, storage, processing and sharing of data.
Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology and the like applied on the basis of the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology is an important support for such systems, since the background services of a technical network system require a large amount of computing and storage resources.
For example, the server 200 in fig. 1B may be an independent physical server, may also be a server cluster or a distributed system formed by a plurality of physical servers, and may also be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a CDN, and a big data and artificial intelligence platform. The terminal device 400 may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal device 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, and the embodiment of the present application is not limited thereto.
The structure of the terminal apparatus 400 shown in fig. 1A is explained below. Referring to fig. 2, fig. 2 is a schematic structural diagram of a terminal device 400 provided in an embodiment of the present application, where the terminal device 400 shown in fig. 2 includes: at least one processor 420, memory 460, at least one network interface 430, and a user interface 440. The various components in the terminal device 400 are coupled together by a bus system 450. It is understood that the bus system 450 is used to enable connected communication between these components. The bus system 450 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 450 in fig. 2.
The Processor 420 may be an integrated circuit chip having Signal processing capabilities, such as a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like, wherein the general purpose Processor may be a microprocessor or any conventional Processor, or the like.
The user interface 440 includes one or more output devices 441, including one or more speakers and/or one or more visual display screens, that enable the presentation of media content. The user interface 440 also includes one or more input devices 442 including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display screen, camera, other input buttons and controls.
The memory 460 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 460 may optionally include one or more storage devices physically located remote from processor 420.
The memory 460 may include volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 460 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 460 may be capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
an operating system 461, comprising system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc.;
a network communication module 462 for reaching other computing devices via one or more (wired or wireless) network interfaces 430, exemplary network interfaces 430 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), etc.;
A presentation module 463 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices 441 (e.g., display screens, speakers, etc.) associated with user interface 440;
an input processing module 464 for detecting one or more user inputs or interactions from one of the one or more input devices 442 and translating the detected inputs or interactions.
In some embodiments, the shooting control device of the virtual character provided in the embodiments of the present application may be implemented in software. Fig. 2 shows the shooting control device 465 of the virtual character stored in the memory 460, which may be software in the form of programs, plug-ins, and the like, and includes the following software modules: a display module 4651, a control module 4652, a determination module 4653, a generation module 4654, a detection module 4655 and a division module 4656. These modules are logical and thus may be arbitrarily combined or further divided according to the functions implemented. It is noted that all of the above modules are shown at once in fig. 2 for convenience of presentation, but this should not be construed as excluding implementations in which the shooting control device 465 of the virtual character includes only the display module 4651 and the control module 4652; the functions of each module will be described below.
In other embodiments, the shooting control Device of the virtual character provided in this embodiment may be implemented in hardware, and as an example, the shooting control Device of the virtual character provided in this embodiment may be a processor in the form of a hardware decoding processor, which is programmed to execute the shooting control method of the virtual character provided in this embodiment, for example, the processor in the form of the hardware decoding processor may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
The shooting control method for the virtual character provided in the embodiment of the present application will be specifically described below with reference to the accompanying drawings. The method for controlling shooting of a virtual character according to the embodiment of the present application may be executed by the terminal device 400 in fig. 1A alone, or may be executed by the terminal device 400 and the server 200 in fig. 1B in cooperation.
Next, a shooting control method for a virtual character provided in the embodiment of the present application is described as an example, in which the terminal device 400 in fig. 1A alone executes the shooting control method. Referring to fig. 3, fig. 3 is a flowchart illustrating a method for controlling shooting of a virtual character according to an embodiment of the present application, and will be described with reference to the steps illustrated in fig. 3.
It should be noted that the method shown in fig. 3 can be executed by various forms of computer programs running on the terminal device 400, and is not limited to the client 410 described above, but may also be the operating system 461, software modules and scripts described above, so that the client should not be considered as limiting the embodiments of the present application.
In step S101, a virtual character and a virtual shooting prop held by the virtual character are displayed in a virtual scene.
Here, the virtual shooting prop is capable of firing multiple virtual bullets simultaneously with each shot. For example, the virtual shooting prop may be a virtual shotgun that typically includes 8 muzzles firing simultaneously, i.e., 8 bullets are fired at the same time with each shot. Of course, the virtual shooting prop may also be another virtual weapon capable of firing multiple virtual bullets simultaneously, which is not limited in the embodiments of the present application.
In some embodiments, a client supporting a virtual scene is installed on the terminal device (for example, when the virtual scene is a game, the corresponding client may be a shooting game APP), and when the client installed on the terminal device is opened by a user (for example, the user clicks an icon corresponding to the shooting game APP presented on a user interface of the terminal device), and the terminal device runs the client, a virtual character (for example, a virtual character a controlled by the user) and a virtual shooting prop (for example, a virtual shotgun) held by the virtual character through a holding part (for example, a hand) may be displayed in a virtual scene presented on a human-computer interaction interface of the client.
In other embodiments, the displaying of the virtual character and the virtual shooting prop held by the virtual character in the virtual scene may be implemented as follows: in response to the virtual shooting prop selection operation, a virtual character and a selected target virtual shooting prop (e.g., a virtual shotgun) held by the virtual character through the holding portion are displayed in the virtual scene.
For example, taking a virtual scene as an example of a game, various virtual weapons are provided in the game for a user to select, including a virtual heavy machine gun, a virtual shotgun, a virtual sniper gun, and the like, and for each virtual weapon, a corresponding icon is displayed in a game screen. When the user clicks the icon corresponding to the virtual shotgun displayed in the game screen, the game screen in which the game character controlled by the user holds the virtual shotgun by hand is displayed.
In some embodiments, the virtual scene may be displayed from the first-person perspective in the human-machine interaction interface of the client (e.g., the user plays the game from the perspective of the virtual character itself); the virtual scene may also be displayed from a third-person perspective (e.g., the user follows a virtual character in the game to play the game); the virtual scene may also be displayed from a bird's-eye view; and the above different perspectives can be switched arbitrarily.
As an example, the virtual character may be an object controlled by a current user in a game or military simulation, although other virtual characters may also be included in the virtual scene, such as virtual characters that may be controlled by other users or by a robot program. The virtual roles may be divided into any one of a plurality of teams, which may be in a hostile or cooperative relationship, and the teams in the virtual scene may include one or all of the above relationships.
Taking the display of the virtual scene from the first-person perspective as an example, displaying the virtual scene in the human-computer interaction interface may include: determining the field-of-view area of the virtual character according to the viewing position and field angle of the virtual character in the complete virtual scene, and presenting the partial virtual scene within the field-of-view area; that is, the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. Because the first-person perspective is the most visually impactful for the user, it can create the immersive perception of being personally on the scene during operation.
Taking the virtual scene displayed at the bird's-eye view angle as an example, displaying the virtual scene in the human-computer interaction interface may include: in response to a zoom operation for the panoramic virtual scene, a partial virtual scene corresponding to the zoom operation is presented in the human-machine interaction interface, i.e., the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. Therefore, the operability of the user in the operation process can be improved, and the efficiency of man-machine interaction can be improved.
In step S102, in response to a first shooting trigger operation for the virtual shooting prop while the virtual character is in a stationary state, the virtual shooting prop is controlled to fire a first number of virtual bullets to a first bullet drop zone.
Here, the static state may include a state of standing, squatting, lying prone, etc., and the first number is an integer multiple of the number of virtual bullets that the virtual shooting prop can fire simultaneously at each shooting, for example, assuming that the virtual shooting prop can fire 8 virtual bullets at each shooting simultaneously, the first number may be 8, 16, 24, etc. (depending on the number of times of the first shooting, for example, when the virtual shooting prop is in a burst mode, in response to a first shooting trigger operation for the virtual shooting prop, the virtual shooting prop is controlled to make two first shots to the first bullet drop area to fire 16 virtual bullets).
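The relationship between one trigger operation, the number of shots, and the total number of bullets described above can be sketched as follows. The per-shot pellet count of 8 follows the example in the text; the function name and shot counts are illustrative assumptions, not an API from this application:

```python
# Illustrative sketch: total virtual bullets fired for one trigger operation.
PELLETS_PER_SHOT = 8  # the example virtual shotgun fires 8 bullets per shot

def bullets_for_trigger(shots_per_trigger: int) -> int:
    """shots_per_trigger is 1 for a single shot and 2+ in burst mode, so
    the total is always an integer multiple of PELLETS_PER_SHOT."""
    return PELLETS_PER_SHOT * shots_per_trigger

# a single shot fires 8 bullets; a two-shot burst fires 16
assert bullets_for_trigger(1) == 8
assert bullets_for_trigger(2) == 16
```

The same helper would apply unchanged to the second shooting described in step S103, since the second number is likewise an integer multiple of the per-shot count.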
In some embodiments, the operation of controlling the virtual shooting prop to fire the first number of virtual bullets to the first bullet drop zone can be implemented through step S1021 and step S1022 shown in fig. 4A, which will be described in conjunction with the steps shown in fig. 4A.
In step S1021, a first drop zone is determined based on the first shot impact location and the first scattering distance.
Here, the first scattering distance is a scattering distance corresponding to the virtual character in a stationary state.
In some embodiments, the above-described determination of the first drop zone based on the first shooting impact location and the first scattering distance may be achieved by: generating a first detection ray extending along the shooting direction of at least one first shooting by taking the launching port of the virtual shooting prop as a starting point; and taking a first shooting collision position between the first detection ray and at least one first virtual obstacle (such as a virtual wall, or a virtual character controlled by other users or AI in the virtual scene) in the virtual scene as a reference point, and determining a first bullet drop area based on the first scattering distance.
For example, taking the first bullet drop zone as a circular zone (or an elliptical zone) as an example, the first circular zone (or the first elliptical zone) may be determined by taking the first scattering distance as a radius (or a short axis of an ellipse) and taking the first circular zone (or the first elliptical zone) as the first bullet drop zone, with a first shooting collision position between the first detection ray and the first virtual obstacle in the virtual scene as a center of a circle.
For example, taking the first bullet drop zone as a rectangular zone, the first rectangular zone may be determined by taking the first shot collision position between the first detection ray and the first virtual obstacle in the virtual scene as a center, taking the first scattering distance as half of the diagonal length of the rectangular zone, and taking the first rectangular zone as the first bullet drop zone.
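The circular and rectangular drop-zone variants above can be sketched as follows. The ray cast against virtual obstacles is assumed to be performed by the game engine, so the sketch starts from the shot collision position; all names here are hypothetical helpers, and the rectangular zone is treated as a square since the text specifies only the half-diagonal:

```python
import math
from dataclasses import dataclass

@dataclass
class CircleZone:
    center: tuple   # shot collision position between detection ray and obstacle
    radius: float   # scattering distance

    def contains(self, point) -> bool:
        return math.dist(self.center, point) <= self.radius

def circular_drop_zone(hit_position, scattering_distance):
    """Circular bullet drop zone centered on the shot collision position,
    with the scattering distance as radius."""
    return CircleZone(hit_position, scattering_distance)

def rectangular_drop_zone(hit_position, scattering_distance):
    """Square drop zone centered on the shot collision position whose
    half-diagonal equals the scattering distance; returned as
    (xmin, ymin, xmax, ymax)."""
    half_side = scattering_distance / math.sqrt(2)  # half side from half diagonal
    cx, cy = hit_position
    return (cx - half_side, cy - half_side, cx + half_side, cy + half_side)
```

The same construction applies to the second bullet drop area in step S103, with the second (smaller) scattering distance substituted for the first.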
In step S1022, the virtual shooting prop is controlled to make at least one first shooting to the first bullet drop zone to shoot a first number of virtual bullets, and the first number of virtual bullets are controlled to be randomly distributed in the first number of bullet drop points in the first bullet drop zone.
In some embodiments, after the first bullet drop zone is determined, the above-mentioned controlling of the random distribution of the first number of virtual bullets in the first number of bullet drop points in the first bullet drop zone may be implemented by, for each virtual bullet: and randomly generating a drop point (such as a drop point A) in the first drop zone as a corresponding drop point of the virtual bullet in the first drop zone, and controlling the virtual bullet to hit the drop point A.
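Random generation of one drop point per virtual bullet inside a circular drop zone can be sketched as follows. The square-root radial sampling is an illustrative choice that keeps drop points uniformly distributed over the disc; the text only requires that each point be randomly generated within the zone:

```python
import math
import random

def random_drop_point(center, radius):
    """Sample one bullet drop point uniformly inside a circular drop zone.
    Taking sqrt of the radial coordinate avoids clustering near the center."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    r = radius * math.sqrt(random.random())
    cx, cy = center
    return (cx + r * math.cos(angle), cy + r * math.sin(angle))

def scatter_bullets(center, radius, count):
    """One independently sampled drop point for each virtual bullet."""
    return [random_drop_point(center, radius) for _ in range(count)]
```

For a single shot of the example virtual shotgun, `scatter_bullets(center, radius, 8)` would yield the 8 randomly distributed drop points described above.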
In other embodiments, after controlling the virtual shooting prop to make at least one first shooting to the first bullet drop zone to shoot a first number of virtual bullets and controlling the first number of virtual bullets to be randomly distributed in a first number of bullet drop points in the first bullet drop zone, the following processes may be further performed: and displaying a virtual bullet hole matched with the material of the first virtual barrier at each bullet drop point (namely when the virtual bullet hits the virtual barriers made of different materials, the special effects of the corresponding virtual bullet holes are different), and stopping displaying the virtual bullet holes when the display duration of the virtual bullet holes is longer than a duration threshold (for example, 2 seconds).
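The material-matched bullet holes and their display timeout can be sketched with simple bookkeeping. The 2-second threshold follows the example in the text; the data layout and function names are illustrative assumptions:

```python
# Illustrative decal bookkeeping for bullet holes.
DURATION_THRESHOLD = 2.0  # seconds, per the example duration threshold

def spawn_bullet_holes(drop_points, obstacle_material, now):
    """One bullet hole per drop point, tagged with the obstacle material so
    a matching special effect can be chosen when rendering."""
    return [{"point": p, "material": obstacle_material, "shown_at": now}
            for p in drop_points]

def prune_bullet_holes(holes, now):
    """Stop displaying holes whose display duration exceeds the threshold."""
    return [h for h in holes if now - h["shown_at"] <= DURATION_THRESHOLD]
```

In a game loop, `prune_bullet_holes` would be called each frame with the current time, so holes disappear once shown longer than the threshold.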
In step S103, in response to a second shooting trigger operation for the virtual shooting prop while the virtual character is in a motion state, the virtual shooting prop is controlled to fire a second number of virtual bullets to a second bullet drop area.
Here, the second bullet drop region is smaller than the first bullet drop region; that is, the region into which virtual bullets fired by the virtual shooting prop fall when the virtual character is in a motion state is smaller than in the stationary state, so that it is easier to hit the target. The motion state includes a state in which the position of the virtual character is unchanged but its body motion changes, or a state in which the position of the virtual character changes, and may include, for example, jumping, running, crawling forward, and sliding tackle (i.e., a motion in which the virtual character is controlled to rapidly slide a short distance while its body is lowered). The second number is an integer multiple of the number of virtual bullets that the virtual shooting prop can fire simultaneously with each shot; for example, assuming that the virtual shooting prop can fire 8 virtual bullets simultaneously with each shot, the second number may be 8, 16, 24, etc., depending on the number of second shots (for example, when the virtual shooting prop is in a burst mode, in response to a second shooting trigger operation for the virtual shooting prop, the virtual shooting prop is controlled to make three second shots to the second bullet drop area to fire 24 virtual bullets).
In addition, it should be noted that in an actual game scene, the gaps between the multiple virtual bullets launched from the firing port of a virtual shooting prop become larger as the distance increases; that is, the scattering area of the virtual bullets becomes larger with distance. Therefore, the first bullet drop area and the second bullet drop area in the embodiments of the present application refer to bullet drop areas at the same or similar distances; that is, for the shooting operations in the stationary and motion states, the distances between the firing port of the virtual shooting prop and the virtual obstacle are the same or similar. For example, when the virtual character is standing and the virtual shooting prop is controlled to shoot at a virtual wall 10 meters away, the bullet hole traces are randomly distributed in a circular area with a radius of 1 meter; when the virtual character is jumping and the virtual shooting prop is controlled to shoot at the virtual wall 10 meters away, the bullet hole traces are randomly distributed in a circular area with a radius of 0.7 meters. That is, compared with shooting while standing, the bullets fall within a smaller area when shooting while jumping.
In some embodiments, the operation of controlling the virtual shooting prop to shoot the second number of virtual bullets to the second drop zone can be realized through step S1031 and step S1032 shown in fig. 4B, which will be described in conjunction with the steps shown in fig. 4B.
In step S1031, a second drop zone is determined based on the second shot collision position and the second scattering distance.
Here, the second scattering distance is smaller than the corresponding first scattering distance when the virtual character is in a stationary state.
In some embodiments, the determination of the second bullet drop area based on the second shooting collision position and the second scattering distance described above may be achieved as follows: generating a second detection ray extending along the shooting direction of at least one second shot, with the firing port of the virtual shooting prop as the starting point; and determining the second bullet drop area based on the second scattering distance, with a second shooting collision position between the second detection ray and at least one second virtual obstacle in the virtual scene (such as a virtual wall, or a virtual character controlled by another user or by AI) as the reference point.
For example, taking the second bullet drop zone as a circular zone (or an elliptical zone) as an example, the second circular zone (or the second elliptical zone) may be determined by taking the second scattering distance as a radius (or a short axis of an ellipse) and taking the second circular zone (or the second elliptical zone) as the second bullet drop zone, with a second shooting collision position between the second detection ray and a second virtual obstacle in the virtual scene as a center.
For example, taking the second bullet drop region as a rectangular region, the second rectangular region may be determined by taking the second shooting collision position between the second detection ray and the second virtual obstacle in the virtual scene as its center and taking the second scattering distance as half of the diagonal length of the rectangular region, and the second rectangular region is used as the second bullet drop region.
In some embodiments, the second scattering distance may be obtained by adjusting the first scattering distance, and the adjustment parameter may be fixed. In that case, before determining the second bullet drop area based on the second shooting collision position and the second scattering distance, the following may be performed: determining the second scattering distance in either of two ways: subtracting a fixed adjustment amplitude from the first scattering distance to obtain the second scattering distance (for example, if the first scattering distance is 1 meter and the adjustment amplitude is 0.3 meters, subtracting the fixed adjustment amplitude from the first scattering distance yields a second scattering distance of 0.7 meters); or dividing the first scattering distance by a fixed adjustment factor (the adjustment factor is greater than 1 and may be, for example, 1.5) to obtain the second scattering distance (for example, if the first scattering distance is 1 meter and the adjustment factor is 1.5, dividing the first scattering distance by the fixed adjustment factor yields a second scattering distance of approximately 0.67 meters).
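Both fixed-parameter variants can be sketched in one hypothetical helper (note that 1 m divided by a factor of 1.5 is approximately 0.67 m):

```python
def second_scattering_distance_fixed(first_distance,
                                     adjustment_amplitude=None,
                                     adjustment_factor=None):
    """Derive the moving-state (second) scattering distance from the
    stationary-state (first) one, using either a fixed subtraction
    amplitude or a fixed division factor (factor > 1), per the two
    variants described above. Helper name and defaults are illustrative."""
    if adjustment_amplitude is not None:
        return first_distance - adjustment_amplitude
    return first_distance / adjustment_factor

# the two worked examples: 1 m - 0.3 m = 0.7 m; 1 m / 1.5 ≈ 0.67 m
assert second_scattering_distance_fixed(1.0, adjustment_amplitude=0.3) == 0.7
assert abs(second_scattering_distance_fixed(1.0, adjustment_factor=1.5) - 2.0 / 3.0) < 1e-9
```

Either variant keeps the second scattering distance strictly smaller than the first, as required for the second bullet drop area to be smaller.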
In other embodiments, the adjustment parameter may be dynamic. Before determining the second bullet drop zone based on the second shooting collision position and the second scattering distance, the second scattering distance may be determined as follows: while the virtual character is in the motion state, detect the motion parameters of the actions included in that state (such as moving speed, acceleration, and rotation speed); determine an adjustment coefficient negatively correlated with the motion parameters, and multiply it by the fixed adjustment amplitude or fixed adjustment multiple to obtain a dynamic adjustment amplitude or dynamic adjustment multiple; then adjust the first scattering distance by that dynamic amplitude or multiple to obtain the second scattering distance. The second scattering distance is thus tied to the virtual character's motion parameters: for example, the faster the virtual character moves, the harder it is to aim, so the second scattering distance is made smaller and the target becomes easier to hit.
It should be noted that, in practical applications, when the motion state includes multiple single actions of different types, the motion parameters of each single action may be obtained first and summed to produce the final motion parameters, and the adjustment coefficient negatively correlated with those final motion parameters is then determined.
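As one hedged illustration of the dynamic adjustment described above (the function name, the coefficient form k/(k + total), and applying the coefficient multiplicatively are all assumptions of this sketch, chosen so that faster motion yields a smaller second scattering distance, as stated above):

```python
def dynamic_scatter_distance(first_distance, motion_params, k=1.0):
    """Sketch: sum the motion parameters of all single actions, derive a
    coefficient that is negatively correlated with the total, and scale the
    first scattering distance by it to obtain the second scattering distance."""
    total = sum(motion_params.values())  # e.g. moving speed + rotation speed
    coefficient = k / (k + total)        # in (0, 1]; shrinks as motion grows
    return first_distance * coefficient
```

With no motion the distance is unchanged; the faster the character moves, the smaller the resulting second scattering distance.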
In some embodiments, when the virtual shooting prop is in a burst mode, the following may also be performed for each shooting stage in that mode (each stage includes at least one second shot, i.e. it may include only one second shot or several): update the second scattering distance, or the average distance between pairs of adjacent drop points, corresponding to the previous shooting stage (for example, reduce it), and use the updated value for the current stage. For example, if the second scattering distance of the previous stage is 0.8 meters, that of the current stage may be 0.7 meters; if the average distance between pairs of adjacent drop points in the previous stage is 0.05 meters, that of the current stage may be 0.03 meters.
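The stage-by-stage reduction in burst mode might be sketched as follows (the fixed decay factor and the function name are assumptions for illustration; the embodiment only requires that each stage's value be reduced relative to the previous stage's):

```python
def burst_scatter_distances(initial_distance, stages, decay=0.9):
    """Each shooting stage in burst mode reuses the previous stage's
    scattering distance reduced by a fixed decay factor."""
    distances, d = [], initial_distance
    for _ in range(stages):
        distances.append(d)
        d *= decay  # shrink for the next stage
    return distances
```

The same scheme applies unchanged if the quantity being updated is the average distance between pairs of adjacent drop points instead of the scattering distance.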
In step S1032, the virtual shooting prop is controlled to perform at least one second shot toward the second bullet drop zone to fire a second number of virtual bullets, and the second number of virtual bullets are controlled to be randomly distributed over a second number of drop points in the second bullet drop zone.
In some embodiments, before controlling the virtual shooting prop to perform at least one second shot toward the second bullet drop zone to fire the second number of virtual bullets, the following process may be performed: detect the actions included in the motion state of the virtual character; when the motion state includes one type of single action (e.g. jumping, running, or crawling forward) or several different types of single actions performed in succession (e.g. a sliding shovel), it is determined that the operation proceeds to controlling the virtual shooting prop to perform at least one second shot toward the second bullet drop zone to fire the second number of virtual bullets.
In other embodiments, controlling the virtual shooting prop to fire the second number of virtual bullets toward the second bullet drop zone may be implemented as follows: determine the second bullet drop zone based on the second shooting collision position and the second scattering distance, and control the virtual shooting prop to perform at least one second shot toward that zone to fire the second number of virtual bullets; control the second number of virtual bullets to be randomly distributed over the second number of drop points in the second bullet drop zone, where the average distance between pairs of adjacent drop points in the second bullet drop zone is smaller than that in the first bullet drop zone. In other words, compared with the static state, when the virtual shooting prop fires while the virtual character is in a motion state, the gaps between the fired virtual bullets become smaller and their spacing more uniform, with less jitter, so the target is easier to hit.
In some embodiments, continuing the example above, before controlling the virtual shooting prop to perform at least one second shot toward the second bullet drop zone to fire the second number of virtual bullets, the following process may be performed: detect the actions included in the motion state; when the motion state of the virtual character includes a plurality (i.e. at least two) of different types of single actions occurring simultaneously (e.g. a sliding shovel, which moves the virtual character's body downward while sliding quickly over a short distance), it is determined that the operation proceeds to controlling the virtual shooting prop to perform at least one second shot toward the second bullet drop zone to fire the second number of virtual bullets.
In some embodiments, controlling the second number of virtual bullets to be randomly distributed over the second number of drop points in the second bullet drop zone may be implemented as follows: divide the second bullet drop zone evenly into a second number of candidate areas; randomly generate one drop point within each candidate area to obtain the second number of drop points; assign the drop points to the second number of virtual bullets, one each; and control each virtual bullet to hit its assigned drop point in the second bullet drop zone.
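The even-division scheme above might be sketched as follows for a rectangular drop zone (the strip-wise division and the names are assumptions of this sketch; any equal-area partition of the zone would serve the same purpose):

```python
import random

def rectangular_drop_points(center, half_width, half_height, n):
    """Divide a rectangle centered on `center` into n vertical strips of
    equal width and draw one random drop point inside each strip, so the
    n virtual bullets cover the whole drop zone evenly."""
    cx, cy = center
    left = cx - half_width
    strip_w = 2 * half_width / n
    points = []
    for i in range(n):
        # One uniformly random point per candidate strip.
        x = random.uniform(left + i * strip_w, left + (i + 1) * strip_w)
        y = random.uniform(cy - half_height, cy + half_height)
        points.append((x, y))
    return points
```

Because each bullet is confined to its own candidate area, no two bullets can cluster in the same part of the zone, which is the property the embodiment relies on.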
The following takes the second bullet drop zone as a rectangular zone as an example.
For example, referring to fig. 5A, a schematic diagram of a virtual bullet firing pattern in a motion state provided in an embodiment of the present application: assuming the virtual shooting prop fires 4 virtual bullets simultaneously at each shot, the rectangular zone 501 is first divided into 4 candidate areas, namely a first candidate area 502, a second candidate area 503, a third candidate area 504, and a fourth candidate area 505. A drop point is then randomly generated in each candidate area, for example drop point 506 in the first candidate area 502, drop point 507 in the second candidate area 503, drop point 508 in the third candidate area 504, and drop point 509 in the fourth candidate area 505. The drop points 506 to 509 are then assigned to the 4 virtual bullets, and each bullet is controlled to hit its assigned drop point in the rectangular zone 501; that is, the 4 virtual bullets fired by the virtual shooting prop in a given shot hit drop points 506 to 509 respectively. Because the drop points of the 4 virtual bullets are evenly distributed across the rectangular zone 501, the precision of shooting control is improved and the target can be hit with high probability.
The following takes the second bullet drop zone as a circular zone as an example.
For example, referring to fig. 5B, a schematic diagram of a virtual bullet firing pattern in a motion state provided in an embodiment of the present application: assuming the virtual shooting prop fires 6 virtual bullets simultaneously at each shot, the circular zone 510 is first divided into 6 candidate areas, namely a first candidate area 511, a second candidate area 512, a third candidate area 513, a fourth candidate area 514, a fifth candidate area 515, and a sixth candidate area 516. A drop point is then randomly generated in each candidate area: drop point 517 in the first candidate area 511, drop point 518 in the second candidate area 512, drop point 519 in the third candidate area 513, drop point 520 in the fourth candidate area 514, drop point 521 in the fifth candidate area 515, and drop point 522 in the sixth candidate area 516. The drop points 517 to 522 are then assigned to the 6 virtual bullets, and each bullet is controlled to hit its assigned drop point in the circular zone 510; that is, the 6 virtual bullets fired by the virtual shooting prop in a given shot hit drop points 517 to 522 respectively. Because the drop points of the 6 virtual bullets are evenly distributed across the circular zone 510, the accuracy of shooting control is improved and the target can be hit with high probability.
In some embodiments, after controlling the virtual shooting prop to perform at least one second shot toward the second bullet drop zone to fire the second number of virtual bullets, and controlling them to be randomly distributed over the second number of drop points, the following may also be performed: display at each drop point a virtual bullet hole matching the material of the second virtual obstacle (that is, when virtual bullets hit virtual obstacles of different materials, the corresponding bullet-hole effects differ), and stop displaying a virtual bullet hole once its display duration exceeds a duration threshold (for example, 2 seconds).
In other embodiments, controlling the second number of virtual bullets to be randomly distributed over the second number of drop points in the second bullet drop zone may also be implemented as follows: divide the second bullet drop zone into a first sub-area and a second sub-area whose centers coincide, the first sub-area being smaller than the second; randomly generate a third number of drop points in the first sub-area (the third number being smaller than the second number), assign them to a third number of virtual bullets, and control those bullets to hit their assigned drop points in the second bullet drop zone; then divide the second sub-area evenly into a fourth number of candidate areas, where the fourth number is the difference between the second number and the third number; randomly generate one drop point in each candidate area to obtain a fourth number of drop points; assign them to the remaining fourth number of virtual bullets; and control those bullets to hit their assigned drop points in the second bullet drop zone.
For example, taking the second bullet drop zone as a circular zone of radius R: the zone may first be divided into a first sub-area of radius P and a second sub-area (a ring) with inner radius S and outer radius R, where P is less than or equal to S and S is less than R. A third number of drop points are then randomly generated in the first sub-area; for example, when the third number is 2, 2 drop points are randomly generated in the first sub-area of radius P to serve as the drop points of 2 of the 8 virtual bullets. Next, the ring with inner radius S and outer radius R is divided evenly into a fourth number of candidate areas, each spanning an angle of 360° divided by the fourth number; for example, when the second number is 8 and the third number is 2, the fourth number is 6, so the ring is divided into 6 candidate areas of 60° each. Finally, for each candidate area: randomly draw an angle from its angular range and a radius between S and R, and take the point at that angle and radius as the drop point for that candidate area. This yields 6 drop points, which are assigned to the remaining 6 of the 8 virtual bullets, i.e. each of those bullets hits one of the 6 drop points. By dividing the second bullet drop zone into an inner circle (the first sub-area) and an outer ring (the second sub-area) and making the outer-ring drop points evenly distributed, the accuracy and efficiency of shooting control are improved, the probability of hitting the target is further increased, and the user's game experience is improved.
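The inner-circle/outer-ring allocation described above can be sketched as follows (an illustrative sketch; the parameter names follow the text's P, S, and R, while the square-root-based uniform-disc sampling for the inner circle is an assumption of this example):

```python
import math
import random

def ring_drop_points(P, S, R, inner_count, total_count):
    """`inner_count` drop points fall anywhere in the inner circle of
    radius P; the remaining points are spread over equal angular sectors
    of the ring between radii S and R (P <= S < R), one per sector."""
    points = []
    for _ in range(inner_count):
        # Inner circle: fully random (sqrt keeps the area density uniform).
        r = P * math.sqrt(random.random())
        a = random.uniform(0, 2 * math.pi)
        points.append((r * math.cos(a), r * math.sin(a)))
    outer = total_count - inner_count
    sector = 2 * math.pi / outer
    for i in range(outer):
        # Outer ring: one random point per equal angular sector.
        r = random.uniform(S, R)
        a = random.uniform(i * sector, (i + 1) * sector)
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

With total_count = 8 and inner_count = 2, this reproduces the text's example of 2 inner-circle bullets and 6 evenly sectored outer-ring bullets.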
According to the shooting control method for a virtual character provided by the embodiments of the present application, when the virtual character is in a static state and a shooting trigger operation for the virtual shooting prop is received, the virtual shooting prop is controlled to fire virtual bullets toward a first, larger bullet drop zone; when the virtual character is in a motion state and a shooting trigger operation for the virtual shooting prop is received, the virtual shooting prop is controlled to fire virtual bullets toward a second, smaller bullet drop zone. In this way, the control of the virtual shooting prop adapts to the diverse states of the virtual character in the virtual scene, realizing correspondingly diverse firing patterns for the virtual bullets, improving the precision and efficiency of shooting control, and thereby meeting the needs of users of different skill levels.
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
Currently, in mobile shooting games, a user may choose among various types of virtual shooting props, for example shooting with a shotgun or with a rifle. The greatest difference between a shotgun and other firearms is the number of bullets: a conventional firearm fires only one bullet per shot, while a shotgun fires multiple bullets per shot, and those bullets scatter. If all or most of the scattered bullets hit the target, they can inflict devastating damage. However, the effective range of a shotgun is short, and the farther the bullets travel, the larger their scattering area becomes; even if some bullets hit a distant target, they are too few for the damage to matter. The shotgun is therefore of little use at medium and long range, but at short range a skilled player can down even an opponent wearing level-three armor with a single shot. Furthermore, the shotgun, like most handguns, is a single-shot weapon; if the character is moving during firing, accuracy is hard to guarantee, especially for players operating with only two fingers.
Because the difficulty of hitting a target with a shotgun varies across operation modes, high-level users with strong skills can perform a wide variety of operations. However, in the solution provided by the related art, the firing pattern of shotgun bullets is identical across operation modes, which puts high-level users at a disadvantage. To this end, the embodiments of the present application provide a shooting control method for a virtual character in which, the more difficult the operation, the more the firing pattern of the shotgun bullets favors hitting the target; that is, the shotgun's firing pattern can differ across operation modes.
For example, referring to fig. 6, a schematic diagram of the shotgun firing pattern provided by the related art: the drop points of shotgun bullets are obtained randomly within a circle 601. The center of the circle 601 corresponds to the muzzle position, and the circle has radius R; the black points in fig. 6 are the drop points 602 of the individual bullets, i.e. the positions the virtual bullets hit within the circle 601. The bullets fired by the shotgun are thus randomly distributed over the circle 601. However, a purely random trajectory is uncontrollable and the number of hits is hard to control: all bullets may hit the target (e.g. 8 hits in one shot), or none may (0 hits in one shot). For example, as shown in fig. 7, none of the drop points 701 of the virtual bullets coincides with the target 702, i.e. all of the bullets fired by the shotgun in that shot miss.
The following describes a shooting control method for a virtual character provided in an embodiment of the present application in detail.
The operation modes in the embodiments of the present application include a normal operation state (corresponding to the static state described above), jump shooting, and sliding shovel shooting (corresponding to the motion state described above). The normal operation state mainly includes standing shooting, squatting shooting, and prone shooting. These are relatively simple: they require no continuous sequence of operations, only taking the posture in advance and then firing, so they can all be regarded as the normal operation state. In this state, the firing pattern of the shotgun bullets is purely random: the shot may miss the target entirely or hit with every bullet, and the user cannot control which.
For example, referring to fig. 8A, a schematic view of an application scenario of the shooting control method for a virtual character provided in an embodiment of the present application: the game scene is displayed from a first-person perspective, showing a virtual character 801 (e.g. a character controlled by user A) and the shotgun 802 it holds, as well as a target 803 (e.g. a character controlled by user B who is hostile to user A, or an AI-controlled character hostile to user A). When user A aims the muzzle of the shotgun 802 at the target 803 and a tap by user A on the shooting control 804 displayed in the game scene is received, the shotgun 802 is controlled to fire multiple bullets at the target 803. Because the firing pattern of the shotgun 802 is random, only some of the bullets may hit the target 803, reducing its health by only 3/5, i.e. the target 803 is not killed.
As another example, referring to fig. 8B, a schematic view of an application scenario of the shooting control method for a virtual character provided in an embodiment of the present application: the game scene is displayed from a first-person perspective, showing a virtual character 805 (e.g. a character controlled by user C) and the shotgun 806 it holds, as well as a target 807. When user C aims the muzzle of the shotgun 806 at the target 807 and a tap by user C on the shooting control 808 displayed in the game scene is received, the shotgun 806 is controlled to fire multiple bullets at the target 807. Because the firing pattern of the shotgun 806 is random, all of the bullets may hit the target 807, killing it with a single shot, but the probability of this happening is low.
The following continues with the description of the shot form of the shotgun cartridge at jump shooting.
In the jumping state, the height of the user-controlled virtual character changes constantly, so unlike aiming on flat ground, shooting requires leading the target, i.e. taking the falling speed and the bullet's firing time into account. The bullets fired in this state are therefore denser than in the normal operation state: the firing pattern is still random, but the gaps between bullets become smaller. For example, in the normal operation state (such as standing shooting) the bullets land randomly within a circle of radius R, while in jump shooting they land randomly within a smaller circle of radius P, where P < R.
For example, referring to fig. 9, fig. 9 is a schematic view of an application scenario of the shooting control method for the virtual character provided in the embodiment of the present application, as shown in fig. 9, when the virtual character is in a standing state, a shotgun is controlled to shoot a wall, and bullet hole marks left on the wall are randomly distributed in a circle 901 with a radius R.
For example, referring to fig. 10, a schematic view of an application scenario of the shooting control method for a virtual character provided in an embodiment of the present application: as shown in fig. 10, when the virtual character is in a jumping state, the shotgun is controlled to fire, and the bullet-hole marks left on the wall are randomly distributed within a circle 1001 of radius P.
As can be seen by combining fig. 9 and fig. 10, the firing pattern of the shotgun bullets remains random during jump shooting, but the radius of the spread becomes smaller (the radius P of circle 1001 in fig. 10 is smaller than the radius R of circle 901 in fig. 9), i.e. the gaps between bullets shrink, making it easier to hit the target while jump shooting.
The firing pattern of the shotgun bullets during sliding shovel shooting is described next.
The sliding shovel is a more difficult operation mode for aiming: it is triggered by pressing down while the virtual character is running, and the action moves the character's body downward while sliding quickly over a short distance. The high speed and low viewing angle make it harder for the user to aim at the target. Therefore, if the shotgun is used in this operation mode, its firing pattern changes to one even more favorable for hitting the target, an optimization over jump shooting: the gaps between bullets are narrowed further and the influence of randomness is reduced, while some randomness is still retained.
For example, referring to fig. 11, a schematic diagram of the principle of dividing the bullet drop zone according to an embodiment of the present application: a circle is divided evenly into six equal sectors, and one of six bullets is randomly generated within each sector. Taking the divided peripheral bullet-generation area 1101 as an example: after the circle is divided, the peripheral bullet-generation area 1101 is determined by the angular range 1102 within which its bullet may be generated; a point is then drawn at random from the area 1101 and used as the drop point of the corresponding bullet. This guarantees that, no matter how the random draws fall, the bullets are distributed evenly around the circle, with the remaining two bullets hitting its central portion.
For example, referring to fig. 12, a schematic diagram of the firing pattern of the shotgun bullets during sliding shovel shooting according to an embodiment of the present application: the 8 black points in fig. 12 are the 8 drop points 1203 of the 8 bullets fired by the shotgun in a given shot, of which 2 are randomly distributed within the inner circle 1201 and the remaining 6 are evenly distributed around the outer ring 1202. That is, during sliding shovel shooting the bullets are evenly distributed over the drop zone, so as long as the target is within the aiming frame it will almost certainly be hit.
The shot patterns of the shotgun cartridges respectively corresponding to the above-described different operation modes will be specifically described with reference to fig. 13.
For example, referring to fig. 13, a flow chart of the shooting control method for a virtual character provided in an embodiment of the present application: the algorithm provided in the embodiments of the present application is designed only for shotguns and does not apply to other weapon types, so the virtual character must first be equipped with a shotgun. The method then detects whether the user has pressed the shooting control (such as the fire key); if not, it returns to detecting again. If so, the shotgun is controlled to fire. The shotgun differs from other weapons in that other weapons fire only one bullet per shot, whereas the shotgun fires N bullets simultaneously per shot (where N is a positive integer greater than 1; for example, N may be 8, i.e. 8 bullets fired simultaneously per shot). Taking sliding shovel shooting as an example, the offset value of each bullet (corresponding to its drop point) may be obtained as follows:
taking the bullet falling area as a circular area with a radius of R as an example, first, taking the center of the bullet falling area as a circle center, dividing the bullet falling area into an inner circle (for example, the inner circle may be a small circular area with a radius of P) and an outer circle (for example, the outer circle may be a circular ring with an inner circle radius of S and an outer circle radius of R, where P is smaller than S and S is smaller than R), then, determining whether the current bullet is in the inner circle, and if the current bullet is in the inner circle, directly generating a position in the inner circle at random, where the algorithm is as follows:
Random.insideUnitCircle*m_ShotSpreadSize
Random.insideUnitCircle randomly generates a point within a circle of radius 1, which is then multiplied by the radius m_ShotSpreadSize (i.e. the radius P of the inner circle) to obtain a random value within the inner circle. This value is the bullet's offset coefficient, from which the position of the bullet's drop point within the inner circle can be determined.
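A Python analogue of this step may help clarify it (Random.insideUnitCircle is Unity's API; the square-root-based uniform-disc sampling below is an assumed equivalent, and the function names are illustrative only):

```python
import math
import random

def inside_unit_circle():
    """Analogue of Unity's Random.insideUnitCircle: a point drawn uniformly
    from the unit disc (sqrt keeps the area density uniform)."""
    r = math.sqrt(random.random())
    a = random.uniform(0, 2 * math.pi)
    return (r * math.cos(a), r * math.sin(a))

def inner_circle_offset(shot_spread_size):
    """Offset coefficient for a bullet landing in the inner circle,
    mirroring Random.insideUnitCircle * m_ShotSpreadSize."""
    x, y = inside_unit_circle()
    return (x * shot_spread_size, y * shot_spread_size)
```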
The first few of the multiple bullets fired by the shotgun all land in the inner circle. How many land there may be preset, or may be set randomly during the process (for example, as shown in fig. 12, the designer may preset 2 bullets hitting the inner circle and the remaining 6 hitting the outer ring). The remaining bullets are all computed for the outer ring: first, a random value is drawn between the minimum radius (m_MinShotSpreadSize) and the maximum radius (m_MaxShotSpreadSize); this is the random radius R, with the following algorithm:
Random.Range(m_MinShotSpreadSize,m_MaxShotSpreadSize)
Random.Range draws a radius R at random between the minimum and the maximum radius; m_MinShotSpreadSize denotes the minimum radius and m_MaxShotSpreadSize the maximum radius.
Then, the angle is determined, and the algorithm is as follows:
StartAngle=Random.Range(0,360)
Here StartAngle denotes the determined starting angle, and Random.Range(0, 360) indicates that the starting angle is drawn at random from the range of 0 to 360 degrees.
The corresponding final angle of the bullet is then determined, the algorithm is as follows:
Angle=Random.Range(0,m_AngleUncertainty)+StartAngle
For example, if the random StartAngle is 70 degrees and the range of m_AngleUncertainty is 20 degrees, the random interval is (70, 90), i.e. the final angle is greater than 70 degrees and less than 90 degrees. The offset coefficient of the first bullet in the outer ring can then be determined from the random angle and radius, and the position of that bullet's drop point follows from the offset coefficient.
Similarly, the second bullet in the outer ring draws a random value from the interval and the sector width is added: for example, if the circle is divided evenly into six sectors, each sector spans 60 degrees, so if the second random draw is 78 degrees, the angle for that sector is 78 + 60 = 138 degrees. By analogy, the angle of the third bullet in the outer ring is a random value + 120 degrees (a fresh random value is drawn from the interval each time).
After the radius R and the angle are obtained, the final offset coefficient can be obtained, and the calculation formula is as follows:
rad = Angle * Mathf.PI * 2 / 360;    // convert the angle to radians
x = R * Mathf.Cos(rad);              // calculate x
y = R * Mathf.Sin(rad);              // calculate y
m_ShotSpread = new Vector2(x, y)
Where Mathf.PI represents π, Mathf.PI * 2 / 360 converts the finally determined angle into radians, Cos is the cosine function, Mathf.Cos(rad) is the cosine of the converted radian value, Sin is the sine function, Mathf.Sin(rad) is the sine of the converted radian value, and Vector2 is a two-dimensional coordinate comprising an x-axis coordinate and a y-axis coordinate. m_ShotSpread is the offset coefficient to be used in the end; superimposed on the muzzle direction, it yields the final launch direction of the bullet, and the position of the bullet's drop point can be determined from that direction.
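The whole offset computation can be condensed into one Python function (a sketch mirroring the C# lines above; `math` and `random` stand in for Unity's `Mathf` and `Random`):

```python
import math
import random

def shot_spread(min_radius, max_radius, angle_deg):
    """Compute the 2D offset coefficient (m_ShotSpread): draw a random
    radius R in [min_radius, max_radius], convert the final angle to
    radians, and project onto the x and y axes."""
    r = random.uniform(min_radius, max_radius)  # Random.Range(m_MinShotSpreadSize, m_MaxShotSpreadSize)
    rad = angle_deg * math.pi * 2.0 / 360.0     # degrees -> radians
    return (r * math.cos(rad), r * math.sin(rad))
```

The returned pair is the offset that gets superimposed on the muzzle direction; its magnitude always lies between the minimum and maximum radius.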
For example, referring to fig. 14, fig. 14 is a schematic diagram of the principle of determining the bullet firing direction provided by the embodiment of the present application. As shown in fig. 14, assuming no offset, each bullet would be fired along solid line 1401, which is the muzzle direction; dashed line 1402 is the firing direction with the offset coefficient superimposed.
For example, referring to fig. 15, fig. 15 is a schematic diagram illustrating the principle of determining a bullet hole special effect according to an embodiment of the present application. As shown in fig. 15, after the bullet direction is determined, a ray 1501 is emitted with the muzzle position as its starting point, the direction of ray 1501 being the firing direction of the bullet. When a physical collision with a material collision box is detected, the logic for generating bullet hole 1502 is triggered, so that different bullet hole special effects are generated for obstacles of different materials; for example, the material of the obstacle in fig. 15 is stone (physical stone) 1503.
For example, referring to fig. 16, fig. 16 is a schematic diagram of a material selection interface provided in an embodiment of the present application. As shown in fig. 16, a plurality of different types of materials are provided in the interface, including stone (physical stone), snow (physical snow), wooden box (physical woodbox), and the like. That is, different material collision boxes are attached to different obstacles, and different collision special effects are generated according to the material; for example, a smaller bullet hole trace is left on stone than on a wooden box.
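The material-to-effect lookup can be sketched as a simple table (the material names follow the labels in fig. 16; the effect names and hole scales are hypothetical values for illustration only):

```python
# Hypothetical effect names and hole scales keyed by the material
# of the collision box attached to the obstacle the ray hit.
BULLET_HOLE_EFFECTS = {
    "physical_stone":   {"effect": "stone_chip",    "hole_scale": 0.5},
    "physical_snow":    {"effect": "snow_puff",     "hole_scale": 1.2},
    "physical_woodbox": {"effect": "wood_splinter", "hole_scale": 1.0},
}

def bullet_hole_for(material):
    """Pick the bullet hole special effect for the material collision box
    the ray hit; unknown materials fall back to a generic hole."""
    entry = BULLET_HOLE_EFFECTS.get(material)
    return entry["effect"] if entry else "generic_hole"
```

Centralizing the mapping in one table keeps adding a new obstacle material to a single entry rather than a branch per material in the hit logic.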
In other embodiments, the bullet hole automatically disappears after a period of time has elapsed since it appeared on the obstacle, and the shotgun stops firing when the user releases the firing key.
With the shooting control method for a virtual character provided by the embodiment of the present application, the spread pattern of the shotgun's bullets becomes more favorable for hitting the target as the operation difficulty increases; that is, the shooting of the virtual shooting prop is controlled differently in different operation modes, so the method adapts to the diverse states of the virtual character in the virtual scene and improves the precision and efficiency of shooting control.
Continuing with the exemplary structure of the virtual character's fire control device 465 implemented as software modules provided in the embodiments of the present application, in some embodiments, as shown in fig. 2, the software modules stored in the virtual character's fire control device 465 in the memory 460 may include: a display module 4651 and a control module 4652.
A display module 4651, configured to display a virtual character and a virtual shooting prop held by the virtual character in a virtual scene, where the virtual shooting prop can fire multiple virtual bullets at the same time at each shooting; a control module 4652 to control the virtual shooting prop to fire a first number of virtual bullets to the first drop zone in response to a first shooting trigger operation for the virtual shooting prop while the virtual character is in a static state; the control module 4652 is further configured to control the virtual shooting prop to fire a second number of virtual bullets to a second fall area in response to a second shooting trigger operation for the virtual shooting prop while the virtual character is in motion, wherein the second fall area is smaller than the first fall area.
In some embodiments, the shooting control device 465 of the virtual character further includes a determining module 4653 configured to determine a first drop zone based on the first shooting impact location and a first scattering distance, where the first scattering distance is a corresponding scattering distance when the virtual character is in a static state; the control module 4652 is further configured to control the virtual shooting prop to make at least one first shot to the first bullet drop zone to fire a first number of virtual bullets, and control the first number of virtual bullets to be randomly distributed among a first number of bullet drop points in the first bullet drop zone.
In some embodiments, the shooting control device 465 of the virtual character further includes a generating module 4654 configured to generate a first detection ray extending in a shooting direction of at least one first shot, starting from the launch opening of the virtual shooting prop; the determining module 4653 is further configured to determine a first landing zone based on the first scattering distance with reference to a first shooting collision location between the first detection ray and a first virtual obstacle in the virtual scene.
In some embodiments, the control module 4652 is further configured to perform the following processing for each virtual bullet: randomly generating a drop point in the first drop zone as the drop point corresponding to the virtual bullet, and controlling the virtual bullet to hit that drop point.
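Randomly generating a drop point inside a circular drop zone can be sketched as follows (a minimal illustration, assuming the drop zone is a circle in a 2D plane; the uniform-by-area sampling is one reasonable choice, not mandated by the embodiment):

```python
import math
import random

def random_drop_point(center_x, center_y, radius):
    """Randomly generate one drop point inside a circular drop zone.
    Taking sqrt of the radial draw keeps the sample uniform by area
    rather than clustered near the center."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    r = radius * math.sqrt(random.uniform(0.0, 1.0))
    return (center_x + r * math.cos(angle), center_y + r * math.sin(angle))
```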
In some embodiments, the determining module 4653 is further configured to determine a second drop zone based on the second shooting collision location and a second scattering distance, where the second scattering distance is less than the corresponding first scattering distance when the virtual character is in the static state; the control module 4652 is further configured to control the virtual shooting prop to make at least one second shooting to the second bullet drop zone to fire a second number of virtual bullets, and control the second number of virtual bullets to be randomly distributed among the second number of bullet drop points in the second bullet drop zone.
In some embodiments, the shooting control device 465 of the virtual character further includes a detection module 4655 for detecting actions included in the motion state; the determining module 4653 is further configured to, when the motion state includes one type of single action or multiple different types of single actions occurring in succession, determine to proceed to the operation of controlling the virtual shooting prop to make at least one second shot to the second drop zone to fire a second number of virtual bullets.
In some embodiments, the generating module 4654 is further configured to generate a second detection ray extending in the shooting direction of at least one second shot starting from the launch opening of the virtual shooting prop; the determining module 4653 is further configured to determine a second drop zone based on a second scattering distance with reference to a second shooting collision location between the second detected ray and a second virtual obstacle in the virtual scene.
In some embodiments, the determining module 4653 is further configured to determine the second scattering distance by any one of: subtracting the fixed adjustment amplitude from the first scattering distance to obtain a second scattering distance; and dividing the first scattering distance by a fixed adjustment multiple to obtain a second scattering distance, wherein the adjustment multiple is greater than 1.
In some embodiments, the determining module 4653 is further configured to determine the second scattering distance by: detecting motion parameters of the motion included in the motion state; determining an adjustment coefficient negatively correlated with the action parameter, and correspondingly obtaining a dynamic adjustment amplitude or a dynamic adjustment multiple according to the product of the adjustment coefficient and the fixed adjustment amplitude or the fixed adjustment multiple; and adjusting the first scattering distance according to the dynamic adjustment amplitude or the dynamic adjustment multiple to obtain a second scattering distance.
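The fixed and dynamic adjustments described in the two embodiments above can be sketched together in Python (the coefficient formula `1 / (1 + parameter)` is a hypothetical choice of a function negatively correlated with the action parameter; the amplitude and multiple defaults are illustrative):

```python
def second_scattering_distance(first_distance, amplitude=0.5, multiple=2.0,
                               action_parameter=None):
    """Derive the second (moving-state) scattering distance from the
    first. Returns both alternatives: subtracting an adjustment
    amplitude, and dividing by an adjustment multiple (> 1). With an
    action_parameter (e.g. movement speed), a coefficient negatively
    correlated with the parameter scales the fixed amplitude into a
    dynamic one."""
    if action_parameter is not None:
        amplitude *= 1.0 / (1.0 + action_parameter)  # dynamic adjustment amplitude
    by_subtraction = max(first_distance - amplitude, 0.0)
    by_division = first_distance / multiple
    return by_subtraction, by_division
```

Either way, the second scattering distance comes out smaller than the first, which is what makes the moving-state drop zone tighter.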
In some embodiments, when the virtual shooting prop is in the burst mode, determining module 4653 is further configured to perform the following for each shooting stage in the burst mode: updating a second scattering distance corresponding to the previous shooting stage or an average distance between adjacent bullet drop points; taking the updated second scattering distance as a second scattering distance corresponding to the current shooting stage, or taking the updated average distance between adjacent bullet drop points as the average distance between the adjacent bullet drop points corresponding to the current shooting stage; wherein each shot stage comprises at least one second shot.
In some embodiments, the control module 4652 is further configured to control the virtual shooting prop to make at least one second shot at the second drop zone to fire a second number of virtual bullets; and controlling a second number of virtual bullets to be randomly distributed in a second number of bullet-falling points in the second bullet-falling area, wherein the average distance between adjacent bullet-falling points in the second bullet-falling area is smaller than the average distance between adjacent bullet-falling points in the first bullet-falling area.
In some embodiments, detection module 4655, is further configured to detect an action included in the motion state; the determining module 4653 is further configured to determine that an operation of controlling the virtual shooting prop to make at least one second shot to the second drop zone to fire a second number of virtual bullets is to be performed when the motion state includes a plurality of single actions of different types occurring at the same time.
In some embodiments, the shooting control device 465 of the virtual character further comprises a dividing module 4656 configured to divide the second drop zone into a second number of candidate zones on average; the generating module 4654 is further configured to randomly generate one drop point in each candidate region, so as to obtain a second number of drop points; the control module 4652 is further configured to assign the second number of drop points to the second number of virtual bullets, respectively, and to control the second number of virtual bullets to hit the correspondingly assigned drop points in the second drop zone according to the drop points respectively assigned to them.
In some embodiments, the dividing module 4656 is further configured to divide the second drop zone into a first sub-zone and a second sub-zone, where the centers of the first sub-zone and the second sub-zone coincide and the first sub-zone is smaller than the second sub-zone; the generating module 4654 is further configured to randomly generate a third number of drop points in the first sub-zone and assign the third number of drop points to a third number of virtual bullets, respectively, where the third number is smaller than the second number; the control module 4652 is further configured to control the third number of virtual bullets to hit the correspondingly assigned drop points in the second drop zone according to the drop points respectively assigned to them; the dividing module is further configured to averagely divide the second sub-zone into a fourth number of candidate regions, where the fourth number is the difference between the second number and the third number; the generating module 4654 is further configured to randomly generate one drop point in each candidate region, so as to obtain a fourth number of drop points; the control module 4652 is further configured to assign the fourth number of drop points to a fourth number of virtual bullets, respectively, and to control the fourth number of virtual bullets to hit the correspondingly assigned drop points in the second drop zone according to the drop points respectively assigned to them.
In some embodiments, when the second landing zone is a circular zone with a radius of R, the dividing module 4656 is further configured to divide the second landing zone into a first sub-zone with a radius of P and a second sub-zone with an inner ring radius of S and an outer ring radius of R, where P is less than or equal to S and S is less than R, with a center of the second landing zone as a center.
In some embodiments, the dividing module 4656 is further configured to divide the second sub-region with the inner ring radius S and the outer ring radius R into a fourth number of candidate regions on average, wherein each candidate region corresponds to an angular range of 360 °/the fourth number; a generating module 4654, further configured to perform the following for each candidate region: randomly generating an angle from the angle range corresponding to the candidate region and a radius between S and R; and taking the point corresponding to the angle and the radius in the candidate area as a bullet drop point corresponding to the candidate area.
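The annulus partition above can be sketched in Python (a minimal illustration, assuming the drop zone is centered at the origin; names are not from the source):

```python
import math
import random

def annulus_drop_points(inner_s, outer_r, count):
    """Divide the ring between radius S and radius R into `count` equal
    angular sectors (each spanning 360 / count degrees) and randomly
    generate one drop point per sector: a random angle inside the
    sector and a random radius between S and R."""
    sector = 360.0 / count                    # angle range per candidate region
    points = []
    for i in range(count):
        angle_deg = random.uniform(i * sector, (i + 1) * sector)
        radius = random.uniform(inner_s, outer_r)
        rad = math.radians(angle_deg)
        points.append((radius * math.cos(rad), radius * math.sin(rad)))
    return points
```

Assigning one sector per bullet guarantees the outer-ring drop points spread around the whole ring instead of all clustering on one side.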
It should be noted that, in the embodiments of the present application, the description of the device is similar to the implementation of the shooting control method for a virtual character above and has similar beneficial effects, so no further description is given. Technical details of the shooting control device for a virtual character not exhausted here can be understood from the description of any one of fig. 3, 4A, 4B, or 13.
Embodiments of the present application provide a computer program product comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the shooting control method for the virtual character described in the embodiment of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to execute the shooting control method for a virtual character provided by the embodiments of the present application, for example, the method illustrated in fig. 3, 4A, 4B, or 13.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In summary, in the embodiments of the present application, when the virtual character is in a static state and a shooting trigger operation for the virtual shooting prop is received, the virtual shooting prop is controlled to fire virtual bullets into a first, larger drop zone; when the virtual character is in a motion state and a shooting trigger operation is received, the virtual shooting prop is controlled to fire virtual bullets into a second, smaller drop zone. In this way, diversified firing patterns of virtual bullets are realized when the virtual shooting prop fires, adapting to the diverse states of the virtual character in the virtual scene and improving the precision and efficiency of shooting control.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (20)

1. A method for controlling shooting of a virtual character, the method comprising:
displaying a virtual character and a virtual shooting prop held by the virtual character in a virtual scene, wherein the virtual shooting prop can simultaneously launch a plurality of virtual bullets at each shooting;
in response to a first firing trigger operation for the virtual firing prop while the virtual character is in a stationary state, controlling the virtual firing prop to fire a first number of virtual bullets to a first bullet drop zone;
in response to a second firing trigger operation for the virtual firing prop when the virtual character is in a motion state, controlling the virtual firing prop to fire a second number of virtual bullets to a second drop zone, wherein the second drop zone is smaller than the first drop zone.
2. The method of claim 1, wherein the controlling the virtual shooting prop to fire a first number of virtual bullets to a first drop zone comprises:
Determining a first bullet drop area based on a first shooting collision position and a first scattering distance, wherein the first scattering distance is a corresponding scattering distance when the virtual character is in a static state;
and controlling the virtual shooting prop to carry out at least one time of first shooting to the first bullet falling area so as to launch a first number of virtual bullets, and controlling the first number of virtual bullets to be randomly distributed in a first number of bullet falling points in the first bullet falling area.
3. The method of claim 2, wherein determining the first drop zone based on the first firing impact location and the first scattering distance comprises:
generating a first detection ray extending along the shooting direction of the at least one first shot by taking the launching port of the virtual shooting prop as a starting point;
and determining a first bullet drop area based on a first scattering distance by taking a first shooting collision position between the first detection ray and a first virtual obstacle in the virtual scene as a reference point.
4. The method of claim 2, wherein said controlling said first number of virtual bullets to be randomly distributed among a first number of drop points in said first drop zone comprises:
The following processing is performed for each virtual bullet:
randomly generating a drop point in the first drop zone as the drop point corresponding to the virtual bullet, and controlling the virtual bullet to hit the drop point.
5. The method of claim 1, wherein the controlling the virtual shooting prop to fire a second number of virtual bullets to a second drop zone comprises:
determining a second bullet drop area based on a second shooting collision position and a second scattering distance, wherein the second scattering distance is smaller than a corresponding first scattering distance when the virtual character is in a static state;
and controlling the virtual shooting prop to carry out at least one second shooting to the second bullet falling area so as to launch a second number of virtual bullets, and controlling the second number of virtual bullets to be randomly distributed in a second number of bullet falling points in the second bullet falling area.
6. The method of claim 5, wherein prior to controlling the virtual shooting prop to make at least one second shot at the second drop zone to fire a second number of virtual bullets, the method further comprises:
detecting an action included in the motion state;
And when the motion state comprises a single action of one type or a plurality of single actions of different types occurring in succession, determining to switch to the operation of controlling the virtual shooting prop to make at least one second shot to the second drop zone to fire a second number of virtual bullets.
7. The method of claim 5, wherein determining a second drop zone based on the second shot collision location and the second scattering distance comprises:
generating a second detection ray extending along the shooting direction of the at least one second shot by taking the launching port of the virtual shooting prop as a starting point;
and determining a second bullet drop area based on a second scattering distance by taking a second shooting collision position between the second detection ray and a second virtual obstacle in the virtual scene as a reference point.
8. The method of claim 5, wherein prior to determining a second drop zone based on a second shot collision location and a second scattering distance, the method further comprises:
determining the second scattering distance by any one of:
subtracting a fixed adjustment amplitude from the first scattering distance to obtain a second scattering distance;
And dividing the first scattering distance by a fixed adjustment multiple to obtain the second scattering distance, wherein the adjustment multiple is greater than 1.
9. The method of claim 5, wherein prior to determining a second drop zone based on a second shot collision location and a second scattering distance, the method further comprises:
determining a second scattering distance by:
detecting motion parameters of a motion included in the motion state;
determining an adjustment coefficient negatively correlated with the action parameter, and correspondingly obtaining a dynamic adjustment amplitude or a dynamic adjustment multiple according to the product of the adjustment coefficient and the fixed adjustment amplitude or the fixed adjustment multiple;
and adjusting the first scattering distance according to the dynamic adjustment amplitude or the dynamic adjustment multiple to obtain the second scattering distance.
10. The method of any one of claims 5-9, wherein when the virtual shooting prop is in a burst mode, the method further comprises:
performing the following for each firing phase in the burst mode:
updating a second scattering distance corresponding to the previous shooting stage or an average distance between adjacent bullet drop points;
Taking the updated second scattering distance as a second scattering distance corresponding to the current shooting stage, or taking the updated average distance between adjacent bullet drop points as the average distance between the adjacent bullet drop points corresponding to the current shooting stage;
wherein each of the shot stages comprises at least one second shot.
11. The method of claim 1, wherein the controlling the virtual shooting prop to fire a second number of virtual bullets to a second drop zone comprises:
controlling the virtual shooting prop to shoot a second number of virtual bullets at least once to a second bullet falling area;
and controlling the second number of virtual bullets to be randomly distributed in the second number of bullet-falling points in the second bullet-falling area, wherein the average distance between the adjacent bullet-falling points in the second bullet-falling area is smaller than the average distance between the adjacent bullet-falling points in the first bullet-falling area.
12. The method of claim 11, wherein prior to controlling the virtual shooting prop to make at least one second shot at a second drop zone to fire a second number of virtual bullets, the method further comprises:
detecting an action included in the motion state;
When the motion state comprises a plurality of single actions of different types occurring simultaneously, determining to switch to the operation of controlling the virtual shooting prop to make at least one second shot to the second drop zone to fire a second number of virtual bullets.
13. The method of claim 12, wherein said controlling said second number of virtual bullets to be randomly distributed among a second number of drop points in said second drop zone comprises:
averagely dividing the second falling area into a second number of candidate areas;
randomly generating a drop point in each candidate area to obtain a second number of drop points;
assigning the second number of drop points to the second number of virtual bullets, respectively;
and controlling the second number of virtual bullets to respectively hit the correspondingly distributed drop points in the second drop zone according to the drop points respectively distributed to the second number of virtual bullets.
14. The method of claim 12, wherein said controlling said second number of virtual bullets to be randomly distributed among a second number of drop points in said second drop zone comprises:
Dividing the second falling bullet area into a first sub area and a second sub area, wherein the centers of the first sub area and the second sub area are overlapped, and the first sub area is smaller than the second sub area;
randomly generating a third number of drop points in the first sub-area, and respectively allocating the third number of drop points to a third number of virtual bullets, wherein the third number is smaller than the second number;
controlling the third number of virtual bullets to respectively hit the correspondingly distributed drop points in the second drop zone according to the drop points respectively distributed to the third number of virtual bullets;
dividing the second sub-region into a fourth number of candidate regions on average, wherein the fourth number is a difference between the second number and the third number;
randomly generating a bullet drop point in each candidate area to obtain a fourth number of bullet drop points;
assigning the fourth number of drop points to the fourth number of virtual bullets, respectively;
and controlling the fourth number of virtual bullets to respectively hit the correspondingly distributed drop points in the second drop zone according to the drop points respectively distributed to the fourth number of virtual bullets.
15. The method of claim 14,
when the second drop zone is a circular zone with a radius R,
the dividing the second falling bullet area into a first sub area and a second sub area includes:
and dividing the second falling area into a first sub-area with the radius of P and a second sub-area with the radius of an inner ring of S and the radius of an outer ring of R by taking the center of the second falling area as the center of a circle, wherein P is less than or equal to S, and S is less than R.
16. The method of claim 15,
the averagely dividing the second sub-region into a fourth number of candidate regions comprises:
averagely dividing the second sub-area with the inner ring radius of S and the outer ring radius of R into a fourth number of candidate areas, wherein the angle range corresponding to each candidate area is 360 degrees/the fourth number;
the randomly generating a bullet drop point in each candidate area comprises:
performing the following processing for each of the candidate regions:
randomly generating an angle from the angle range corresponding to the candidate region and a radius between S and R;
and taking the point corresponding to the angle and the radius in the candidate area as a bullet drop point corresponding to the candidate area.
17. A shooting control apparatus for a virtual character, the apparatus comprising:
the virtual shooting prop comprises a display module, a shooting module and a control module, wherein the display module is used for displaying a virtual character and a virtual shooting prop held by the virtual character in a virtual scene, and the virtual shooting prop can simultaneously shoot a plurality of virtual bullets when shooting at each time;
the control module is used for responding to a first shooting trigger operation aiming at the virtual shooting prop when the virtual character is in a static state, and controlling the virtual shooting prop to launch a first number of virtual bullets to a first bullet drop area;
the control module is further configured to control the virtual shooting prop to launch a second number of virtual bullets to a second bullet drop area in response to a second shooting trigger operation for the virtual shooting prop when the virtual character is in a motion state, where the second bullet drop area is smaller than the first bullet drop area.
18. An electronic device, characterized in that the electronic device comprises:
a memory for storing executable instructions;
a processor for implementing the method of controlling the firing of a virtual character according to any one of claims 1 to 16 when executing executable instructions stored in the memory.
19. A computer-readable storage medium storing executable instructions, wherein the executable instructions, when executed by a processor, implement the method of controlling shooting by a virtual character of any one of claims 1-16.
20. A computer program product comprising a computer program or instructions, characterized in that the computer program or instructions, when executed by a processor, implement the method of controlling shooting by a virtual character according to any one of claims 1-16.
CN202111019626.3A 2021-09-01 2021-09-01 Shooting control method and device for virtual character, electronic equipment and storage medium Active CN113663329B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111019626.3A CN113663329B (en) 2021-09-01 2021-09-01 Shooting control method and device for virtual character, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113663329A true CN113663329A (en) 2021-11-19
CN113663329B CN113663329B (en) 2024-04-02

Family

ID=78548017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111019626.3A Active CN113663329B (en) 2021-09-01 2021-09-01 Shooting control method and device for virtual character, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113663329B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011182929A (en) * 2010-03-08 2011-09-22 Konami Digital Entertainment Co Ltd Game program, game device, and game control method
CN102824747A (en) * 2011-06-14 2012-12-19 科乐美数码娱乐株式会社 Game machine and storage medium storing computer programs
CN108043032A (en) * 2017-12-29 2018-05-18 武汉艺术先生数码科技有限公司 Shooting game system based on AR
CN111001159A (en) * 2019-12-06 2020-04-14 腾讯科技(深圳)有限公司 Virtual item control method, device, equipment and storage medium in virtual scene
CN112274931A (en) * 2020-11-20 2021-01-29 网易(杭州)网络有限公司 Shooting track processing method and device and electronic equipment
KR102237380B1 (en) * 2020-10-16 2021-04-09 육군사관학교 산학협력단 Device for analyzing impact point and vitual shooting training simulation system using the same
CN112657183A (en) * 2021-01-04 2021-04-16 网易(杭州)网络有限公司 Game display control method and device, storage medium and computer equipment
CN113239576A (en) * 2021-06-10 2021-08-10 北京字跳网络技术有限公司 Impact point determining method, terminal, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
猿天刚 (Yuan Tiangang): "[Call of Duty Mobile] In-depth review of the KRM262, the shotgun favored by pro players with the highest pick rate", Retrieved from the Internet <URL:https://haokan.baidu.com/v?vid=11248771334683296201> *

Also Published As

Publication number Publication date
CN113663329B (en) 2024-04-02

Similar Documents

Publication Publication Date Title
US11712634B2 (en) Method and apparatus for providing online shooting game
CN108654086B (en) Method, device and equipment for obtaining attack damage in virtual environment
CN110732135B (en) Virtual scene display method and device, electronic equipment and storage medium
CN110585712A (en) Method, device, terminal and medium for throwing virtual explosives in virtual environment
JP2022539289A (en) VIRTUAL OBJECT AIMING METHOD, APPARATUS AND PROGRAM
US20230013014A1 (en) Method and apparatus for using virtual throwing prop, terminal, and storage medium
CN112121414B (en) Tracking method and device in virtual scene, electronic equipment and storage medium
WO2021227733A1 (en) Method and apparatus for displaying virtual prop, and device and storage medium
JP7447296B2 (en) Interactive processing method, device, electronic device and computer program for virtual tools
US20230033530A1 (en) Method and apparatus for acquiring position in virtual scene, device, medium and program product
WO2022156491A1 (en) Virtual object control method and apparatus, and device, storage medium and program product
JP2023543519A (en) Virtual item input method, device, terminal, and program
WO2022007567A1 (en) Virtual resource display method and related device
JP2024511796A (en) Virtual gun shooting display method and device, computer equipment and computer program
CN111202983A (en) Method, device, equipment and storage medium for using props in virtual environment
US20230030619A1 (en) Method and apparatus for displaying aiming mark
CN113703654B (en) Camouflage processing method and device in virtual scene and electronic equipment
CN113663329B (en) Shooting control method and device for virtual character, electronic equipment and storage medium
CN112156472B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN114146412A (en) Display control method and device in game, electronic equipment and storage medium
CN114225419B (en) Virtual prop control method, device, equipment, storage medium and program product
CN113769392B (en) Method and device for processing state of virtual scene, electronic equipment and storage medium
ZHANG et al. FPS Game Design and Implementation Based on Unity3D
CN117298580A (en) Virtual object interaction method, device, equipment, medium and program product
CN114191817A (en) Shooting control method and device for virtual character, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (country code: HK; legal event code: DE; document number: 40055279)
GR01 Patent grant