WO2022007569A1 - Prop control method and apparatus, storage medium, and electronic device - Google Patents

Prop control method and apparatus, storage medium, and electronic device

Info

Publication number
WO2022007569A1
WO2022007569A1 PCT/CN2021/098687 CN2021098687W WO2022007569A1 WO 2022007569 A1 WO2022007569 A1 WO 2022007569A1 CN 2021098687 W CN2021098687 W CN 2021098687W WO 2022007569 A1 WO2022007569 A1 WO 2022007569A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
shooting
prop
gesture
area
Prior art date
Application number
PCT/CN2021/098687
Other languages
English (en)
French (fr)
Inventor
杨金昊
Original Assignee
腾讯科技(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to JP2022572422A (patent JP7419568B2, ja)
Priority to KR1020227031420A (patent KR20220139967A, ko)
Publication of WO2022007569A1 (zh)
Priority to US18/046,122 (patent US20230057421A1, en)
Priority to JP2024001020A (patent JP2024062977A, ja)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/219 Input arrangements for video game devices characterised by their sensors, purposes or types for aiming at specific areas on the display, e.g. light-guns
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/245 Constructional details thereof specially adapted to a particular type of game, e.g. steering wheels
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/428 Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/44 Processing input control signals involving timing of operations, e.g. performing an action within a time slot
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets

Definitions

  • the present application relates to the field of computers, and in particular, to a prop control method and device, a storage medium, and an electronic device.
  • Shooting props may include lightweight equipment props (such as knives or pistols) and heavyweight equipment props (such as bombs or mortars).
  • In the related art, each trigger operation triggers the shooting action only once, so the player needs to trigger the shooting action frequently when using the shooting prop.
  • Moreover, operating the shooting prop may be complicated, resulting in low control efficiency when controlling the shooting prop to execute the shooting action.
  • An embodiment of the present application provides a prop control method, including: acquiring a trigger operation performed on a target shooting prop in an activated state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; determining, in response to the trigger operation, the target object aimed at by the target shooting prop; and performing, within a target time period, multiple consecutive shooting actions with the target shooting prop toward the target area where the target object is located.
  • An embodiment of the present application provides a prop control device, including: an acquisition unit configured to acquire a trigger operation performed on a target shooting prop in an activated state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; a determining unit configured to determine, in response to the trigger operation, the target object aimed at by the target shooting prop; and a control unit configured to perform, within a target time period, multiple consecutive shooting actions with the target shooting prop toward the target area where the target object is located.
  • Embodiments of the present application provide a computer-readable storage medium storing a computer program, where the computer program is used to execute the above prop control method when run.
  • An embodiment of the present application provides an electronic device, including a memory and a processor, where a computer program is stored in the memory, and the processor is configured to execute the above-mentioned prop control method through the computer program.
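  • The claimed control flow can be illustrated with a minimal sketch in Python. The class below is a non-authoritative example under assumed names and parameters (TargetShootingProp, shots_per_burst, object_under_crosshair and the like are not from the specification); it only shows the order of operations: obtain the trigger operation, resolve the aimed target object, then fire several consecutive shots within one target time period.

```python
import time

class TargetShootingProp:
    """Minimal sketch of the claimed flow; all names and numbers are illustrative."""

    def __init__(self, shots_per_burst=3, shot_interval=0.2, burst_window=1.0):
        self.shots_per_burst = shots_per_burst  # shots fired per trigger operation
        self.shot_interval = shot_interval      # gap between adjacent shots (< first threshold)
        self.burst_window = burst_window        # the "target time period"
        self.activated = True                   # a frozen prop would set this to False

    def on_trigger(self, scene):
        """Called when a trigger operation on the activated prop is acquired."""
        if not self.activated:
            return                               # a frozen prop ignores trigger operations
        target = scene.object_under_crosshair()  # determine the target object being aimed at
        start = time.monotonic()
        fired = 0
        while fired < self.shots_per_burst and time.monotonic() - start < self.burst_window:
            self.shoot(target.area())            # one shooting action toward the target area
            fired += 1
            time.sleep(self.shot_interval)

    def shoot(self, area):
        print(f"shooting at area {area}")
```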
  • FIG. 1 is a schematic diagram of a prop control system according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a prop control method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of adjusting the reticle size according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a prop control method according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a gesture-triggered operation according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a shooting flight trajectory according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the relationship between an operation duration and a shooting parameter in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an explosion in a target area according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a collision with a first reference object according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the relationship between the distance and the variation range of the life value according to the embodiment of the present application.
  • FIG. 11 is a schematic diagram of adjusting the aiming direction according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a prop configuration according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a display state of an adjustment prop trigger icon according to an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a prop control device according to an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • Embodiments of the present application provide a prop control method.
  • The above prop control method may be applied to, but is not limited to, the prop control system in the environment shown in FIG. 1, where the prop control system may include, but is not limited to, the terminal device 102, the network 104, and the server 106.
  • the terminal device 102 runs a shooting application client shown in FIG. 1 , such as a shooting game application client.
  • the above-mentioned terminal device 102 includes a human-computer interaction screen 1022 , a processor 1024 and a memory 1026 .
  • The human-computer interaction screen 1022 is used to present the scene picture of the virtual battle scene provided by the shooting task run by the above shooting application client, where the shooting task is a confrontation task carried out between multiple virtual objects in the virtual battle scene. It is also used to provide a human-computer interaction interface for acquiring human-computer interaction operations (such as a trigger operation) performed on the human-computer interaction interface of the shooting application client.
  • the processor 1024 is configured to generate a corresponding operation instruction in response to the above trigger operation, and control the virtual object in the virtual battle scene to perform a corresponding shooting action according to the operation instruction.
  • the memory 1026 is used to store the scene picture in the virtual battle scene provided by the above-mentioned shooting task and the attribute information of the virtual object in the virtual battle scene, such as the object identifier, the information of the props equipped by the object, and the life value of the object.
  • the server 106 includes a database 1062 and a processing engine 1064.
  • The database 1062 is used to store the battle results generated by the above virtual objects in the shooting task, and is also used to provide the above clients with corresponding battle resources, such as prop attribute information and picture resources for rendering shooting effects.
  • the processing engine 1064 is used to determine the current battle result and the battle resources required by the client, and send them to the shooting application client in the terminal device 102 .
  • An example process is as follows. In steps S102-S104, during a shooting task running in the terminal device 102 (a shooting battle task between the virtual object 10 shown in FIG. 1 and the virtual object holding the target shooting prop 11), the trigger operation performed on the target shooting prop in the activated state is obtained through the human-computer interaction screen 1022. Then, in steps S104-S106, the target object aimed at by the target shooting prop is determined in response to the trigger operation.
  • Within the target time period, the target shooting prop is used to perform multiple consecutive shooting actions toward the target area where the target object is located, where the time interval between two adjacent shooting actions can be smaller than a first threshold.
  • In step S108, the battle result of the above process is sent to the server 106 through the network 104.
  • When the server 106 receives the above battle result, it executes steps S110-S114: it saves the battle result, acquires the battle resources, and sends the battle resources to the terminal device 102, so that the terminal device 102 renders and displays them on the human-computer interaction screen 1022.
  • the terminal device 102 may also store the battle result locally, and render and display it on the human-computer interaction screen 1022 according to the locally stored battle resources.
  • the terminal device 102 may run an offline version (stand-alone version) of the shooting application client.
  • In the embodiments of the present application, the trigger operation performed on the target shooting prop in the activated state in the virtual battle scene is obtained, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; the target object aimed at by the target shooting prop is determined in response to the trigger operation; and within the target time period, the target shooting prop is used to perform multiple consecutive shooting actions toward the target area where the target object is located.
  • In this way, the equipping operation of the target shooting prop during the game is simplified and the number of shots obtained from each use of the target shooting prop is increased, which improves control efficiency and overcomes the problem in the related art that a player (user) needs a long preparation time when using a shooting prop to perform a shooting action, resulting in low control efficiency.
  • the above-mentioned terminal device may be a terminal device configured with a shooting application client, which may include, but is not limited to, at least one of the following: a mobile phone (such as an Android mobile phone, an iOS mobile phone, etc.), a laptop computer, a tablet computer (Portable Android Device , PAD), PDAs, Mobile Internet Devices (MIDs), desktop computers, smart TVs, etc.
  • The above network may include, but is not limited to, at least one of a wired network and a wireless network, where the wired network includes at least one of a local area network, a metropolitan area network, and a wide area network, and the wireless network includes at least one of Bluetooth, WiFi, and other networks implementing wireless communication.
  • the above server may be a single server, a server cluster composed of multiple servers, or a cloud server. The above is only an example, which is not limited in this embodiment.
  • The above prop control method includes:
  • S202, acquiring a trigger operation performed on a target shooting prop in an activated state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene;
  • S204, determining, in response to the trigger operation, the target object aimed at by the target shooting prop;
  • S206, within the target time period, performing multiple consecutive shooting actions toward the target area where the target object is located by using the target shooting prop.
  • the above-mentioned prop control method can be applied to, but not limited to, shooting applications.
  • In a shooting task configured with a target shooting prop, after the trigger operation performed on the target shooting prop in the activated state is acquired, the target object being aimed at is determined in response to the trigger operation, and multiple consecutive shooting actions are performed toward the target area where the target object is located within the target time period. In this way, the equipping operation of the target shooting prop during the game is simplified and the number of shots per use is increased, which improves control efficiency and overcomes the problem in the related art of low control efficiency caused by the long preparation a player needs when using a shooting prop to perform a shooting action.
  • The above shooting application may be a military simulation application (for running military simulation tasks), or may be a shooting game application, such as a multiplayer online battle arena (Multiplayer Online Battle Arena, MOBA) application or a single-player game (Single-Player Game, SPG) application.
  • the types of the above shooting game applications may include but are not limited to at least one of the following: two-dimensional (Two Dimension, 2D) game applications, three-dimensional (Three Dimension, 3D) game applications, virtual reality (Virtual Reality, VR) game applications, augmented reality (Augmented Reality, AR) game applications, mixed reality (Mixed Reality, MR) game applications.
  • the above is just an example, which is not limited in this embodiment.
  • The above shooting game application may be a third-person shooting game (Third Person Shooting Game, TPS) application, which is run from the perspective of a third-party character object (i.e., a third-person perspective) other than the virtual character (virtual object) controlled by the current player, or a first-person shooting game (First Person Shooting Game, FPS) application run from the perspective of the virtual character controlled by the current player.
  • Through each shooting application client, a player can perform the following controls: controlling a virtual character (also called a player character) to perform specified actions, controlling the virtual character to interact with non-player characters (Non-Player Character, NPC), controlling the virtual character to interact with static objects in the virtual battle scene (such as buildings, trees, etc.), and controlling the virtual character to use the props and vehicles equipped for it in the virtual battle scene.
  • The props here may include, but are not limited to, heavyweight shooting props. The operations required to equip a heavyweight shooting prop once are more complicated and time-consuming, but the attack range of a heavyweight shooting prop is larger than that of a lightweight equipment prop. The above is only an example, which is not limited in this embodiment.
  • The above trigger operation may include, but is not limited to, at least one of the following interactive operations: a touch-screen click operation, a pressing operation, a touch-screen sliding operation, a gesture operation, a gesture indication operation (a gesture-triggered operation), a pupil locking operation, and the like.
  • For example, the trigger operation of the target shooting prop can be set in advance as a specified gesture operation or a specified posture; then, while the task is running, when the target shooting prop is in the activated state and the camera detects that the player shows the specified gesture operation or specified posture, it is determined to trigger the aiming and shooting process of the target shooting prop.
  • For another example, the trigger operation of the target shooting prop can be set in advance as a touch-screen button displayed on the human-computer interaction screen. When the target shooting prop is in the activated state and a touch-screen click operation on the touch-screen button is detected, it is determined to trigger the aiming and shooting process of the target shooting prop.
  • The current virtual character controlled by the player can be equipped with, but not limited to, two types of shooting props: 1) standing shooting props (hand-held lightweight shooting props are allowed); 2) shooting props to be loaded (hand-held lightweight shooting props and/or heavyweight shooting props are allowed).
  • The trigger icon corresponding to the above standing shooting props can be directly displayed in the operation area of the human-computer interaction interface while the shooting task is running, without an additional prop loading interface, so that the player can directly control the current virtual character to use this type of shooting prop to perform shooting actions. That is, for the current virtual character, there is no need to perform additional prop equipping operations or additional weapon equipping actions, thereby simplifying the prop control operation.
  • The above shooting props to be loaded are stored in the prop backpack of the current virtual character and need to be added to the operation area through an additional prop loading interface (configuration interface) before the player can control the current virtual character to use this type of shooting prop to perform shooting actions. Among them, for heavyweight shooting props, due to their large size, a certain amount of preparation time is needed each time they are equipped on the current virtual character or reloaded; they are also configured with a use cooldown time (Cold Time, CD). That is to say, after each use, this type of shooting prop is in a frozen state during the cooldown time, and the frozen state indicates that the shooting prop does not respond when a trigger operation is detected, i.e., it cannot be called to perform a shooting action.
  • the above-mentioned target shooting props may be, but are not limited to, heavyweight shooting props (also referred to as ultimate weapons) among the above-mentioned shooting props to be loaded.
  • the target shooting props here can be heavyweight bow and crossbow props, gun props and so on.
  • For example, the target shooting prop may be a heavyweight powerful crossbow, which supports long-range precise shooting; in addition, a powerful explosive is bound to the arrow of the crossbow, which also provides large-area damage after hitting.
  • For such a target shooting prop, the target object being aimed at is determined in response to the trigger operation, and continuous shooting actions within the target time period are realized by controlling the target shooting prop, thereby improving the shooting efficiency of such heavyweight shooting props with a large attack range and making control in the application more efficient.
  • the above-mentioned target shooting props will enter a freezing state after completing multiple consecutive shooting actions in response to a trigger operation.
  • The use cooldown time (CD) corresponding to the frozen state may be set to different values according to different prop attribute requirements, and the value is not limited in this embodiment.
  • The remaining time until the target shooting prop exits the frozen state may be prompted to the player controlling the virtual character by means of a countdown, but is not limited thereto.
  • The manner of the countdown prompt may include, but is not limited to, one of the following: a countdown in seconds, a countdown by a progress bar, and the like. This is an example, which is not limited in this embodiment.
  • When the target object aimed at by the target shooting prop is determined, the size of the crosshair (reticle) may be determined according to the operation attribute information of the trigger operation, where the size of the crosshair is negatively correlated with the collision-free flight distance of the prop object shot by the target shooting prop. For example, the smaller the displayed size of the reticle, the longer the collision-free flight distance of the prop object, and the higher the hit accuracy.
  • the operation attribute information of the trigger operation can be used as a reference for adjusting the size of the reticle (also the aiming distance).
  • the operation attribute information here may include, but is not limited to: the pressing duration of the touch-screen pressing operation, the pressing force (pressing pressure) of the touch-screen pressing operation, the sliding distance of the touch-screen sliding operation, the holding duration of the gesture operation, and the holding time of the gesture-triggered operation. time etc.
  • The operation attribute information is used as a reference condition for adjusting the reticle. For example, the longer the pressing duration of a touch-screen pressing operation, the smaller the size of the reticle; when a touch-screen sliding operation slides in a first direction, the longer the sliding distance, the smaller the size of the reticle, and when it slides in a second direction, the longer the sliding distance, the larger the size of the reticle.
  • For example, when the pressing duration is short, the size of the crosshair 302 of the target shooting prop can be as shown in (a) of FIG. 3; when the pressing duration is longer, the size of the reticle 302 may be as shown in (b) of FIG. 3. That is to say, the longer the pressing duration of the touch-screen pressing operation, i.e., the longer the charging time, the smaller the size of the pre-aimed reticle of the target shooting prop.
  • the above is an example, which is not limited in this embodiment.
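  • As an illustration of the negative correlation between pressing duration (charging time) and reticle size, the following hedged sketch maps a press duration to a clamped reticle size; the concrete range, full-charge time, and linear curve are assumptions, not values from the specification.

```python
def reticle_size(press_duration, max_size=100.0, min_size=20.0, full_charge_time=2.0):
    """Map the pressing duration of the trigger operation to a reticle size (in pixels).

    Longer presses (longer charge) yield a smaller reticle, i.e. higher aiming
    accuracy and a longer collision-free flight distance. Values are illustrative.
    """
    charge = min(press_duration / full_charge_time, 1.0)  # normalized charge in [0, 1]
    return max_size - (max_size - min_size) * charge      # linear shrink toward min_size
```

  • For example, reticle_size(0.0) returns the largest reticle while reticle_size(2.0) returns the smallest, matching the "longer charge, smaller pre-aimed reticle" behaviour described above.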
  • In step S402, a shooting game task starts to run. Then, in step S404, it is checked whether the cooldown time of the target shooting prop has ended; when the cooldown time has ended, in step S406, it is determined that the target shooting prop is in the activated state. Then, in step S408, it is detected whether a trigger operation for the above target shooting prop is obtained; if the trigger operation is obtained, step S410 is performed, otherwise the process returns to step S406.
  • In steps S410-S412, the target shooting prop is called, and it is detected whether the fire button (the shooting action trigger button) is pressed; if it is pressed, step S414 is performed, otherwise the process returns to step S410. If it is detected that the fire button is pressed, in steps S414-S416, the target shooting prop is controlled to enter the pre-aiming state, and it is detected whether the fire button is continuously pressed. If the fire button is continuously pressed, step S418 is executed to reduce the size of the crosshair according to the pressing duration.
  • In step S420, it is detected whether the fire button is released; if it is released, step S422 is executed, otherwise the process returns to step S418. In step S422, the target shooting prop is used to perform multiple consecutive shooting actions. Then, in steps S424-S426, it is detected whether the prop objects (such as bullets or arrows) shot by the target shooting prop collide with objects (virtual objects) set in the virtual battle scene (game scene); if a collision occurs, the prop object is controlled to explode. Then, steps S428-S430 are performed to detect whether there are target objects within the explosion range, and if so, these target objects are controlled to take damage, that is, their health values are reduced accordingly.
  • FIG. 4 is an example, which is used to describe an implementation manner in the embodiment of the present application, and the present embodiment does not make any limitation on the sequence of steps shown in FIG. 4 and the means to be performed.
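  • The S402-S430 flow can be read as a small state machine. The snippet below is only an illustrative sketch of that reading (it reuses the reticle_size helper from the earlier sketch; field names such as fire_pressed and cooldown_left are assumptions, not terms from the specification).

```python
from enum import Enum, auto

class PropState(Enum):
    COOLDOWN = auto()
    ACTIVE = auto()
    PRE_AIMING = auto()

def handle_frame(prop, input_state, dt):
    """One per-frame update step mirroring the S402-S430 flow (field names are assumptions)."""
    if prop.state is PropState.COOLDOWN:
        prop.cooldown_left -= dt
        if prop.cooldown_left <= 0:
            prop.state = PropState.ACTIVE            # S404-S406: cooldown over, prop activated
    elif prop.state is PropState.ACTIVE:
        if input_state.fire_pressed:                 # S410-S412: fire button pressed
            prop.state = PropState.PRE_AIMING        # S414: enter the pre-aiming state
            prop.press_time = 0.0
    elif prop.state is PropState.PRE_AIMING:
        if input_state.fire_pressed:                 # S416-S418: keep charging, shrink the crosshair
            prop.press_time += dt
            prop.crosshair = reticle_size(prop.press_time)
        else:                                        # S420-S422: fire button released, fire the burst
            prop.fire_burst()                        # consecutive shots, collisions and explosions follow
            prop.state = PropState.COOLDOWN          # the prop re-enters the frozen state
            prop.cooldown_left = prop.cooldown_total
```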
  • determining the target object that the target shooting prop is aimed at includes:
  • the object selected by the crosshair is determined as the target object.
  • the above triggering operation may be, but not limited to, at least one of the following: a touch screen operation, a gesture operation, a gesture-triggered operation, and the like.
  • the above-mentioned touch screen operation may include, but is not limited to, at least one of the following: a pressing operation, a sliding operation, and the like.
  • The operation attribute information here may be attribute information such as the operation duration of the trigger operation, the operation direction of the trigger operation, and the operation frequency of the trigger operation. That is to say, the size of the reticle can be adjusted unidirectionally; for example, as the operation duration increases, the size of the reticle becomes smaller and smaller.
  • The size of the reticle can also be adjusted bidirectionally; for example, the adjustment direction is determined according to the sliding direction or the gesture direction, and the adjustment scale is further determined according to the sliding distance or the gesture movement distance. For example, swiping left decreases the reticle and swiping right increases it. This is an example, and no limitation is made in this embodiment.
  • the process of adjusting the reticle of the currently displayed target shooting prop in the aiming direction to the target size matching the operation attribute information may include at least one of the following situations:
  • When the operation attribute information is the operation duration, the reticle is adjusted to the target size matching the operation duration, where the operation duration is negatively correlated with the target size, and the operation duration includes any one of the pressing duration of a pressing operation, the gesture holding duration of a gesture operation, and the holding duration of a gesture-triggered operation;
  • When the operation attribute information is the sliding distance of a sliding operation, the reticle is adjusted to the target size matching the sliding distance, where the sliding distance obtained by a sliding operation performed in the first sliding direction is negatively correlated with the target size, and the sliding distance obtained by a sliding operation performed in the second sliding direction is positively correlated with the target size;
  • When the operation attribute information is the gesture movement distance of a gesture operation, the reticle is adjusted to the target size matching the gesture movement distance, where the gesture movement distance obtained by a gesture operation performed in the first gesture direction is positively correlated with the target size, and the gesture movement distance obtained by a gesture operation performed in the second gesture direction is negatively correlated with the target size.
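  • A possible bidirectional adjustment, assuming the sign convention described above (one sliding direction shrinks the reticle, the other enlarges it), could look like the following sketch; the scale factor and bounds are illustrative.

```python
def adjust_reticle_by_slide(current_size, slide_dx, scale=0.5,
                            min_size=20.0, max_size=100.0):
    """Bidirectional adjustment sketch: sliding in one direction shrinks the reticle,
    sliding in the other enlarges it, scaled by the slide distance.
    The sign convention (left = shrink) merely echoes the example in the description."""
    delta = abs(slide_dx) * scale
    if slide_dx < 0:          # e.g. swipe left: decrease the reticle (first sliding direction)
        new_size = current_size - delta
    else:                     # e.g. swipe right: increase the reticle (second sliding direction)
        new_size = current_size + delta
    return min(max(new_size, min_size), max_size)
```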
  • During the pre-aiming adjustment, the player can simultaneously control and adjust the viewing direction of the current virtual character (that is, the aiming direction of the target shooting prop) through the shooting application client, and can also control the current virtual character to move a certain distance.
  • the target object aimed at by the target shooting props will also be adjusted and updated accordingly, so that the updated target object can be shot quickly and accurately.
  • the player 50 is running a shooting game task through a somatosensory shooting game application.
  • The image of the player 50 is captured by the camera 500 to identify the current gesture or posture of the player 50. If the posture matches the designated gesture action 502 prompted in the shooting game task, it is determined that a trigger operation is obtained. Further, the reticle of the target shooting prop is adjusted according to the detected holding duration of the above posture. For example, the longer the holding duration, the smaller the size of the reticle of the target shooting prop, the higher the aiming accuracy, the longer the shooting distance, and the higher the success rate of hitting the target object.
  • the aiming point of the target shooting prop in the aiming direction is adjusted, so as to realize the pre-aim adjustment for the target object, thereby achieving the effect of improving the shooting hit rate.
  • the method further includes:
  • S1, acquiring shooting parameters matching the operation attribute information of the trigger operation, where the shooting parameters include the initial shooting velocity and initial gravitational acceleration of the prop object shot by the target shooting prop;
  • S2, determining the shooting flight trajectory of the prop object in the aiming direction according to the shooting parameters, where the flight distance of the shooting flight trajectory in the absence of collision is negatively correlated with the target size of the reticle.
  • the prop objects shot by the above-mentioned target shooting props may include, but are not limited to: bows and arrows shot by a crossbow, bullets or explosives shot by a gun, and the like.
  • The prop object configured in the virtual battle scene simulates physical movement in the real world: after being shot, it flies for a period of time along a ballistic trajectory until it lands or collides with another reference object.
  • the flight trajectory obtained in the above-mentioned flight process is related to the shooting parameters (ie, the initial shooting speed and the initial gravitational acceleration) of the above-mentioned prop object when shooting.
  • Take the case where the size of the reticle is adjusted by the pressing duration of a touch-screen pressing operation as an example. While the size of the reticle is adjusted, the flight trajectory (shooting flight trajectory) is also determined according to the shooting parameters of the currently used target shooting prop. Assuming that the pressing duration of the touch-screen pressing operation is T, the flight trajectory of the prop object in the case of no collision (directly landing) can be as shown in (a) of FIG. 6, where the initial shooting velocity at the initial position A is v0 and the initial gravitational acceleration is g0. After reaching position B after a period of time, the shooting speed drops to v1 and the gravitational acceleration also drops to g1.
  • If the pressing duration of the touch-screen pressing operation is 2T, that is, the press is held longer, the flight trajectory generated by the prop object in the case of no collision (directly landing) can be as shown in (b) of FIG. 6, where the initial shooting velocity at the initial position C is v2 and the initial gravitational acceleration is g2. After reaching position D after a period of time, the shooting speed drops to v3 and the gravitational acceleration also drops to g3.
  • Since the pressing duration of the touch-screen pressing operation shown in (b) of FIG. 6 is longer than that shown in (a) of FIG. 6, the initial shooting velocity v2 is also greater than the initial shooting velocity v0 shown in (a) of FIG. 6.
  • FIG. 6 is a reference example, which is not limited in this embodiment.
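  • The flight trajectory determined from the initial shooting velocity and initial gravitational acceleration can be approximated by a simple ballistic simulation. The sketch below assumes a constant gravitational acceleration g0 for simplicity, whereas the description above also lets the gravitational acceleration vary along the path; launch angle, launch height, and time step are illustrative assumptions.

```python
import math

def flight_trajectory(v0, g0, angle_deg=0.0, height=1.6, dt=0.02):
    """Sample the collision-free flight path of the prop object.

    v0 and g0 come from the charge level (operation attribute information);
    all other parameters are illustrative assumptions.
    """
    angle = math.radians(angle_deg)
    x, y = 0.0, height
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    points = [(x, y)]
    while y > 0.0:                 # fly until the prop object lands
        x += vx * dt
        vy -= g0 * dt              # gravity pulls the vertical velocity down
        y += vy * dt
        points.append((x, y))
    return points                  # the last point approximates the landing position
```

  • With this sketch, a larger v0 and a smaller g0 (longer press) produce a longer collision-free flight distance, consistent with the correlation described for FIG. 6.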
  • acquiring shooting parameters that match the operational attribute information includes at least one of the following:
  • When the operation attribute information is the operation duration, the shooting parameters matching the operation duration are obtained, where the operation duration is positively correlated with the initial shooting velocity and negatively correlated with the initial gravitational acceleration, and the operation duration includes any one of the pressing duration of a pressing operation, the gesture holding duration of a gesture operation, and the holding duration of a gesture-triggered operation;
  • When the operation attribute information is the pressing pressure of a pressing operation, the shooting parameters matching the pressing pressure are obtained, where the pressing pressure is positively correlated with the initial shooting velocity and negatively correlated with the initial gravitational acceleration;
  • When the operation attribute information is the sliding distance of a sliding operation, the shooting parameters matching the sliding distance are obtained, where the sliding distance obtained by a sliding operation performed in the first sliding direction is positively correlated with the initial shooting velocity and negatively correlated with the initial gravitational acceleration, and the sliding distance obtained by a sliding operation performed in the second sliding direction is negatively correlated with the initial shooting velocity and positively correlated with the initial gravitational acceleration;
  • When the operation attribute information is the gesture movement distance of a gesture operation, the shooting parameters matching the gesture movement distance are obtained, where the gesture movement distance obtained by a gesture operation performed in the first gesture direction is positively correlated with the initial shooting velocity and negatively correlated with the initial gravitational acceleration, and the gesture movement distance obtained by a gesture operation performed in the second gesture direction is negatively correlated with the initial shooting velocity and positively correlated with the initial gravitational acceleration.
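  • The correlations listed above can be expressed as a simple parameter mapping. The sketch below handles only the operation-duration case with an assumed linear interpolation and illustrative value ranges; sliding and gesture inputs could be mapped analogously, with the sign flipped for the second direction.

```python
def shooting_params_from_duration(press_duration, full_charge_time=2.0,
                                  v_min=20.0, v_max=60.0,
                                  g_max=9.8, g_min=4.0):
    """Map the operation duration to (initial shooting velocity, initial gravitational acceleration).

    Per the description, duration is positively correlated with the initial velocity
    and negatively correlated with the initial gravity; the ranges and the linear
    interpolation here are assumptions.
    """
    charge = min(press_duration / full_charge_time, 1.0)
    v0 = v_min + (v_max - v_min) * charge        # longer press -> faster shot
    g0 = g_max - (g_max - g_min) * charge        # longer press -> flatter, slower-dropping shot
    return v0, g0
```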
  • The above shooting parameters may, but are not limited to, be correlated with the operation attribute information of the trigger operation. The shooting parameters can be adjusted unidirectionally; for example, as the operation duration t (such as the pressing duration) increases, the initial shooting velocity v may become larger and larger as shown in FIG. 7(a), and the initial gravitational acceleration g may become smaller and smaller as shown in FIG. 7(b).
  • the graphs shown here are trends and do not impose any limitations on specific values.
  • the shooting parameters can be adjusted in both directions, for example, the adjustment direction is determined according to the sliding direction or the gesture direction, and the adjustment scale is further determined according to the sliding distance and the gesture moving distance. For example, slide to the left to decrease the initial speed, and slide to the right to increase the initial speed.
  • the initial gravitational acceleration can be correlated with the sliding distance. For example, the longer the sliding distance, the smaller the initial gravitational acceleration.
  • The adjustment of the initial gravitational acceleration is not unlimited; the reference basis for the adjustment is to keep the simulated falling trajectory of the prop object realistic.
  • the shooting operation matching the operation attribute information can also be obtained.
  • the pre-aiming state of the target shooting props can be adjusted directly through the triggering operation, so that the aiming accuracy of the target shooting props can be accurately adjusted during the triggering process, so as to improve the accuracy of using the target shooting props.
  • the method further includes:
  • When the target object is within the attack range of the target shooting prop and the prop object shot by the target shooting prop collides with the target object, the respective health values of all objects within the target area where the target object is located are adjusted, where the adjusted health value is less than the health value before adjustment, and the target area is the area obtained by taking the location of the target object as the center and the target distance as the radius.
  • For example, the target object 802 is aimed at after the target shooting prop (such as a powerful crossbow) is adjusted by pre-aiming, as shown in (a) of FIG. 8. The distance between the object 802 and the position of the virtual character currently using the target shooting prop is then determined. If this distance is smaller than the attack range of the target shooting prop, the prop object (such as an arrow) shot by the target shooting prop (such as a powerful crossbow) can reach the location of the object 802 and collide with it. The explosive carried on the prop object then explodes, and the rendered explosion effect can be as shown in (b) of FIG. 8; damage is dealt to all objects within the target area (such as the area 803), reducing their health values.
  • That is, when the target object is located within the attack range of the target shooting prop and the prop object shot by the target shooting prop collides with the target object, the prop object is controlled to explode and cause damage after the collision, and the health values of all objects within the target area where the target object is located are adjusted. In other words, using the target shooting prop provided in this embodiment achieves long-range shooting of objects within the area where the target object is located without complex prop equipping operations, thereby improving the efficiency of prop control.
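  • The area effect described above amounts to a radius check around the explosion center. The following sketch assumes objects expose position and health attributes; the damage value itself is illustrative.

```python
import math

def apply_area_damage(objects, center, target_distance, base_damage):
    """Reduce the health of every object within `target_distance` of the explosion center.

    `objects` is assumed to be an iterable of items with `position` (a coordinate tuple)
    and `health` attributes; these names are assumptions for illustration only.
    """
    for obj in objects:
        dist = math.dist(obj.position, center)
        if dist <= target_distance:              # inside the target area range
            obj.health = max(obj.health - base_damage, 0)
```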
  • the method further includes:
  • When the target object is not within the attack range of the target shooting prop and the prop object shot by the target shooting prop collides with a first reference object, the respective health values of all objects within the first reference area where the first reference object is located are adjusted, where the adjusted health value is less than the health value before adjustment, the direction of the first reference object relative to the target shooting prop is the same as the direction of the target object relative to the target shooting prop, and the first reference area is the area obtained by taking the location of the first reference object as the center and the first reference distance as the radius.
  • After being shot, the prop object is no longer controlled by the target shooting prop. If the prop object collides with a newly encountered reference object on its flight path, it acts directly on that reference object, instead of on the target area where the target object is located. If the prop object does not encounter any reference object along the flight trajectory, it acts directly on the area around the landing position after it lands.
  • the above-mentioned first reference object may include, but is not limited to: virtual character objects controlled by other players in a virtual battle scene (such as a virtual battle scene provided by a shooting mission), and a non-player character (NPC) in the virtual battle scene Objects, stationary objects in virtual battle scenes (such as buildings, vehicles, trees, etc.), etc. That is to say, the first reference object may be any static object or dynamic object in the virtual battle scene provided by the shooting task, which is not limited in this embodiment of the present application.
  • For example, the target object 802 is aimed at after the target shooting prop (such as a powerful crossbow) is adjusted by pre-aiming, as shown in (a) of FIG. 9. The distance between the object 802 and the position of the virtual character currently using the target shooting prop is then determined. If this distance is greater than the attack range of the target shooting prop, the prop object (such as an arrow) shot by the target shooting prop (such as a powerful crossbow) cannot reach the location of the object 802.
  • If the prop object collides with a first reference object on its flight path, the health values of all objects within the first reference area are adjusted. As shown in (b) of FIG. 9, the first reference area includes at least the object 904, so the object 904 takes life damage, that is, its health value is reduced.
  • If the distance between the above object 802 and the position of the virtual character currently using the target shooting prop is greater than the attack range of the target shooting prop and the prop object shot by the target shooting prop does not collide with any object, then when the prop object lands, it causes life damage to the objects within the area around the landing position, reducing their health values.
  • For the specific process, refer to the above example.
  • That is, when the target object is not located within the attack range of the target shooting prop, it can be detected whether the prop object shot by the target shooting prop collides with a reference object other than the target object, or whether it lands, and the prop object is controlled to explode according to the detection result, thereby affecting the health values of the objects within the corresponding area.
  • adjusting the respective health values of all objects includes:
  • The adjustment range of the health value of each object is determined according to the distance from that object to the center of the area, so that objects at different positions within the area take different amounts of life damage, achieving a realistic simulation.
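  • One way to realize the distance-dependent adjustment range is a falloff curve. The linear falloff below is only an assumption; the description states only that the adjustment depends on the distance to the center. It could replace the flat base_damage used in the earlier area-damage sketch.

```python
def damage_at_distance(distance, radius, max_damage):
    """Return the health reduction for an object at `distance` from the explosion center.

    Closer objects take more damage; objects outside the radius take none.
    The linear curve is an illustrative assumption.
    """
    if distance > radius:
        return 0.0
    return max_damage * (1.0 - distance / radius)
```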
  • performing multiple consecutive shooting actions to the target area where the target object is located by using the target shooting props includes:
  • When it is detected that the aiming direction of the target shooting prop has not changed within the target time period, the target shooting prop is used to perform multiple consecutive shooting actions toward the target area where the target object is located.
  • After the target shooting prop is triggered, it continuously performs multiple shooting actions within the target time period, and there is no need to perform an additional prop equipping operation before each shooting action, which simplifies the operation and increases the number of shots, thereby improving shooting control efficiency.
  • When the aiming direction does not change, the target object aimed at by the target shooting prop does not change, and correspondingly, multiple consecutive shooting actions are performed within the unchanged target area where the target object is located. It should be noted that there is a certain time interval between the consecutive shooting actions of the target shooting prop, so if the virtual character currently using the target shooting prop turns around during this period, the target object aimed at by the corresponding consecutive shooting actions also changes accordingly. Thus, without re-equipping such a complex target shooting prop, long-range and large-area attacks can be made on target objects in different directions, which broadens the application range of the shooting prop, shortens the duration of the shooting task, and increases the probability of winning the shooting task.
  • For example, the target object aimed at by the first shooting action of the target shooting prop is the object 1102 shown in (a) of FIG. 11; the target area is determined based on the object 1102, and the target shooting prop causes damage to objects within that target area. If the aiming direction is then adjusted, the aimed target object is also adjusted to the object 1104 shown in (b) of FIG. 11; the target area is re-determined based on the object 1104, and damage is dealt to objects within the updated target area.
  • In this way, the target object aimed at each time the target shooting prop performs a shooting action is updated, and the shooting action acts on the updated target object and the updated target area, which further extends the reach of the target shooting prop, which already has a large attack range, achieving long-range and large-area damage to objects in different directions.
  • the target shooting props are used to perform multiple consecutive shooting actions towards the target area where the target object is located, including:
  • The target object aimed at each time a shooting action is performed is updated, and the target shooting prop performs the shooting action toward the updated target area where the updated target object is located.
  • In some cases, the target object aimed at by the target shooting prop does not change while the target area where the target object is located changes, and multiple consecutive shooting actions are performed toward the changed target area.
  • In this way, the target object aimed at each time the target shooting prop performs a shooting action is updated, and the shooting action acts on the updated target object and the updated target area, which further extends the reach of the target shooting prop with its large attack range, achieving long-range and large-area attacks on different target objects during movement.
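  • The per-shot re-aiming can be sketched as a burst loop that resolves the aimed target object before every shot, so that a change of aiming direction or position between shots is picked up automatically. Names, shot count, and interval below are illustrative assumptions.

```python
import time

def fire_burst(prop, scene, shots=3, interval=0.25):
    """Fire a burst within the target time period, re-aiming before each shot.

    `scene.object_under_crosshair()` and `prop.shoot()` are assumed helpers; between
    consecutive shots the player may turn or move, so the target is re-resolved.
    """
    for _ in range(shots):
        target = scene.object_under_crosshair()   # re-aim: pick up any direction change
        prop.shoot(target.area())                 # one shooting action toward the updated area
        time.sleep(interval)                      # gap between adjacent shots (< first threshold)
```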
  • Before the trigger operation performed on the target shooting prop in the activated state in the virtual battle scene is acquired, the method further includes:
  • S2, in response to the prop configuration instruction, displaying a configuration interface for providing shooting props in the shooting task, where the configuration interface includes a target slot configured for the target shooting prop;
  • For example, the configuration interface 1202 shown in (a) of FIG. 12 is displayed, in which various shooting props such as crossbows and guns are shown. Further, assuming that a selection operation performed on the target slot 1204 is obtained, the shooting prop (such as the powerful crossbow) in the selected target slot 1204 is added to the shooting task, and its prop trigger icon is displayed in the operation area of the shooting task (such as the operation area in the virtual battle scene provided by the shooting task), as shown by the icon 1206 in (b) of FIG. 12.
  • Before the shooting task runs, the target shooting prop can be selected and configured through a target operation configured separately for it (the selection operation on the target slot), so that during the shooting task the target shooting prop can be quickly called through the operation area and controlled to perform multiple consecutive shooting actions. This saves the time of calling and loading the target shooting prop and improves the control efficiency of directly using the target shooting prop to perform multiple shooting actions in shooting tasks.
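  • The slot-based configuration can be sketched as a small mapping from slot identifiers to props that feeds the operation area. The slot id 1204 and the prop name below merely echo the example above and are otherwise assumptions.

```python
class PropConfig:
    """Configuration-interface sketch: selecting the target slot before the task runs
    adds the heavyweight prop to the operation area so it can be called directly in-game."""

    def __init__(self):
        self.slots = {1204: "powerful_crossbow"}   # target slot -> shooting prop (illustrative)
        self.operation_area = []                   # prop trigger icons shown in the HUD

    def select_slot(self, slot_id):
        prop = self.slots.get(slot_id)
        if prop is not None:
            self.operation_area.append(prop)       # e.g. icon 1206 appears in the operation area
        return prop
```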
  • the method further includes:
  • The initial state of the target shooting prop is the frozen state, in which it has not yet been activated. When the cooldown time expires, the target shooting prop is adjusted to release the frozen state and enter the activated state, and the target shooting prop in the activated state is allowed to be triggered and invoked to perform shooting actions.
  • After the target shooting prop completes a set of shooting actions (the number of shooting actions in a set can be set according to the actual application scenario), it re-enters the frozen state and cannot return to the activated state until the cooldown time expires.
  • the prop trigger icon of the above target shooting prop will also adjust the display state correspondingly, so as to intuitively remind the player of the current state of the target shooting prop.
  • The remaining duration of the cooldown time is also displayed correspondingly, for example, in the form of a digital countdown or a progress bar.
  • The display state of the prop trigger icon is adjusted at the same time, that is, from a first display state to a second display state, where the first display state is used to indicate that the target shooting prop is in the activated state, and the second display state is used to indicate that the target shooting prop is in the frozen state, in which the target shooting prop does not respond after receiving a trigger operation.
  • When the target shooting prop is in the frozen state, its prop trigger icon simultaneously displays the remaining time in this state through the progress bar 1302, as shown in (a) of FIG. 13. When the remaining time reaches zero, the target shooting prop is switched from the frozen state to the activated state, and the prop trigger icon is adjusted from the second display state to the first display state, as shown by the icon 1304 in (b) of FIG. 13.
  • The state of the target shooting prop is indicated by the display state of its prop trigger icon, so that during a tense shooting task the player can be intuitively prompted as to whether the target shooting prop can be called to perform a shooting action, thereby simplifying the control operation.
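  • The frozen/activated life cycle and the corresponding icon display states can be sketched as a small cooldown object; the cooldown length and the textual display states below are illustrative assumptions.

```python
class CooldownIcon:
    """Sketch of the frozen/activated display logic for the prop trigger icon."""

    def __init__(self, cooldown_total=30.0):
        self.cooldown_total = cooldown_total
        self.remaining = cooldown_total            # initial state: frozen, not yet activated

    def tick(self, dt):
        if self.remaining > 0.0:
            self.remaining = max(self.remaining - dt, 0.0)

    @property
    def activated(self):
        return self.remaining <= 0.0               # activated once the countdown hits zero

    def display_state(self):
        if self.activated:
            return "first display state"           # prop can be triggered
        fraction = self.remaining / self.cooldown_total
        return f"second display state (progress bar at {fraction:.0%})"

    def on_burst_finished(self):
        self.remaining = self.cooldown_total       # re-enter the frozen state after a set of shots
```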
  • The apparatus includes: an acquisition unit 1402 configured to acquire a trigger operation performed on a target shooting prop in an activated state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; a determining unit 1404 configured to determine, in response to the trigger operation, the target object aimed at by the target shooting prop; and a control unit 1406 configured to perform, within the target time period, multiple consecutive shooting actions with the target shooting prop toward the target area where the target object is located.
  • An embodiment of the present application further provides an electronic device for implementing the above prop control method, where the electronic device may be the terminal device or the server shown in FIG. 1. This embodiment is described by taking the electronic device as a terminal device as an example.
  • the electronic device includes a memory 1502 and a processor 1504, where a computer program is stored in the memory 1502, and the processor 1504 is configured to execute the prop control method provided by the embodiment of the present application through the computer program.
  • the aforementioned electronic device may be located at at least one network device among a plurality of network devices of a computer network.
  • The above processor may be configured to perform the following steps through the computer program: acquiring a trigger operation performed on a target shooting prop in an activated state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; determining, in response to the trigger operation, the target object aimed at by the target shooting prop; and performing, within the target time period, multiple consecutive shooting actions with the target shooting prop toward the target area where the target object is located.
  • FIG. 15 is for illustration only, and the electronic device can also be a smart phone (such as an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, and terminal devices such as MID and PAD.
  • FIG. 15 does not limit the structure of the above-mentioned electronic device.
  • the electronic device may also include more or fewer components than those shown in FIG. 15 (eg, network interfaces, etc.), or have a different configuration than that shown in FIG. 15 .
  • the memory 1502 may be used to store software programs and modules, such as program instructions/modules corresponding to the prop control method and device in the embodiments of the present application.
  • The processor 1504 executes various functional applications and data processing by running the software programs and modules stored in the memory 1502, that is, implements the above prop control method.
  • Memory 1502 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 1502 may further include memory located remotely from the processor 1504, and these remote memories may be connected to the terminal through a network.
  • the memory 1502 may be used, but is not limited to being used, to store information such as the object attribute information of virtual objects and the screen resources (battle resources) corresponding to shooting actions.
  • the above-mentioned memory 1502 may include, but is not limited to, the acquisition unit 1402, the determining unit 1404, and the control unit 1406 of the above-mentioned prop control apparatus. In addition, it may further include, but is not limited to, other module units of the above-mentioned prop control apparatus.
  • the transmission device 1506 described above is used to receive or transmit data via a network.
  • Examples of the above-mentioned networks may include wired networks and wireless networks.
  • the transmission device 1506 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices and routers through a network cable to communicate with the Internet or a local area network.
  • the transmission device 1506 is a radio frequency (Radio Frequency, RF) module, which is used for wirelessly communicating with the Internet.
  • the above-mentioned electronic device further includes: a display 1508, configured to display the above-mentioned virtual battle scene, the virtual objects therein, and the generated pictures of the shooting process; and a connection bus 1510, configured to connect the module components in the above-mentioned electronic device.
  • the above-mentioned terminal device or server may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by connecting the multiple nodes through network communication.
  • a peer-to-peer (P2P) network may be formed between the nodes, and any form of computing device, such as a server, a terminal, or another electronic device, may join the peer-to-peer network to become a node in the blockchain system.
  • Embodiments of the present application provide a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the above-mentioned prop control method.
  • the above-mentioned computer-readable storage medium may be configured to store a computer program for performing the following steps: acquiring a trigger operation performed on a target shooting prop that is in an active state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; determining, in response to the trigger operation, the target object aimed at by the target shooting prop; and performing, within a target time period, multiple consecutive shooting actions toward the target area in which the target object is located by using the target shooting prop.
  • the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
  • if the integrated units in the above-mentioned embodiments are implemented in the form of software functional units and are sold or used as independent products, they may be stored in the above-mentioned computer-readable storage medium.
  • based on this understanding, the essence of the technical solution of the present application, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions that cause one or more computer devices (which may be personal computers, servers, network devices, or the like) to execute all or part of the steps of the methods in the embodiments of the present application.
  • the disclosed clients may be implemented in other manners.
  • the device embodiments described above are only illustrative; for example, the division of units is only a division by logical function, and in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, or through indirect coupling or communication connection between units or modules, and may be in electrical or other forms.
  • Units described as separate components may or may not be physically separated, and components shown as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
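
The following is a minimal, illustrative sketch (in Python, not part of the patent) of how the acquisition / determining / control unit split (1402 / 1404 / 1406) and the step sequence described above could be organized. All helper names (poll_trigger, object_under_crosshair, shoot_at_area, is_active) and the numeric defaults are assumptions introduced here for illustration only.

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class TriggerOperation:
    kind: str           # e.g. "press", "slide", "gesture"
    duration_s: float   # how long the operation was held (its operation attribute)

class PropControlApparatus:
    """Sketch of units 1402 (acquisition), 1404 (determining) and 1406 (control)."""

    def __init__(self, scene, prop):
        self.scene = scene   # the virtual battle scene (assumed object)
        self.prop = prop     # the target shooting prop (assumed object)

    # acquisition unit 1402: acquire the trigger operation on the active prop
    def acquire(self) -> Optional[TriggerOperation]:
        if not self.prop.is_active():          # a frozen prop does not respond
            return None
        return self.scene.poll_trigger(self.prop)

    # determining unit 1404: resolve the object aimed at by the prop
    def determine_target(self, op: TriggerOperation):
        return self.scene.object_under_crosshair(self.prop, op)

    # control unit 1406: consecutive shots toward the target area within the period
    def control(self, target, period_s: float = 3.0, interval_s: float = 0.5):
        deadline = time.monotonic() + period_s
        while time.monotonic() < deadline:
            self.prop.shoot_at_area(target.area())
            time.sleep(interval_s)
```

Under these assumptions, a single trigger operation results in a whole burst of shots rather than one shot per trigger, which is the simplification of the control operation described above.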

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A prop control method and apparatus, a storage medium, and an electronic device. The method includes: acquiring a trigger operation performed on a target shooting prop that is in an active state in a virtual battle scene, where the attack range of the target shooting prop covers a partial area of the virtual battle scene; determining, in response to the trigger operation, the target object (802) aimed at by the target shooting prop; and performing, within a target time period, multiple consecutive shooting actions toward the target area (803) in which the target object (802) is located by using the target shooting prop.

Description

道具控制方法和装置、存储介质及电子设备
相关申请的交叉引用
本申请基于申请号为202010664321.7、申请日为2020年07月10日的中国专利申请提出,并要求该中国专利申请的优先权,该中国专利申请的全部内容在此引入本申请作为参考。
技术领域
本申请涉及计算机领域,具体而言,涉及一种道具控制方法和装置、存储介质及电子设备。
背景技术
在射击类的虚拟对战场景中,往往需要玩家通过控制射击道具来射击场景中的目标对象,来获得当前对战的胜利。例如,射击道具可以包括轻量级装备道具(如刀具或手枪等)、重量级装备道具(如炸弹、迫击炮等)。
然而,针对虚拟对战场景中的射击道具,每次触发操作只能触发一次射击动作,从而使得玩家在使用该射击道具执行射击动作时需要频繁触发,另外,在每次使用该射击道具前,装备该射击道具的操作也可能较为复杂,导致控制该射击道具执行射击动作的控制效率较低。
针对上述的问题,目前尚未提出有效的解决方案。
发明内容
本申请实施例提供了一种道具控制方法,包括:获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,所述目标射击道具的攻击范围覆盖所述虚拟对战场景中的部分区域范围;响应于所述触发操作,确定所述目标射击道具所瞄准的目标对象;在目标时间段内,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作。
本申请实施例提供了一种道具控制装置,包括:获取单元,配置为获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,所述目标射击道具的攻击范围覆盖所述虚拟对战场景中的部分区域范围;确定单元,配置为响应于所述触发操作,确定所述目标射击道具所瞄准的目标对象;控制单元,配置为在目标时间段内,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作。
本申请实施例提供了一种计算机可读的存储介质,该计算机可读的存储介质中存储有计算机程序,其中,该计算机程序用于在运行时执行上述道具控制方法。
本申请实施例提供了一种电子设备,包括存储器和处理器,上述存储器中存储有计算机程序,上述处理器用于通过所述计算机程序执行上述的道具控制方法。
附图说明
此处所说明的附图用来提供对本申请的进一步理解,构成本申请的一部分,本申请的示意性实施例及其说明用于解释本申请,并不构成对本申请的不当限定。在附图中:
图1是本申请实施例的道具控制系统的示意图;
图2是本申请实施例的道具控制方法的示意图;
图3是本申请实施例的调整准心的尺寸大小的示意图;
图4是本申请实施例的道具控制方法的示意图;
图5是本申请实施例的姿态触发操作的示意图;
图6是本申请实施例的射击飞行轨迹的示意图;
图7是本申请实施例的操作时长与射击参数之间关系的示意图;
图8是本申请实施例的在目标区域范围产生爆炸的示意图;
图9是本申请实施例的与第一参考对象发生碰撞的示意图;
图10是本申请实施例的距离与生命值的变化幅度之间关系的示意图;
图11是本申请实施例的调整瞄准方向的示意图;
图12是本申请实施例的道具配置的示意图;
图13是本申请实施例的调整道具触发图标的显示状态的示意图;
图14是本申请实施例的道具控制装置的结构示意图;
图15是本申请实施例的电子设备的结构示意图。
具体实施方式
为了使本技术领域的人员更好地理解本申请方案,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分的实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都应当属于本申请保护的范围。
需要说明的是,本申请的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的本申请的实施例能够以除了在这里图示或描述的那些以外的顺序实施。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。在以下的描述中,所涉及的术语“多个”是指至少两个。
本申请实施例提供了一种道具控制方法,在一些实施例中,上述道具控制方法可以但不限于应用于如图1所示的环境中的道具控制系统中,其中,该道具控制系统可以包括但不限于终端设备102、网络104、服务器106。其中,终端设备102中运行有如图1所示射击应用客户端,如射击游戏应用客户端。上述终端设备102中包括人机交互屏幕1022,处理器1024及存储器1026。人机交互屏幕1022用于呈现上述射击应用客户端所运行的射击任务所提供的虚拟对战场景中的场景画面,其中,该射击任务为在虚拟对战场景多个虚拟对象之间实现的对抗任务;还用于提供人机交互接口来获取对上述射击应用客户端的人机交互界面执行的人机交互操作(如触发操作)。处理器1024用于响应上述触发操作,生成对应的操作指令,并按照该操作指令控制虚拟对战场景中的虚拟对象执行对应的射击动作。存储器1026用于存储上述射击任务所提供的虚拟对战场景中的场景画面及该虚拟对战场景中虚拟对象的属性信息,如对象标识、对象已装备的道具信息及对象生命值等信息。
此外,服务器106中包括数据库1062及处理引擎1064,数据库1062中用于存储上述虚拟对象在射击任务中产生的对战结果,还用于为上述客户端提供相应的对战资源, 如道具的属性信息及射击渲染效果的画面资源等。处理引擎1064用于确定当前对战结果及客户端所需的对战资源,并发送给终端设备102中的射击应用客户端。
示例过程如以下步骤:如步骤S102-S104,在终端设备102中运行的一局射击任务(如图1所示的虚拟对象10与持有目标射击道具11的虚拟对象之间的射击对战任务)的过程中,如步骤S102,通过人机交互屏幕1022获取对处于激活状态的目标射击道具执行的触发操作,然后如步骤S104-S106,响应于该触发操作,确定目标射击道具所瞄准的目标对象;并在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作,其中,相邻两次射击动作的时间间隔可以小于第一阈值。然后如步骤S108,将上述过程的对战结果,通过网络104发送给服务器106。
当服务器106接收到上述对战结果时,执行步骤S110-S114:保存对战结果,并获取对战资源,将对战资源发送给终端设备102,以使得终端设备102在人机交互屏幕1022中渲染显示。
在一些实施例中,终端设备102也可以将对战结果存储至本地,并根据本地存储的对战资源在人机交互屏幕1022中渲染显示,即本申请实施例提供的道具控制方法可以由终端设备102单独实现,例如终端设备102可以运行离线版本(单机版本)的射击应用客户端。
需要说明的是,在本实施例中,获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,该目标射击道具的攻击范围覆盖虚拟对战场景中的部分区域范围;响应于该触发操作,确定目标射击道具所瞄准的目标对象;在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作。如此,使得目标射击道具在游戏过程中的装备操作被简化,增加目标射击道具的使用次数,从而达到提高控制效率的目的,进而克服相关技术中玩家(用户)在使用该射击道具执行射击动作时需要准备较长时间所导致的控制效率较低的问题。
在一些实施例中,上述终端设备可以是配置有射击应用客户端的终端设备,可以包括但不限于以下至少之一:手机(如Android手机、iOS手机等)、笔记本电脑、平板电脑(Portable Android Device,PAD)、掌上电脑、移动互联网设备(Mobile Internet Devices,MID)、台式电脑、智能电视等。上述网络可以包括但不限于有线网络及无线网络中的至少之一,其中,该有线网络包括局域网、城域网和广域网中的至少之一,该无线网络包括蓝牙、WIFI及其他实现无线通信的网络中的至少之一。上述服务器可以是单一服务器,也可以是由多个服务器组成的服务器集群,或者是云服务器。上述仅是一种示例,本实施例中对此不做任何限定。
在一些实施例中,如图2所示,上述道具控制方法包括:
S202,获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,目标射击道具的攻击范围覆盖虚拟对战场景中的部分区域范围;
S204,响应于触发操作,确定目标射击道具所瞄准的目标对象;
S206,在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作。
在一些实施例中,上述道具控制方法可以但不限于应用于射击应用中。在运行射击应用的一局射击任务的过程中,在该射击任务配置有目标射击道具的情况下,在获取到对处于激活状态的目标射击道具执行的触发操作后,响应于该触发操作,确定所瞄准的目标对象,并在目标时间段内对该目标对象所在目标区域范围执行连续多次射击动作,以使得目标射击道具在游戏过程中的装备操作被简化,增加目标射击道具的使用次数,从而达到提高控制效率的目的,进而克服相关技术中玩家在使用该射击道具执行射击动作时需要准备较长时间所导致的控制效率较低的问题。
在一些实施例中,上述的射击应用可以是军事仿真应用(用于运行军事仿真任务),也可以是射击游戏应用,例如可以为多人在线战术竞技游戏(Multiplayer Online Battle Arena,MOBA)应用,或者还可以为单人游戏(Single-Player Game,SPG)应用。上述射击游戏应用的类型可以包括但不限于以下至少之一:二维(Two Dimension,2D)游戏应用、三维(Three Dimension,3D)游戏应用、虚拟现实(Virtual Reality,VR)游戏应用、增强现实(Augmented Reality,AR)游戏应用、混合现实(Mixed Reality,MR)游戏应用。以上只是一种示例,本实施例对此不做任何限定。
此外,上述射击游戏应用可以为第三人称射击游戏(Third Person Shooting Game,TPS)应用,如除当前玩家所控制的虚拟角色(虚拟对象)之外的第三方角色对象的视角(即第三人称视角)来运行该射击游戏应用,还可以为第一人称射击游戏(First Person Shooting Game,FPS)应用,如以当前玩家所控制的虚拟角色的视角(即第一人称视角)来运行该射击游戏应用。
玩家通过各个射击应用客户端可以实现以下控制:控制虚拟角色(也可称作玩家角色)执行指定动作、控制与非玩家角色(Non-Player Character,NPC)执行交互动作、控制与虚拟对战场景中的静态对象(如建筑、树木等)执行交互动作、控制虚拟角色使用虚拟对战场景中为其装备的道具、载具。其中,这里的道具可以包括但不限于为重量级射击道具,重量级射击道具进行一次装备所需要的操作较复杂且用时较长,但该重量级射击道具的攻击范围较之轻量级装备道具更大。上述仅为示例,本实施例中对此不做任何限定。
在一些实施例中,上述触发操作可以包括但不限于以下至少之一:触屏点击操作、按压操作、触屏滑动操作、手势操作、姿态指示操作(姿态触发操作)、瞳孔锁定操作等人机交互操作。例如,以体感游戏应用中的射击任务为例,可以但不限于预先根据一个指定的手势操作或指定的姿态设定目标射击道具的触发操作;然后在任务运行过程中,在目标射击道具处于激活状态、且通过摄像头检测到玩家展示出指定的手势操作或指定的姿态时,则确定触发目标射击道具的瞄准射击过程。又例如,以触屏游戏应用中的射击任务为例,可以但不限于预先根据人机交互屏幕中所显示的一个触屏按键设定目标射击道具的触发操作;然后在任务运行过程中,在目标射击道具处于激活状态、且检测到对触屏按键的触屏点击操作时,则确定触发目标射击道具的瞄准射击过程。上述为示例,本实施例中对此不做任何限定。
需要说明的是,在一局射击任务的运行过程中,玩家所控制的当前虚拟角色可以但不限于装备有两种类型的射击道具:1)常备射击道具(允许手持的轻量级射击道具);2)需加载的射击道具(允许手持的轻量级射击道具和/或重量级射击道具)。
其中,上述常备射击道具(例如“主武器装备”或“副武器装备”)对应的触发提示图标,可以但不限于在射击任务运行的过程中直接显示在人机交互界面的操作区域中,无需额外的道具加载界面,以便于玩家可以直接控制当前虚拟角色使用该类射击道具执行射击动作。对于玩家来说无需额外的装备道具的操作,对于当前虚拟角色来说无需执行额外的武器装备动作,从而达到简化道具控制操作的效果。
而上述需加载的射击道具会被存放在当前虚拟角色的道具背包中,需要通过额外的道具加载界面(配置界面)来将该类射击道具添加到操作区域中,如此,玩家才可以控制当前虚拟角色使用该类射击道具执行射击动作。其中,对于重量级射击道具,由于体量较大,因而每次装备到当前虚拟角色上或更换炮弹时都需要占用一定时间来准备;此外,这里的重量级射击道具还设置有使用冷却时间(Cold Time,CD),也就是说,每次使用完该类射击道具后,该类射击道具都将在冷却时间内处于冷冻状态,该冷冻状态用于指示上述射击道具在检测到触发操作时将无响应,即无法被调用以执行射击动作。
在本实施例中,上述目标射击道具可以但不限于是上述需加载的射击道具中的重量级射击道具(也可称作大招武器)。这里的目标射击道具可以为重量级的弓弩道具、枪炮道具等。例如可以是重量级的强力弓弩,该强力弓弩具有远程精准射击的功能,同时在该强力弓弩的箭头上绑定有杀伤力强的炸药,在命中后还具有大范围伤害的功能。这样在获取到对上述目标射击道具执行的触发操作时,响应于该触发操作,确定出所瞄准的目标对象,并通过控制该目标射击道具实现在目标时间段的连续射击动作,从而提高攻击范围较大的这类重量级射击道具的射击效率,使其得到更高效地控制应用。
在一些实施例中,上述目标射击道具在响应触发操作完成连续多次的射击动作之后,将进入冷冻状态。其中,该冷冻状态对应的使用冷却时间CD可以但不限于根据不同的道具属性需求设置为不同的取值,本实施例中对该数值不做任何限定。此外,可以但不限于通过倒计时提示的方式,来提示控制虚拟角色的玩家该目标射击道具退出冷冻状态的剩余时长。在一些实施例中,倒计时提示的方式可以包括但不限于以下之一:读秒倒计时、进度条倒计时等。这里为示例,本实施例中对此不做任何限定。
在一些实施例中,在确定目标射击道具所瞄准的目标对象时可以但不限于:根据触发操作的操作属性信息来确定准心的尺寸大小,其中,该准心的尺寸大小与目标射击道具射出的道具对象在无碰撞的情况下的飞行距离为负相关。如,该准心显示的尺寸大小越小,表示上述道具对象在无碰撞的情况下的飞行距离越远,命中准确率更高。
也就是说,在本实施例中,上述触发操作的操作属性信息可以作为调整准心的尺寸大小(也是瞄准距离)的参考依据。这里操作属性信息可以包括但不限于:触屏按压操作的按压时长、触屏按压操作的按压力度(按压压力)、触屏滑动操作的滑动距离、手势操作的手持保持时长、姿态触发操作的保持时长等。将操作属性信息作为调整准心的参考条件,例如,触屏按压操作的按压时长越长,则准心的尺寸大小越小。又例如,在触屏滑动操作向第一方向滑动的情况下,滑动距离越长,则准心的尺寸大小越小;而在触屏滑动操作向第二方向滑动的情况下,滑动距离越长,则准心的尺寸大小越大。
例如,以触屏按压操作的按压时长来调整准心的尺寸为例,在触屏按压操作刚被触发第0秒时,目标射击道具的准心302的尺寸可以如图3中的(a)所示,在触屏按压操作被触发第2秒时,目标射击道具的准心302的尺寸可以如图3中的(b)所示。也就是说,随着触屏按压操作的按压时长越长,即蓄力时间越长,则使用目标射击道具的预瞄准的准心的尺寸大小越小。上述为示例,本实施例中对此不做任何限定。
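A minimal sketch (not from the patent) of the negative correlation just described between the press (charge) duration and the crosshair size; the base size, minimum size, and shrink rate are assumed values.

```python
def crosshair_size(press_duration_s: float,
                   base_size: float = 100.0,
                   min_size: float = 20.0,
                   shrink_per_s: float = 40.0) -> float:
    """The longer the press is held, the smaller (more precise) the crosshair."""
    return max(min_size, base_size - shrink_per_s * press_duration_s)

# e.g. crosshair_size(0.0) -> 100.0 (cf. FIG. 3(a)), crosshair_size(2.0) -> 20.0 (cf. FIG. 3(b))
```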
以射击任务为射击游戏任务的情况举例,结合图4所示进行说明。如步骤S402,开始运行一局射击游戏任务;然后如步骤S404,检测目标射击道具的冷却时间是否结束;当冷却时间已结束时,如步骤S406,确定目标射击道具进入激活状态;接着如步骤S408,检测是否获取到对上述目标射击道具的触发操作;若获取到触发操作,则执行步骤S410,否则返回步骤S406。
在获取到触发操作时,执行步骤S410-S412,即调用目标射击道具,并检测开火键(射击动作触发键)是否被按下,若按下,则执行步骤S414,否则返回步骤S410。若检测到开火键被按下,则如步骤S414-S416,控制该目标射击道具进入预瞄准状态,并检测开火键是否被持续按压,若确定开火键是被持续按压,则执行步骤S418,根据按压时长控制准心的尺寸变小。
进一步如步骤S420,检测开火键是否被释放,如果被释放则执行步骤S422,否则返回步骤S418。若开火键被释放,则执行步骤S422,使用目标射击道具执行连续多次的射击动作。然后如步骤S424-S426,检测目标射击道具射出的道具对象(如枪弹或弓箭等)是否与虚拟对象场景(游戏场景)中设置的对象(虚拟对象)发生碰撞,若发生碰撞,则控制道具对象爆炸。然后,执行步骤S428-S430,检测爆炸范围内是否有目标 对象,如果确定有目标对象,则控制这些目标对象受到伤害,即这些目标对象的生命值将会对应下降。
上述图4所示为示例,用于阐述本申请实施例中的一种实现方式,本实施例中对图4中所示步骤顺序及所执行的手段方式不做任何限定。
在一些实施例中,响应于触发操作,确定目标射击道具所瞄准的目标对象包括:
S1,响应于触发操作,获取触发操作的操作属性信息;
S2,将当前所显示的目标射击道具在瞄准方向上的准心,调整为与操作属性信息相匹配的目标尺寸大小;
S3,将准心选中的对象确定为目标对象。
在一些实施例中,上述触发操作可以但不限于以下至少之一:触屏操作、手势操作、姿势触发操作等。其中,上述触屏操作可以包括但不限于以下至少之一:按压操作、滑动操作等。需要说明的是,这里的操作属性信息可以为触发操作的操作时长、触发操作的操作方向、触发操作的操作频率等属性信息。也就是说,对准心的尺寸大小可以单向调整,如随着时长增加,准心的尺寸越来越小。此外,对准心的尺寸大小可以双向调整,如根据滑动方向或手势方向确定调整方向,进一步根据滑动距离和手势移动距离来确定调整的尺度。如向左滑动调小准心,向右滑动调大准心。这里为示例,本实施例中不做任何限定。
在一些实施例中,将当前所显示的目标射击道具在瞄准方向上的准心,调整为与操作属性信息相匹配的目标尺寸大小的过程,可以包括以下至少一种情况:
1)当操作属性信息为操作时长时,将当前所显示的目标射击道具在瞄准方向上的准心,调整为与操作时长相匹配的目标尺寸大小,其中,操作时长与目标尺寸大小为负相关,操作时长包括按压操作的按压时长、手势操作的手势保持时长、以及姿态触发操作的保持时长中的任意一种;
2)当操作属性信息为滑动操作的滑动距离时,将当前所显示的目标射击道具在瞄准方向上的准心,调整为与滑动距离相匹配的目标尺寸大小,其中,以第一滑动方向(例如向右的方向)执行的滑动操作得到的滑动距离与目标尺寸大小为正相关;以第二滑动方向(例如向左的方向)执行的滑动操作得到的滑动距离与目标尺寸大小为负相关;
3)当操作属性信息为手势操作的手势移动距离时,将当前所显示的目标射击道具在瞄准方向上的准心,调整为与手势移动距离相匹配的目标尺寸大小,其中,以第一手势方向(例如向右的方向)执行的手势操作得到的手势移动距离与目标尺寸大小为正相关;以第二手势方向(例如向左的方向)执行的手势操作得到的手势移动距离与目标尺寸大小为负相关。
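A minimal sketch (an assumption, not the patent's implementation) of the bidirectional adjustment in cases 2) and 3) above: a slide or gesture in the first direction enlarges the crosshair, and one in the second direction shrinks it; the scale factor and size bounds are illustrative.

```python
def adjust_crosshair_by_slide(current_size: float, signed_distance: float,
                              scale: float = 0.2,
                              min_size: float = 20.0, max_size: float = 120.0) -> float:
    """signed_distance > 0: slide/gesture in the first direction (e.g. to the right),
    which enlarges the crosshair; signed_distance < 0: the second direction, which shrinks it."""
    return max(min_size, min(max_size, current_size + scale * signed_distance))
```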
需要说明的是,在本实施中,在使用目标射击道具执行连续多次射击动作的过程中,玩家可以通过射击应用客户端同时控制调整当前虚拟角色的调整观察方向(也就是目标射击道具的瞄准方向),还可以控制调整当前虚拟角色移动一段距离。在上述变化过程中,目标射击道具所瞄准的目标对象也将随之调整更新,以便于对更新后的目标对象进行快速精准地射击。
例如,结合图5所示示例进行说明:假设玩家50正在通过体感射击游戏应用运行一局射击游戏任务。通过摄像头500采集玩家50的图像,以识别玩家50当前的姿势动作,在该姿势动作与射击游戏任务中提示的指定姿势动作502一致的情况下,则确定获取到触发操作。进一步根据检测到的上述姿势动作的保持时长,来调整目标射击道具的准心。比如保持时长越长,则目标射击道具的准心的尺寸越小,瞄准精度越高,射击距离越远,命中目标对象的成功率越高。
通过本申请提供的实施例,根据触发操作的操作属性信息来调整目标射击道具在瞄 准方向上的准心,以实现对目标对象进行预瞄准调整,从而达到提高射击命中率的效果。
在一些实施例中,在将当前所显示的目标射击道具在瞄准方向上的准心,调整为与操作属性信息相匹配的目标尺寸大小的过程中,还包括:
S1,获取与操作属性信息相匹配的射击参数,其中,射击参数包括目标射击道具射出的道具对象的射击初速度和初始重力加速度;
S2,按照射击参数确定道具对象在瞄准方向上的射击飞行轨迹,其中,射击飞行轨迹在无碰撞的情况下的飞行距离与准心的目标尺寸大小为负相关。
在一些实施例中,上述目标射击道具射出的道具对象可以包括但不限于:弓弩射出的弓箭、枪炮射出的子弹或炸药等。需要说明的是,这里虚拟对战场景中配置的该道具对象将模拟真实世界中的物理运动,在射出后按照自由落体的轨迹飞行一段时间,直至落地或与其他参考对象发生碰撞。其中,在上述飞行过程中得到的飞行轨迹与上述道具对象在射出时的射击参数(即射击初速度和初始重力加速度)相关。
例如结合图6所示示例进行说明,仍以触屏按压操作的按压时长来调整准心的尺寸为例,在调整准心的尺寸的同时,还根据当前使用的目标射击道具的射击参数确定飞行轨迹(射击飞行轨迹)。假设触屏按压操作的按压时长为T,则该目标射击道具在无碰撞情况下(直接落地)产生的飞行轨迹可以如图6中的(a)所示,其中初始位置A的射击初速度为v0,初始重力加速度为g0。经过一段时间后到达位置B,射击速度会下降为v1,重力加速度也会下降到g1。假设触屏按压操作的按压时长为2T,则表示按压蓄力更久,则该目标射击道具在无碰撞情况下(直接落地)产生的飞行轨迹可以如图6中的(b)所示,其中初始位置C的射击初速度为v2,初始重力加速度为g2。经过一段时间后到达位置D,射击速度会下降为v3,重力加速度也会下降到g3。这里,由于图6中的(b)所示的触屏按压操作的按压时长大于图6中的(a)所示的触屏按压操作的按压时长,则图6中的(b)所示的射击初速度v2也大于图6中的(a)所示的射击初速度v0,而图6中的(b)所示的初始重力加速度g2小于或等于图6中的(a)所示的初始重力加速度g0。图6所示为参考示例,本实施例中对此不做任何限定。
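A minimal ballistic sketch (not from the patent) of the flight track in FIG. 6: the prop object leaves with an initial speed v0 and drops under an initial gravity g0 until it lands. For simplicity the gravity is kept constant here, whereas the description lets both the speed and the gravity decay along the track; all numbers and the launch geometry are assumptions.

```python
import math

def flight_distance(v0: float, g0: float,
                    launch_angle_deg: float = 0.0,
                    launch_height: float = 1.5,
                    dt: float = 0.01) -> float:
    """Horizontal distance flown before hitting the ground, assuming no collision."""
    vx = v0 * math.cos(math.radians(launch_angle_deg))
    vy = v0 * math.sin(math.radians(launch_angle_deg))
    x, y = 0.0, launch_height
    while y > 0.0:
        x += vx * dt
        y += vy * dt
        vy -= g0 * dt      # free-fall style drop of the prop object
    return x

# a longer press gives a larger v0 and a smaller g0, hence a longer flight_distance
```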
在一些实施例中,获取与操作属性信息相匹配的射击参数包括以下至少一种情况:
1)当操作属性信息为操作时长时,获取与操作时长相匹配的射击参数,其中,操作时长与射击初速度为正相关,操作时长与初始重力加速度为负相关,操作时长包括按压操作的按压时长、手势操作的手势保持时长、以及姿态触发操作的保持时长中的任意一种;
2)当操作属性信息为按压操作的按压压力时,获取与按压压力相匹配的射击参数,其中,按压压力与射击初速度为正相关,按压压力与初始重力加速度为负相关;
3)当操作属性信息为滑动操作的滑动距离时,获取与滑动距离相匹配的射击参数,其中,以第一滑动方向执行的滑动操作得到的滑动距离与射击初速度为正相关,以第一滑动方向执行的滑动操作得到的滑动距离与初始重力加速度为负相关;以第二滑动方向执行的滑动操作得到的滑动距离与射击初速度为负相关,以第二滑动方向执行的滑动操作得到的滑动距离与初始重力加速度为正相关;
4)当操作属性信息为手势操作的手势移动距离时,获取与手势移动距离相匹配的射击参数,其中,以第一手势方向执行的手势操作得到的手势移动距离与射击初速度为正相关,以第一手势方向执行的手势操作得到的手势移动距离与初始重力加速度为负相关;以第二手势方向执行的手势操作得到的手势移动距离与射击初速度为负相关,以第二手势方向执行的手势操作得到的手势移动距离与初始重力加速度为正相关。
需要说明的是,在本实施例中,上述射击参数可以但不限于与触发操作的操作属性信息具有相关性,其中射击参数可以单向调整,如随着操作时长t(如按压时长)增加, 射击初速度v可以如图7中的(a)所示越来越大,初始重力加速度g可以如图中的7(b)所示越来越小。这里图中所示为趋势,不对具体的数值造成任何限定。
此外,射击参数可以双向调整,如根据滑动方向或手势方向确定调整方向,进一步根据滑动距离和手势移动距离来确定调整的尺度。如向左滑动调小初速度,向右滑动调大初速度。这里初始重力加速度可以与滑动距离具有相关性,如滑动距离越长,初始重力加速度越小,但初始重力加速度的调整不是无限制的调整,以保证模拟物体真实坠落的轨迹为调整的参考依据。
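A minimal sketch (an assumed mapping, not the patent's formula) of case 1) above: the press duration is positively correlated with the initial shooting speed and negatively correlated with the initial gravity; the numeric ranges and the 2-second full charge are illustrative.

```python
def shooting_params(press_duration_s: float,
                    v_min: float = 20.0, v_max: float = 60.0,
                    g_max: float = 9.8, g_min: float = 4.0,
                    full_charge_s: float = 2.0) -> tuple[float, float]:
    """Returns (initial speed v0, initial gravity g0) for the charged shot."""
    t = min(press_duration_s, full_charge_s) / full_charge_s   # charge level in [0, 1]
    v0 = v_min + (v_max - v_min) * t       # longer press -> faster shot
    g0 = g_max - (g_max - g_min) * t       # longer press -> flatter drop (less gravity)
    return v0, g0
```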
通过本申请提供的实施例,获取与操作属性信息相匹配的准心的尺寸大小的同时,还可以获取与该操作属性信息相匹配的射击操作。从而实现通过触发操作来直接对目标射击道具的预瞄准状态进行调整,以使得在触发过程中就对目标射击道具的瞄准精度进行精准地调整,从而达到提高使用目标射击道具时的准确度。
在一些实施例中,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作之后,还包括:
当目标对象位于目标射击道具的攻击范围、且目标射击道具射出的道具对象与目标对象发生碰撞时,对目标对象所在目标区域范围内的全部对象各自的生命值进行调整,其中,调整后的生命值小于调整前的生命值,目标区域范围是以目标对象所在位置为中心、且以目标距离为半径得到的区域范围。
例如结合图8所示示例进行说明:假设目标射击道具(如强力弓弩)通过预瞄准调整后所瞄准的目标对象为对象802,如图8中的(a)所示。然后确定该对象802与当前使用目标射击道具的虚拟角色所在位置之间的距离,在该距离小于目标射击道具的攻击范围的情况下,则表示该目标射击道具(如强力弓弩)射出的道具对象(如弓箭)可以到达对象802所在位置,并与对象802发生碰撞。然后利用道具对象(如弓箭)上携带的炸药发生爆炸,渲染的爆炸效果可以如图8中的(b)所示,则将对位于上述对象802所在位置的区域范围内(如图中示出的区域范围803)的全部对象产生伤害,使其生命值下降。
通过本申请提供的实施例,在目标对象位于目标射击道具的攻击范围,且目标射击道具射出的道具对象与目标对象发生碰撞的情况下,则控制目标射击道具在碰撞后产生伤害作用,对目标对象所在目标区域范围内的全部对象各自的生命值进行调整。也就是说,使用本实施例中提供的目标射击道具将实现对目标对象所在区域范围内的对象达到远程射击的目的,而且无需复杂的道具装备操作,提高了对道具控制效率。
在一些实施例中,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作之后,还包括:
1)当目标对象并未位于目标射击道具的攻击范围、且目标射击道具射出的道具对象与目标射击道具的攻击范围内的第一参考对象发生碰撞时,对第一参考对象所在第一参考区域范围内的全部对象各自的生命值进行调整处理,其中,调整后的生命值小于调整前的生命值,第一参考对象相对目标射击道具的方向与目标对象相对目标射击道具的方向相同,第一参考区域范围是以第一参考对象所在位置为中心、且以第一参考距离为半径得到的区域范围;
2)当目标对象并未位于目标射击道具的攻击范围、且目标射击道具射出的道具对象与目标射击道具的攻击范围内的任意一个对象均未发生碰撞时,确定道具对象的落地位置,并将以落地位置中心、且以第二参考距离为半径的区域范围确定为第二参考区域范围,对第二参考区域范围内的全部对象各自的生命值进行调整处理,其中,调整后的生命值小于调整前的生命值。
需要说明的是,在目标射击道具射出道具对象之后,道具对象将不再受目标射击道 具的控制,如果该道具在飞行轨迹中与新遇到的参考对象发生碰撞,则直接作用在该参考对象的参考区域范围内,而不是目标对象所在的目标区域参考范围内。如果该道具在飞行轨迹中也并未遇到参考对象,则在落地后直接作用在落地位置所在区域范围内。
在一些实施例中,上述第一参考对象可以包括但不限于:虚拟对战场景(如射击任务提供的虚拟对战场景)中其他玩家控制的虚拟角色对象、虚拟对战场景中的非玩家角色(NPC)对象、虚拟对战场景中的静止对象(如建筑、车辆、树木等)等。也就是说,第一参考对象可以为射击任务所提供的虚拟对战场景中的任意静态对象或动态对象,本申请实施例对此不做任何限定。
例如结合图9所示示例进行说明:假设目标射击道具(如强力弓弩)通过预瞄准调整后所瞄准的目标对象为对象802,如图9中的(a)所示。然后确定该对象802与当前使用目标射击道具的虚拟角色所在位置之间的距离,在该距离大于目标射击道具的攻击范围的情况下,则表示该目标射击道具(如强力弓弩)射出的道具对象(如弓箭)无法到达对象802所在位置。
若进一步检测到目标射击道具射出的道具对象与目标射击道具的攻击范围内的第一参考对象(如图9中的(b)中的车辆902)发生碰撞的情况下,则对车辆902所在第一参考区域范围内的全部对象各自的生命值进行调整。如图9中的(b)所示第一参考区域范围内至少包括对象904,则对该对象904造成生命伤害,即其生命值将被减小。
此外,在本实施例中,若上述对象802与当前使用目标射击道具的虚拟角色所在位置之间的距离大于目标射击道具的攻击范围、且目标射击道具射出的道具对象并未与任何对象发生碰撞,则该道具对象落地时,将对落地位置所在区域范围内的对象产生生命伤害,使其生命值减小。具体过程可以参考上述示例。
通过本申请提供的实施例,在目标对象并未位于目标射击道具的攻击范围的情况下,则可以检测目标射击道具射出的道具对象是否与除目标对象之外的其他参考对象发生碰撞,或是否落地,从而根据该检测结果来控制目标射击道具实现爆炸,以对相应的区域范围内的对象产生生命值的影响。
在一些实施例中,对全部对象各自的生命值进行调整处理包括:
S1,确定全部对象各自相对中心的距离;
S2,按照距离调整生命值,其中,距离与生命值的变化幅度为负相关。
例如结合图10进行说明:在目标射击道具射出的道具对象在相应区域范围(如目标区域范围、第一参考区域范围或第二参考区域范围)内发生爆炸时,则根据当前在该区域范围内的各个对象与该区域范围的中心之间的距离,来确定各个对象的生命值的变化幅度。假设如图10所示,上述区域范围内包括四个对象,与中心位置的距离分别为r1、r2、r3和r4,则各自对应的生命值的减小幅度分别为L1、L2、L3和L4,其中,r1<r2<r3<r4,对应的L1>L2>L3>L4,其中生命值的减小幅度与距离的变化幅度呈比例。例如,r1/r2=L1/L2。这里为示例,本实施例中对此不做任何限定。
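A minimal sketch (not the patent's formula) of the hit-point adjustment in FIG. 10: every object inside the blast area loses hit points, and the farther an object is from the blast center, the smaller the loss. The linear falloff, the radius, and the damage cap are assumptions, and the objects are assumed to expose x, y and hp attributes.

```python
import math

def apply_area_damage(objects, center_x: float, center_y: float,
                      radius: float = 8.0, max_damage: float = 100.0) -> None:
    for obj in objects:
        dist = math.hypot(obj.x - center_x, obj.y - center_y)
        if dist <= radius:
            damage = max_damage * (1.0 - dist / radius)   # closer -> bigger HP drop
            obj.hp = max(0.0, obj.hp - damage)
```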
通过本申请提供的实施例,按照区域范围内各个对象到中心的距离,来确定各个对象对应的生命值的调整幅度,从而对区域范围内的不同位置的对象进行生命伤害,从而达到仿真的目的。
在一些实施例中,在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作包括:
S1,当检测到目标时间段内目标射击道具的瞄准方向未改变时,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作;
S2,当检测到目标时间段内目标射击道具的瞄准方向发生改变时,更新目标射击道具每一次执行射击动作之前所瞄准的目标对象,得到更新后的目标对象,并通过目标射 击道具向更新后的目标对象所在的更新后的目标区域范围执行射击动作。
需要说明的是,在本实施例中,在每次触发目标射击道具之后,该目标射击道具都将在目标时间段内连续执行多次射击动作,而无需在每次射击动作之前再执行额外的道具装备操作,如此,简化了操作难度,而且增加了射击次数,从而达到了提高射击控制效率的效果。
此外,在本实施例中,如果当前使用目标射击道具的虚拟角色并未发生转向,则该目标射击道具所瞄准的目标对象也将不会发生变化,对应的,将会对该目标对象所在的目标区域范围内执行连续多次射击动作。需要说明的是,目标射击道具连续多次射击动作之间有一定时间间隔,因而如果当前使用目标射击道具的虚拟角色在此期间发生转向,则对应的目标射击道具的连续多次射击动作所瞄准的目标对象也将随之发生变化,从而无需重新再次装备这类操作复杂的目标射击道具,就可以对不同方向上的目标对象实现远程大范围的攻击,以提高射击道具的应用范围,还将缩短射击任务的时长,提高射击任务的获胜概率。
例如结合图11所示进行说明:假设目标射击道具第一次射击动作所瞄准的目标对象为如图11中的(a)所示的对象1102,则基于对象1102确定目标区域范围,并使得目标射击道具对该目标区域范围的对象产生伤害。之后在执行第二次射击动作时调整了瞄准方向,则所瞄准的目标对象也对应调整为如图11中的(b)所示的对象1104,则基于对象1104重新确定目标区域范围,并对更新后的目标区域范围内的对象产生伤害。
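A minimal sketch (assumed helpers, not the patent's API) of the behaviour in FIG. 11: the aimed target is re-resolved before each shot of the burst, so turning the character between shots retargets the remaining shots.

```python
import time

def fire_burst(prop, scene, shots: int = 3, interval_s: float = 0.6) -> None:
    for _ in range(shots):
        target = scene.object_under_crosshair(prop)   # re-check the current aim direction
        prop.shoot_at_area(target.area())              # damage the (possibly updated) area
        time.sleep(interval_s)
```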
通过本申请提供的实施例,在检测到目标时间段内目标射击道具的瞄准方向发生改变的情况下,更新目标射击道具每一次执行射击动作之前所瞄准的目标对象,得到更新后的目标对象及更新后的目标区域范围,从而对攻击范围较大的目标射击道具的攻击范围实现了进一步扩展,以对不同方向上的对象实现远距离的大范围伤害。
在一些实施例中,在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作,包括:
S1,当检测到目标时间段内目标射击道具所在位置并未改变时,向目标对象所在的目标区域范围执行连续多次的射击动作;
S2,当检测到目标时间段内目标射击道具所在位置发生改变时,根据目标射击道具每一次执行射击动作之前所在位置,更新每一次执行射击动作之前所瞄准的目标对象,并通过目标射击道具向更新后的目标对象所在的更新后的目标区域范围执行射击动作。
需要说明的是,如果当前使用目标射击道具的虚拟角色并未发生移动,则该目标射击道具所瞄准的目标对象也将不会发生变化,对应的,将会对该目标对象所在的目标区域范围内执行连续多次射击动作。
此外,目标射击道具连续多次射击动作之间有一定时间间隔,因而如果当前使用目标射击道具的虚拟角色在此期间发生移动,则对应的目标射击道具的连续多次射击动作所瞄准的目标对象也将随之发生变化。从而无需重新再次装备这类操作复杂的目标射击道具,就可以对移动过程中不同的目标对象实现远程大范围的攻击,以提高射击道具的应用范围,从而还将缩短射击任务的时长,提高射击任务的获胜概率。
通过本申请提供的实施例,在检测到目标时间段内目标射击道具的瞄准方向发生改变的情况下,更新目标射击道具每一次执行射击动作之前所瞄准的目标对象,得到更新后的目标对象及更新后的目标区域范围,从而对攻击范围较大的目标射击道具的攻击范围实现了进一步扩展,以对移动过程中不同的目标对象实现远程大范围的攻击。
在一些实施例中,在获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作之前,还包括:
S1,获取道具配置指令;
S2,响应于道具配置指令,显示射击任务中用于提供射击道具的配置界面,其中,在配置界面包括为目标射击道具配置的目标槽位;
S3,当获取到对目标槽位执行的选择操作时,将目标射击道具添加至射击任务中,并将目标射击道具对应的道具触发图标显示在射击任务的操作区域中。
例如结合图12所示进行说明:假设获取到在射击任务中触发的道具配置指令的情况下,响应该道具配置指令,显示如图12中的(a)中所示的配置界面1202,其中显示有如图所示有弓弩或枪等各类射击道具。进一步,假设获取到对目标槽位1204执行的选择操作,则将上述选中的目标槽位1204中的射击道具(如强力弓弩)添加至射击任务中,并将其道具触发图标显示在射击任务的操作区域(如射击任务所提供的虚拟对战场景中的操作区域)中,如图12中的(b)所示的图标1206。
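A minimal sketch (assumed helpers) of the FIG. 12 flow: selecting the target slot in the configuration interface adds the prop to the shooting task and shows its trigger icon in the operation area.

```python
def on_slot_selected(shooting_task, slot) -> None:
    prop = slot.prop                                   # e.g. the heavy crossbow in slot 1204
    shooting_task.add_prop(prop)                       # add the prop to the current task
    shooting_task.operation_area.show_icon(prop.trigger_icon)   # cf. icon 1206
```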
通过本申请提供的实施例,在运行射击任务之前,从为目标射击道具单独配置的目标操作(针对目标槽位的选择操作)中选择配置该目标射击道具,从而实现在射击任务中可以通过操作区域快速地调用并控制该目标射击道具执行连续多次的射击动作,从而节省了调用加载该目标射击道具的时间,提高了在射击任务中直接使用目标射击道具执行连续多次射击动作的控制效率。
在一些实施例中,在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作之后,还包括:
S1,将操作区域中目标射击道具对应的道具触发图标从第一显示状态调整为第二显示状态,其中,第一显示状态用于指示目标射击道具处于激活状态;第二显示状态用于指示目标射击道具处于冷冻状态;
S2,显示目标射击道具对应的道具触发图标在第二显示状态中的剩余时长;
S3,当剩余时长为零时,将目标射击道具对应的道具触发图标从第二显示状态调整为第一显示状态。
需要说明的是,在进入射击任务后,目标射击道具的初始状态为尚未激活的冷冻状态,在等到使用冷却时间结束后,将调整该目标射击道具解除冷冻状态,进入激活状态,以使得该目标射击道具允许被触发调用,执行射击动作。此外,在目标射击道具完成一组射击动作(一组射击动作中射击动作的数量可以根据实际应用场景进行设定)之后,也将重新再次进入冷冻状态,直至使用冷却时间结束后才可以再次进入激活状态。在上述状态变化的同时,上述目标射击道具的道具触发图标也将对应调整显示状态,以便于直观的提示玩家当前目标射击道具的状态。
在一些实施例中,在目标射击道具处于冷冻状态时,还将对应显示使用冷却时间的剩余时长,如以数字倒计时或进度条的形式呈现展示。
例如结合图13所示进行说明:在目标射击道具完成一组连续多次的射击动作之后,将同时调整道具触发图标的显示状态,即,从第一显示状态调整为第二显示状态,其中,第一显示状态用于指示目标射击道具处于激活状态,而第二显示状态用于指示目标射击道具处于冷冻状态,在该冷冻状态下,目标射击道具在收到触发操作后将没有响应。
进一步,在目标射击道具处于冷冻状态下,其道具触发图标将同时通过进度条1302显示其在该状态下的剩余时长,如图13中的(a)中所示。在剩余时长为零的情况下,则表示目标射击道具从冷冻状态切换为激活状态,同时其道具触发图标也从第二显示状态调整为第一显示状态,如图13中的(b)所示的图标1304。
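A minimal sketch (not from the patent) of the icon state switching in FIG. 13: after a burst the icon enters the frozen display state with a countdown, and returns to the active display state when the remaining time reaches zero; the 30-second cooldown is an assumed value.

```python
class PropTriggerIcon:
    ACTIVE, FROZEN = "first display state", "second display state"

    def __init__(self, cooldown_s: float = 30.0):
        self.cooldown_s = cooldown_s
        self.remaining_s = 0.0                  # 0 means the prop can be invoked

    def on_burst_finished(self) -> None:
        self.remaining_s = self.cooldown_s      # switch to the frozen display state

    def tick(self, dt: float) -> str:
        self.remaining_s = max(0.0, self.remaining_s - dt)
        if self.remaining_s == 0.0:
            return self.ACTIVE
        return f"{self.FROZEN} ({self.remaining_s:.0f}s left)"   # e.g. progress bar 1302
```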
通过本申请提供的实施例,通过目标射击道具的道具触发图标的显示状态来指示该目标射击道具的状态,以便于在紧张的射击任务过程中,可以直观地提示用户玩家是否可以调用该目标射击道具执行射击动作,从而达到简化控制操作的目的。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列 的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于示例实施例,所涉及的动作和模块并不一定是本申请所必须的。
本申请实施例还提供了一种用于实施上述道具控制方法的道具控制装置。如图14所示,该装置包括:获取单元1402,配置为获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,目标射击道具的攻击范围覆盖虚拟对战场景中的部分区域范围;确定单元1404,配置为响应于触发操作,确定目标射击道具所瞄准的目标对象;控制单元1406,配置为在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作。
在本实施例中,上述道具控制装置的实施例可以参考上述道具控制方法的实施例。
本申请实施例还提供了一种用于实施上述道具控制方法的电子设备,该电子设备可以是图1所示的终端设备或服务器。本实施例以该电子设备为终端设备为例来说明。如图15所示,该电子设备包括存储器1502和处理器1504,该存储器1502中存储有计算机程序,该处理器1504被设置为通过计算机程序执行本申请实施例提供的道具控制方法。
在一些实施例中,上述电子设备可以位于计算机网络的多个网络设备中的至少一个网络设备。
在一些实施例中,上述处理器可以被设置为通过计算机程序执行以下步骤:获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,目标射击道具的攻击范围覆盖虚拟对战场景中的部分区域范围;响应于触发操作,确定目标射击道具所瞄准的目标对象;在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作。
本领域普通技术人员可以理解,图15所示的结构仅为示意,电子设备也可以是智能手机(如Android手机、iOS手机等)、平板电脑、掌上电脑以及MID、PAD等终端设备。图15并不对上述电子设备的结构造成限定。例如,电子设备还可包括比图15中所示更多或者更少的组件(如网络接口等),或者具有与图15所示不同的配置。
其中,存储器1502可用于存储软件程序以及模块,如本申请实施例中道具控制方法和装置对应的程序指令/模块,处理器1504通过运行存储在存储器1502内的软件程序以及模块,从而执行各种功能应用以及数据处理,即实现上述的道具控制方法。存储器1502可包括高速随机存储器,还可以包括非易失性存储器,如一个或者多个磁性存储装置、闪存、或者其他非易失性固态存储器。在一些实施例中,存储器1502可进一步包括相对于处理器1504远程设置的存储器,这些远程存储器可以通过网络连接至终端。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。其中,存储器1502可以但不限于用于存储虚拟对象的对象属性信息及射击动作对应的画面资源(对战资源)等信息。作为一种示例,如图15所示,上述存储器1502中可以但不限于包括上述道具控制装置中的获取单元1402、确定单元1404及控制单元1406。此外,还可以包括但不限于上述道具控制装置中的其他模块单元。
在一些实施例中,上述的传输装置1506用于经由一个网络接收或者发送数据。上述的网络实例可包括有线网络及无线网络。在一些实施例中,传输装置1506包括一个网络适配器(Network Interface Controller,NIC),其可通过网线与其他网络设备与路由器相连从而可与互联网或局域网进行通讯。在一些实施例中,传输装置1506为射频(Radio Frequency,RF)模块,其用于通过无线方式与互联网进行通讯。
此外,上述电子设备还包括:显示器1508,用于显示上述虚拟对战场景及其中的虚 拟对象和产生的射击过程画面;连接总线1510,用于连接上述电子设备中的各个模块部件。
在其他实施例中,上述终端设备或者服务器可以是一个分布式系统中的一个节点,其中,该分布式系统可以为区块链系统,该区块链系统可以是由该多个节点通过网络通信的形式连接形成的分布式系统。其中,节点之间可以组成点对点(P2P,Peer To Peer)网络,任意形式的计算设备,比如服务器、终端等电子设备都可以通过加入该点对点网络而成为该区块链系统中的一个节点。
本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述道具控制方法。
在一些实施例中,上述计算机可读的存储介质可以被设置为存储用于执行以下步骤的计算机程序:获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,目标射击道具的攻击范围覆盖虚拟对战场景中的部分区域范围;响应于触发操作,确定目标射击道具所瞄准的目标对象;在目标时间段内,通过目标射击道具向目标对象所在的目标区域范围执行连续多次的射击动作。
在一些实施例中,本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令终端设备相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,存储介质可以包括:闪存盘、只读存储器(Read-Only Memory,ROM)、随机存取器(Random Access Memory,RAM)、磁盘或光盘等。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
上述实施例中的集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在上述计算机可读取的存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在存储介质中,包括若干指令用以使得一台或多台计算机设备(可为个人计算机、服务器或者网络设备等)执行本申请各个实施例方法的全部或部分步骤。
在本申请的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的实施例中,应该理解到,所揭露的客户端,可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
以上仅是本申请的示例实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本申请原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本申请的保护范围。

Claims (16)

  1. 一种道具控制方法,由电子设备执行,所述方法包括:
    获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,所述目标射击道具的攻击范围覆盖所述虚拟对战场景中的部分区域范围;
    响应于所述触发操作,确定所述目标射击道具所瞄准的目标对象;
    在目标时间段内,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作。
  2. 根据权利要求1所述的方法,其中,所述响应于所述触发操作,确定所述目标射击道具所瞄准的目标对象,包括:
    响应于所述触发操作,获取所述触发操作的操作属性信息;
    将当前所显示的所述目标射击道具在瞄准方向上的准心,调整为与所述操作属性信息相匹配的目标尺寸大小;
    将所述准心选中的对象确定为所述目标对象。
  3. 根据权利要求2所述的方法,其中,在所述将当前所显示的所述目标射击道具在瞄准方向上的准心,调整为与所述操作属性信息相匹配的目标尺寸大小的过程中,还包括:
    获取与所述操作属性信息相匹配的射击参数,其中,所述射击参数包括所述目标射击道具射出的道具对象的射击初速度和初始重力加速度;
    按照所述射击参数确定所述道具对象在所述瞄准方向上的射击飞行轨迹,其中,所述射击飞行轨迹在无碰撞时的飞行距离与所述准心的所述目标尺寸大小为负相关。
  4. 根据权利要求3所述的方法,其中,所述获取与所述操作属性信息相匹配的射击参数,包括:
    当所述操作属性信息为操作时长时,获取与所述操作时长相匹配的所述射击参数,其中,所述操作时长与所述射击初速度为正相关,所述操作时长与所述初始重力加速度为负相关,所述操作时长包括按压操作的按压时长、手势操作的手势保持时长、以及姿态触发操作的保持时长中的任意一种;或者
    当所述操作属性信息为按压操作的按压压力时,获取与所述按压压力相匹配的所述射击参数,其中,所述按压压力与所述射击初速度为正相关,所述按压压力与所述初始重力加速度为负相关;或者
    当所述操作属性信息为滑动操作的滑动距离时,获取与所述滑动距离相匹配的所述射击参数,其中,以第一滑动方向执行的滑动操作得到的所述滑动距离与所述射击初速度为正相关,且与所述初始重力加速度为负相关;以第二滑动方向执行的滑动操作得到的所述滑动距离与所述射击初速度为负相关,且与所述初始重力加速度为正相关;或者
    当所述操作属性信息为手势操作的手势移动距离时,获取与所述手势移动距离相匹配的所述射击参数,其中,以第一手势方向执行的手势操作得到的所述手势移动距离与所述射击初速度为正相关,且与所述初始重力加速度为负相关;以第二手势方向执行的手势操作得到的所述手势移动距离与所述射击初速度为负相关,且与所述初始重力加速度为正相关。
  5. 根据权利要求2所述的方法,其中,所述将当前所显示的所述目标射击道具在瞄准方向上的准心,调整为与所述操作属性信息相匹配的目标尺寸大小,包括:
    当所述操作属性信息为操作时长时,将当前所显示的所述目标射击道具在瞄准方向上的准心,调整为与所述操作时长相匹配的目标尺寸大小,其中,所述操作时长与所述目标尺寸大小为负相关,所述操作时长包括按压操作的按压时长、手势操作的手势保持时长、以及姿态触发操作的保持时长中的任意一种;或者
    当所述操作属性信息为滑动操作的滑动距离时,将当前所显示的所述目标射击道具在瞄准方向上的准心,调整为与所述滑动距离相匹配的目标尺寸大小,其中,以第一滑动方向执行的滑动操作得到的所述滑动距离与所述目标尺寸大小为正相关;以第二滑动方向执行的滑动操作得到的所述滑动距离与所述目标尺寸大小为负相关;或者
    当所述操作属性信息为手势操作的手势移动距离时,将当前所显示的所述目标射击道具在瞄准方向上的准心,调整为与所述手势移动距离相匹配的目标尺寸大小,其中,以第一手势方向执行的手势操作得到的所述手势移动距离与所述目标尺寸大小为正相关;以第二手势方向执行的手势操作得到的所述手势移动距离与所述目标尺寸大小为负相关。
  6. 根据权利要求1所述的方法,其中,所述通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作之后,还包括:
    当所述目标对象位于所述目标射击道具的攻击范围、且所述目标射击道具射出的道具对象与所述目标对象发生碰撞时,对所述目标对象所在的目标区域范围内的全部对象各自的生命值进行调整处理;
    其中,调整后的生命值小于调整前的生命值,所述目标区域范围是以所述目标对象所在位置为中心、且以目标距离为半径得到的区域范围。
  7. 根据权利要求6所述的方法,其中,所述对所述目标对象所在的目标区域范围内的全部对象各自的生命值进行调整处理,包括:
    确定所述全部对象各自相对于所述目标区域范围的中心的距离;
    按照所述距离对所述全部对象各自的生命值进行调整处理,其中,所述距离与所述生命值的变化幅度为负相关。
  8. 根据权利要求1所述的方法,其中,所述通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作之后,还包括:
    当所述目标对象并未位于所述目标射击道具的攻击范围、且所述目标射击道具射出的道具对象与所述目标射击道具的攻击范围内的第一参考对象发生碰撞时,对所述第一参考对象所在的第一参考区域范围内的全部对象各自的生命值进行调整处理;
    其中,调整后的生命值小于调整前的生命值,所述第一参考对象相对所述目标射击道具的方向与所述目标对象相对所述目标射击道具的方向相同,所述第一参考区域范围是以所述第一参考对象所在位置为中心、且以第一参考距离为半径得到的区域范围;
    当所述目标对象并未位于所述目标射击道具的攻击范围、且所述目标射击道具射出的道具对象与所述目标射击道具的攻击范围内的任意一个对象均未发生碰撞时,对第二参考区域范围内的全部对象各自的生命值进行调整处理;
    其中,调整后的生命值小于调整前的生命值,所述第二参考区域范围是以所述道具对象的落地位置为中心、且以第二参考距离为半径得到的区域范围。
  9. 根据权利要求1所述的方法,其中,所述在目标时间段内,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作,包括:
    当检测到所述目标时间段内所述目标射击道具的瞄准方向未改变时,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作;
    当检测到所述目标时间段内所述目标射击道具的瞄准方向发生改变时,更新所述目标射击道具每一次执行射击动作之前所瞄准的目标对象,并通过所述目标射击道具向更新后的目标对象所在的更新后的目标区域范围执行所述射击动作。
  10. 根据权利要求1所述的方法,其中,所述在目标时间段内,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作,包括:
    当检测到所述目标时间段内所述目标射击道具所在位置并未改变时,向所述目标对象所在的目标区域范围执行连续多次的射击动作;
    当检测到所述目标时间段内所述目标射击道具所在位置发生改变时,根据所述目标射击道具每一次执行射击动作之前所在位置,更新每一次执行射击动作之前所瞄准的目标对象,并通过所述目标射击道具向更新后的目标对象所在的更新后的目标区域范围执行所述射击动作。
  11. 根据权利要求1所述的方法,其中,所述虚拟对战场景运行于射击任务中;所述获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作之前,还包括:
    获取道具配置指令;
    响应于所述道具配置指令,显示所述射击任务中用于提供射击道具的配置界面,其中,所述配置界面包括为所述目标射击道具配置的目标槽位;
    当获取到对所述目标槽位执行的选择操作时,将所述目标射击道具添加至所述射击任务中,并将所述目标射击道具对应的道具触发图标显示在所述射击任务的操作区域中。
  12. 根据权利要求11所述的方法,其中,所述通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作之后,还包括:
    将所述操作区域中所述目标射击道具对应的所述道具触发图标从第一显示状态调整为第二显示状态,其中,所述第一显示状态用于指示所述目标射击道具处于所述激活状态;所述第二显示状态用于指示所述目标射击道具处于冷冻状态;
    显示所述目标射击道具对应的所述道具触发图标在所述第二显示状态中的剩余时长;
    当所述剩余时长为零时,将所述目标射击道具对应的所述道具触发图标从所述第二显示状态调整为所述第一显示状态。
  13. 根据权利要求12所述的方法,其中,所述显示所述目标射击道具对应的所述道具触发图标在所述第二显示状态中的剩余时长,包括:
    根据倒计时提示的方式,显示所述目标射击道具对应的所述道具触发图标在所述第二显示状态中的剩余时长;
    其中,所述倒计时提示的方式包括读秒倒计时及进度条倒计时中的至少之一。
  14. 一种道具控制装置,包括:
    获取单元,配置为获取对虚拟对战场景中处于激活状态的目标射击道具执行的触发操作,其中,所述目标射击道具的攻击范围覆盖所述虚拟对战场景中的部分区域范围;
    确定单元,配置为响应于所述触发操作,确定所述目标射击道具所瞄准的目标对象;
    控制单元,配置为在目标时间段内,通过所述目标射击道具向所述目标对象所在的目标区域范围执行连续多次的射击动作。
  15. 一种计算机可读的存储介质,所述计算机可读的存储介质包括存储的计算机程序,其中,所述计算机程序运行时执行权利要求1至13任一项所述的方法。
  16. 一种电子设备,包括存储器和处理器,所述存储器中存储有计算机程序,所述处理器用于通过所述计算机程序执行权利要求1至13任一项所述的方法。
PCT/CN2021/098687 2020-07-10 2021-06-07 道具控制方法和装置、存储介质及电子设备 WO2022007569A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2022572422A JP7419568B2 (ja) 2020-07-10 2021-06-07 アイテム制御方法及びその装置、コンピュータプログラム並びに電子機器
KR1020227031420A KR20220139967A (ko) 2020-07-10 2021-06-07 소품 제어 방법과 장치, 저장 매체 및 전자 디바이스
US18/046,122 US20230057421A1 (en) 2020-07-10 2022-10-12 Prop control method and apparatus, storage medium, and electronic device
JP2024001020A JP2024062977A (ja) 2020-07-10 2024-01-09 アイテム制御方法及びその装置、記憶媒体並びに電子機器

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010664321.7A CN111659118B (zh) 2020-07-10 2020-07-10 道具控制方法和装置、存储介质及电子设备
CN202010664321.7 2020-07-10

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/046,122 Continuation US20230057421A1 (en) 2020-07-10 2022-10-12 Prop control method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2022007569A1 true WO2022007569A1 (zh) 2022-01-13

Family

ID=72391658

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/098687 WO2022007569A1 (zh) 2020-07-10 2021-06-07 道具控制方法和装置、存储介质及电子设备

Country Status (5)

Country Link
US (1) US20230057421A1 (zh)
JP (2) JP7419568B2 (zh)
KR (1) KR20220139967A (zh)
CN (1) CN111659118B (zh)
WO (1) WO2022007569A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111659118B (zh) * 2020-07-10 2021-04-09 腾讯科技(深圳)有限公司 道具控制方法和装置、存储介质及电子设备
CN112138385B (zh) * 2020-10-28 2022-07-29 腾讯科技(深圳)有限公司 虚拟射击道具的瞄准方法、装置、电子设备及存储介质
AU2021307015B2 (en) * 2020-11-13 2023-06-08 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, storage medium, and electronic device
CN112619134B (zh) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 发射目标飞行距离的确定方法、装置、设备及存储介质
CN113117330B (zh) * 2021-05-20 2022-09-23 腾讯科技(深圳)有限公司 虚拟对象的技能释放方法、装置、设备及介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130079138A1 (en) * 2011-09-28 2013-03-28 Konami Digital Entertainment Co., Ltd. Game Apparatus, Game Control Method, and Computer-Readable Non-Transitory Information Recording Medium Recording Program
US20140342788A1 (en) * 2013-05-15 2014-11-20 Sin Woo Kim Method of applying multiple crosshairs and recording medium having stored thereon program for executing the same
CN107661630A (zh) * 2017-08-28 2018-02-06 网易(杭州)网络有限公司 一种射击游戏的控制方法及装置、存储介质、处理器、终端
CN107773987A (zh) * 2017-10-24 2018-03-09 网易(杭州)网络有限公司 虚拟射击主体控制方法、装置、电子设备及存储介质
CN108159704A (zh) * 2017-12-28 2018-06-15 天脉聚源(北京)科技有限公司 一种手势操作处理方法及装置
CN111359215A (zh) * 2020-03-08 2020-07-03 北京智明星通科技股份有限公司 一种射击类游戏的控制方法、系统及设备
CN111659118A (zh) * 2020-07-10 2020-09-15 腾讯科技(深圳)有限公司 道具控制方法和装置、存储介质及电子设备

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130112586A (ko) * 2012-04-04 2013-10-14 주식회사 드래곤플라이 게임 장치 및 그 제어 방법
CN107930115A (zh) * 2017-09-13 2018-04-20 同济大学 一种用于射击游戏中的蓄力射击瞄准方法
CN110170168B (zh) * 2019-05-30 2022-05-27 腾讯科技(深圳)有限公司 虚拟对象射击控制方法、装置、电子设备及存储介质
CN110465098B (zh) * 2019-08-08 2020-09-25 腾讯科技(深圳)有限公司 控制虚拟对象使用虚拟道具的方法、装置、设备及介质
CN110448891B (zh) * 2019-08-08 2021-06-25 腾讯科技(深圳)有限公司 控制虚拟对象操作远程虚拟道具的方法、装置及存储介质
CN110465087B (zh) * 2019-08-23 2020-12-01 腾讯科技(深圳)有限公司 虚拟物品的控制方法、装置、终端及存储介质
CN110721468B (zh) * 2019-09-30 2020-09-15 腾讯科技(深圳)有限公司 互动道具控制方法、装置、终端及存储介质
CN110694261B (zh) * 2019-10-21 2022-06-21 腾讯科技(深圳)有限公司 控制虚拟对象进行攻击的方法、终端及存储介质
CN111097170B (zh) * 2019-12-11 2022-11-22 腾讯科技(深圳)有限公司 吸附框的调整方法和装置、存储介质及电子装置
CN111104021B (zh) * 2019-12-19 2022-11-08 腾讯科技(深圳)有限公司 虚拟道具的控制方法和装置、存储介质及电子装置
CN111111219B (zh) * 2019-12-19 2022-02-25 腾讯科技(深圳)有限公司 虚拟道具的控制方法和装置、存储介质及电子装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130079138A1 (en) * 2011-09-28 2013-03-28 Konami Digital Entertainment Co., Ltd. Game Apparatus, Game Control Method, and Computer-Readable Non-Transitory Information Recording Medium Recording Program
US20140342788A1 (en) * 2013-05-15 2014-11-20 Sin Woo Kim Method of applying multiple crosshairs and recording medium having stored thereon program for executing the same
CN107661630A (zh) * 2017-08-28 2018-02-06 网易(杭州)网络有限公司 一种射击游戏的控制方法及装置、存储介质、处理器、终端
CN107773987A (zh) * 2017-10-24 2018-03-09 网易(杭州)网络有限公司 虚拟射击主体控制方法、装置、电子设备及存储介质
CN108159704A (zh) * 2017-12-28 2018-06-15 天脉聚源(北京)科技有限公司 一种手势操作处理方法及装置
CN111359215A (zh) * 2020-03-08 2020-07-03 北京智明星通科技股份有限公司 一种射击类游戏的控制方法、系统及设备
CN111659118A (zh) * 2020-07-10 2020-09-15 腾讯科技(深圳)有限公司 道具控制方法和装置、存储介质及电子设备

Also Published As

Publication number Publication date
JP2023528327A (ja) 2023-07-04
US20230057421A1 (en) 2023-02-23
CN111659118B (zh) 2021-04-09
JP7419568B2 (ja) 2024-01-22
CN111659118A (zh) 2020-09-15
KR20220139967A (ko) 2022-10-17
JP2024062977A (ja) 2024-05-10

Similar Documents

Publication Publication Date Title
WO2022007569A1 (zh) 道具控制方法和装置、存储介质及电子设备
WO2019179294A1 (zh) 虚拟环境对战中的装备显示方法、装置、设备及存储介质
WO2021143260A1 (zh) 虚拟道具的使用方法、装置、计算机设备及存储介质
CN108310765B (zh) 图像的显示方法和装置、存储介质、电子装置
US20220379219A1 (en) Method and apparatus for controlling virtual object to restore attribute value, terminal, and storage medium
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
CN110507990B (zh) 基于虚拟飞行器的互动方法、装置、终端及存储介质
US20220226727A1 (en) Method and apparatus for displaying virtual item, device, and storage medium
CN110876849B (zh) 虚拟载具的控制方法、装置、设备及存储介质
WO2022199017A1 (zh) 虚拟道具的显示方法、装置、电子设备及存储介质
US11786817B2 (en) Method and apparatus for operating virtual prop in virtual environment, device and readable medium
WO2021135525A1 (zh) 虚拟道具获取方法、装置、存储介质及电子装置
CN114432701A (zh) 基于虚拟场景的射线显示方法、装置、设备以及存储介质
WO2023065962A1 (zh) 信息确定方法、装置、设备及存储介质
CN112755524B (zh) 虚拟目标展示方法、装置、电子设备及存储介质
CN113599822B (zh) 虚拟道具的控制方法和装置、存储介质及电子设备
CN112717394A (zh) 瞄准标记的显示方法、装置、设备及存储介质
CN111135567B (zh) 虚拟道具的操作方法和装置、存储介质及电子装置
JP5270787B1 (ja) フライトゲームシステム、同システムにおける方法、同方法を実行するためのプログラム及び同プログラムを記憶する情報記録媒体
US20220297003A1 (en) Method and apparatus for displaying virtual item, electronic device, and storage medium
CN114404933A (zh) 道具控制方法和装置、存储介质及电子设备
CN117547823A (zh) 虚拟道具控制方法、装置、计算机设备及存储介质
JP2014087587A (ja) フライトゲームシステム、同システムにおける方法、同方法を実行するためのプログラム及び同プログラムを記憶する情報記録媒体
CN114452648A (zh) 虚拟道具的操作方法和装置、存储介质及电子设备
CN118022330A (zh) 虚拟对象的互动方法、装置、设备、介质及程序产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21837019

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20227031420

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2022572422

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06-06-2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21837019

Country of ref document: EP

Kind code of ref document: A1