WO2021017784A1 - Method, apparatus, terminal, and storage medium for controlling a virtual object - Google Patents

Method, apparatus, terminal, and storage medium for controlling a virtual object

Info

Publication number
WO2021017784A1
WO2021017784A1 (PCT/CN2020/100906, CN2020100906W)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
firearm
fire
firing
mirror
Prior art date
Application number
PCT/CN2020/100906
Other languages
English (en)
French (fr)
Inventor
刘智洪
杨金昊
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority to SG11202110875SA priority Critical patent/SG11202110875SA/en
Priority to KR1020217033554A priority patent/KR102635988B1/ko
Priority to JP2021550057A priority patent/JP2022522699A/ja
Publication of WO2021017784A1 publication Critical patent/WO2021017784A1/zh
Priority to US17/471,980 priority patent/US20210402287A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: . by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/422: . . automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: . Shooting of targets
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: . characterised by their sensors, purposes or types
    • A63F 13/214: . . for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145: . . . the surface being also a display device, e.g. touch screens
    • A63F 13/30: Interconnection arrangements between game servers and game devices; interconnection arrangements between game devices; interconnection arrangements between game servers
    • A63F 13/35: . Details of game servers
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: . involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537: . . using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/70: Game security or game management aspects
    • A63F 13/79: . involving player-related data, e.g. identities, accounts, preferences or play histories
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: . Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: . . Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: . . . using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: . . . . using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: . . . . . for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: . . . . . by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30: . characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/303: . . for displaying additional data, e.g. simulating a Head Up Display
    • A63F 2300/308: . . Details of the user interface
    • A63F 2300/80: . specially adapted for executing a specific type of game
    • A63F 2300/8076: . . Shooting

Definitions

  • the embodiments of the present application relate to the field of computer and Internet technologies, and in particular, to a method, device, terminal, and storage medium for controlling virtual objects.
  • players can control the virtual firearm held by a virtual object to shoot in the game scene provided by a game match, and defeat enemy virtual objects to win the match.
  • in the related art, however, the operation sequence for controlling a virtual object to shoot is relatively complicated and inefficient.
  • the embodiments of the present application provide a method, device, terminal, and storage medium for controlling virtual objects.
  • the technical solution is as follows:
  • an embodiment of the present application provides a method for controlling a virtual object, which is applied to a mobile terminal, and the method includes:
  • displaying a user interface, where the user interface includes a first fire button, and the first fire button is an operation control for triggering both scope opening and firing;
  • when a trigger signal corresponding to the first fire button is received, controlling the virtual firearm held by the virtual object to enter a scoped state, where the scoped state refers to the state of observing the virtual environment through the virtual sight equipped on the virtual firearm;
  • if a firing condition is satisfied, controlling the virtual firearm to shoot while it is in the scoped state.
  • an embodiment of the present application provides a virtual object control device, the device includes:
  • an interface display module, configured to display a user interface, where the user interface includes a first fire button, and the first fire button is an operation control for triggering both scope opening and firing;
  • a scope-opening control module, configured to control the virtual firearm held by the virtual object to enter a scoped state when a trigger signal corresponding to the first fire button is received, where the scoped state refers to the state of observing the virtual environment through the virtual sight equipped on the virtual firearm;
  • a condition detection module, configured to detect whether a firing condition is met;
  • a shooting control module, configured to, if the firing condition is satisfied, control the virtual firearm to shoot while the virtual firearm is in the scoped state.
  • an embodiment of the present application provides a mobile terminal.
  • the mobile terminal includes a processor and a memory.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the above method for controlling a virtual object.
  • an embodiment of the present application provides a computer-readable storage medium that stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the above method for controlling a virtual object.
  • an embodiment of the present application provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to execute the above-mentioned method for controlling a virtual object.
  • by providing, in the user interface, a first fire button for triggering both scope opening and firing, the virtual firearm held by the virtual object is controlled to enter the scoped state when a trigger signal corresponding to the first fire button is received, and the virtual firearm is controlled to shoot while in the scoped state. This realizes one-button scope-and-fire: a single operation completes both scope opening and firing, instead of two separate steps, fully improving operation efficiency.
  • Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
  • Fig. 3 is a flowchart of a method for controlling a virtual object provided by an embodiment of the present application.
  • Fig. 4 exemplarily shows a schematic diagram of the first fire button.
  • Fig. 5 exemplarily shows a schematic diagram of a setting interface.
  • FIG. 6 is a flowchart of a method for controlling virtual objects provided by another embodiment of the present application.
  • Fig. 7 exemplarily shows a schematic diagram of the second fire button.
  • FIG. 8 is a flowchart of a method for controlling virtual objects provided by another embodiment of the present application.
  • Figure 9 is a block diagram of a virtual object control device provided by an embodiment of the present application.
  • FIG. 10 is a block diagram of a virtual object control device provided by another embodiment of the present application.
  • Fig. 11 is a structural block diagram of a mobile terminal provided by an embodiment of the present application.
  • Virtual objects are virtual characters controlled by user accounts in an application. Taking a game application as an example, a virtual object refers to a game character controlled by a user account in the game application.
  • the virtual object may be in the form of characters, animals, cartoons, or other forms, which are not limited in the embodiment of the present application.
  • the virtual object can be displayed in a three-dimensional form or in a two-dimensional form, which is not limited in the embodiment of the present application.
  • the operations that the user account can perform to control the virtual object may also be different.
  • user accounts can control virtual objects to perform operations such as shooting, running, jumping, picking up firearms, changing firearms, and adding bullets to firearms.
  • Virtual firearms refer to virtual items that can simulate real firearms for shooting.
  • the virtual firearm can be a three-dimensional model of the real firearm, and the virtual object can carry the virtual firearm and control the virtual firearm to shoot at a certain target.
  • Virtual firearms can include many different types of firearms, such as rifles, submachine guns, machine guns, shotguns, pistols, etc.
  • Firearms categories can be divided according to actual needs.
  • rifles can also be subdivided into different categories such as assault rifles and sniper rifles
  • machine guns can also be subdivided into different categories such as light machine guns and heavy machine guns.
  • a virtual sight refers to a virtual item equipped on a virtual firearm for assisting observation of the virtual environment.
  • the virtual sight may be a three-dimensional model of a real sight. After the virtual firearm is equipped with a virtual sight, it can enter a scoped state, which refers to the state of observing the virtual environment through the virtual sight.
  • the virtual sight can include scopes of different magnifications (such as 2x, 3x, 4x, 6x, 8x, etc.), laser sights, red dot sights, holographic sights, and so on.
  • FIG. 1 shows a schematic diagram of an implementation environment provided by an embodiment of the present application.
  • the implementation environment may include: a mobile terminal 10 and a server 20.
  • the mobile terminal 10 may be a portable electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, and a wearable device.
  • the mobile terminal 10 can install a client terminal of a game application program, such as a client terminal of a shooting game application program.
  • the server 20 is used to provide background services for clients of applications (such as game applications) in the mobile terminal 10.
  • the server 20 may be a background server of the above-mentioned application program (such as a game application program).
  • the server 20 may be a server, a server cluster composed of multiple servers, or a cloud computing service center.
  • the mobile terminal 10 and the server 20 can communicate with each other through the network 30.
  • the network 30 may be a wired network or a wireless network.
  • the execution subject of each step may be a mobile terminal.
  • FIG. 2 shows a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
  • the mobile terminal 10 may include: a main board 110, an external output/input device 120, a memory 130, an external interface 140, a touch control system 150, and a power supply 160.
  • the motherboard 110 integrates processing elements such as a processor and a controller.
  • the external output/input device 120 may include a display component (such as a display screen), a sound playback component (such as a speaker), a sound collection component (such as a microphone), various keys, and so on.
  • the memory 130 stores program codes and data.
  • the external interface 140 may include an earphone interface, a charging interface, and a data interface.
  • the touch control system 150 may be integrated in the display components or keys of the external output/input device 120, and the touch control system 150 is used to detect touch operations performed by the user on the display components or keys.
  • the power supply 160 is used to supply power to other components in the mobile terminal 10.
  • the processor on the motherboard 110 can generate a user interface (such as a game interface) by executing or calling the program codes and data stored in the memory, and present the generated user interface through the external output/input device 120.
  • the touch control system 150 can detect a touch operation performed when the user interacts with the user interface (such as a game interface), and respond to the touch operation.
  • in the related art, the user interface of a shooting game provides a separate scope button and a separate fire button.
  • when the user clicks the scope button, the virtual firearm held by the virtual object is controlled to enter the scoped state.
  • in the scoped state, the player can observe the virtual environment through the virtual sight equipped on the virtual firearm.
  • when the user clicks the fire button, the virtual firearm held by the virtual object is controlled to shoot.
  • in contrast, by providing, in the user interface, a first fire button for triggering both scope opening and firing, the virtual firearm held by the virtual object is controlled to enter the scoped state when a trigger signal corresponding to the first fire button is received, and the virtual firearm is controlled to shoot while in the scoped state. This realizes one-button scope-and-fire: a single operation completes both scope opening and firing, instead of two separate steps, fully improving operation efficiency.
  • FIG. 3 shows a flowchart of a method for controlling virtual objects provided by an embodiment of the present application.
  • the method can be applied to the mobile terminal introduced above, for example executed by the client of an application (such as a shooting game application) installed on the mobile terminal.
  • the method can include the following steps:
  • a user interface is displayed.
  • the user interface includes a first fire button, and the first fire button is an operation control for triggering both scope opening and firing.
  • the user interface may be a display interface of a game match.
  • the user interface is used to present the user with a virtual environment of the game match.
  • the user interface may include elements in the virtual environment, such as virtual buildings, virtual props, and virtual objects.
  • the user interface also includes some operation controls, such as buttons, sliders, icons, etc., for users to operate.
  • the user interface 40 includes a first fire button 41, and the first fire button 41 is an operation control for triggering both scope opening and firing. That is, the scope-opening function and the firing function are integrated into the same button, instead of being implemented by two different buttons, so that scope opening and firing can be completed with one key.
  • Step 302: when a trigger signal corresponding to the first fire button is received, control the virtual firearm held by the virtual object to enter a scoped state.
  • the scoped state refers to the state of observing the virtual environment through the virtual sight equipped on the virtual firearm.
  • in the scoped state, the user interface can display the virtual sight equipped on the virtual firearm, and display the virtual environment as seen through the virtual sight.
  • in some embodiments, the mobile terminal determines whether a trigger signal corresponding to the first fire button is received as follows: when a touch operation signal is received, the touch position of the touch operation signal is acquired; if the distance between the touch position and the center position of the first fire button is less than a first threshold, it is determined that a trigger signal corresponding to the first fire button is received.
  • the first threshold may be the same as the radius of the first fire button, or may be different from it, for example slightly larger or slightly smaller than the radius of the first fire button.
  • in other words, the touch position of the touch operation signal is acquired and compared with the center position of the first fire button recorded in a configuration file; the distance between the two is calculated and compared with the first threshold to determine whether a trigger signal corresponding to the first fire button is received.
  • the above configuration file may be a configuration file in JSON (JavaScript Object Notation) format.
  • in this way, a technician can modify the first fire button by changing the configuration file, without modifying the code of the entire application, which brings greater convenience to project version iteration.
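The hit-test and configuration-file lookup described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the JSON field names, coordinates, and threshold value are assumptions made for demonstration.

```python
import json
import math

# Hypothetical JSON configuration for the first fire button; in practice
# this would be loaded from a file shipped with the application.
BUTTON_CONFIG_JSON = """
{
    "first_fire_button": {"center_x": 860.0, "center_y": 420.0, "radius": 48.0},
    "first_threshold": 48.0
}
"""

config = json.loads(BUTTON_CONFIG_JSON)

def is_first_fire_button_triggered(touch_x: float, touch_y: float) -> bool:
    """Return True when a touch counts as a trigger signal.

    Implements the rule from the embodiment: compare the distance between
    the touch position and the button's center position (read from the
    configuration file) against the first threshold.
    """
    btn = config["first_fire_button"]
    distance = math.hypot(touch_x - btn["center_x"], touch_y - btn["center_y"])
    return distance < config["first_threshold"]

print(is_first_fire_button_triggered(860.0, 420.0))   # touch on the button center
print(is_first_fire_button_triggered(100.0, 100.0))   # touch far away
```

Because the center position and threshold live in data rather than code, moving or resizing the button between versions only requires editing the JSON file, matching the convenience noted above.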
  • in some embodiments, the mobile terminal can also detect whether the virtual firearm held by the virtual object supports scoped fire. If the virtual firearm supports scoped fire, the step of controlling the virtual firearm held by the virtual object to enter the scoped state is executed when the trigger signal corresponding to the first fire button is received; otherwise, if the virtual firearm does not support scoped fire, the virtual firearm can be controlled to hip-fire when the trigger signal is received. Hip fire refers to a firing method in which the shooting operation is performed directly, without the scope-opening operation.
  • in some embodiments, the mobile terminal detects whether the virtual firearm held by the virtual object supports scoped fire through a setting item corresponding to the firearm category of the virtual firearm.
  • the setting item is used to set the firing method.
  • the firing method includes scoped fire (raising the scope before shooting) and hip fire (shooting without opening the scope).
  • the virtual firearms may include a variety of different firearm categories, such as rifles, submachine guns, machine guns, shotguns, and pistols.
  • users can set the corresponding firing method for each firearm category according to actual needs. For example, for a rifle the corresponding firing method may be set to scoped fire, while for a submachine gun it may be set to hip fire.
  • in the firing-mode setting area 51 of the setting interface 50, a variety of different firearm categories are provided, such as assault rifles, submachine guns, shotguns, and sniper rifles. Each firearm category correspondingly has two setting items, namely scoped fire and hip fire, so users can flexibly set the firing method for each category. For example, as shown in Figure 5, the firing method corresponding to the assault rifle is set to scoped fire, that of the submachine gun to scoped fire, that of the shotgun to hip fire, and so on.
  • users can also set the firing method of all firearm categories with one click, for example setting all categories to scoped fire, or setting all categories to hip fire.
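The per-category setting items described above can be sketched as a simple mapping. The category names and mode identifiers below are illustrative assumptions, not values taken from the embodiment.

```python
# Firing-method identifiers (assumed names for illustration).
SCOPED_FIRE = "scoped_fire"  # raise the scope, then fire
HIP_FIRE = "hip_fire"        # fire directly without opening the scope

# One setting item per firearm category, as in the setting interface 50.
firing_modes = {
    "assault_rifle": SCOPED_FIRE,
    "submachine_gun": SCOPED_FIRE,
    "shotgun": HIP_FIRE,
    "sniper_rifle": SCOPED_FIRE,
}

def set_firing_mode(category: str, mode: str) -> None:
    """Set the firing method for a single firearm category."""
    firing_modes[category] = mode

def set_all_firing_modes(mode: str) -> None:
    """One-click setting: apply the same firing method to every category."""
    for category in firing_modes:
        firing_modes[category] = mode

def supports_scoped_fire(category: str) -> bool:
    """Used on receipt of a trigger signal to choose scoped fire or hip fire."""
    return firing_modes.get(category) == SCOPED_FIRE

set_all_firing_modes(SCOPED_FIRE)
print(supports_scoped_fire("shotgun"))  # now True after the one-click setting
```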
  • Step 303: detect whether a firing condition is met.
  • after receiving the trigger signal corresponding to the first fire button, the mobile terminal also needs to detect whether the current state meets the firing condition.
  • the firing condition refers to a preset condition under which the virtual firearm held by the virtual object is allowed to fire.
  • the foregoing firing condition may include a first firing condition set for the virtual object and a second firing condition set for the virtual firearm.
  • the mobile terminal can detect whether the virtual object meets the first firing condition; if so, it then detects whether the virtual firearm meets the second firing condition; if the virtual firearm also meets the second firing condition, it determines that the firing condition is satisfied.
  • otherwise, the mobile terminal determines that the firing condition is not satisfied.
  • in some embodiments, detecting whether the virtual object meets the first firing condition includes: acquiring state information of the virtual object, and detecting, according to the state information, whether the virtual object meets the first firing condition. The first firing condition includes, but is not limited to, at least one of the following: the virtual object is alive, the virtual object is not driving a vehicle, and the virtual object is not in water. If the virtual object does not meet the first firing condition, for example the virtual object has died or is driving a vehicle, the virtual object cannot control the virtual firearm to fire.
  • the foregoing first firing condition can be set in advance, which is not limited in the embodiments of the present application.
  • detecting whether the virtual firearm meets the second firing condition includes: acquiring status information of the virtual firearm; according to the status information of the virtual firearm, detecting whether the virtual firearm meets the second firing condition; where the second firing condition includes But it is not limited to at least one of the following: the virtual firearm has remaining bullets, and the virtual firearm does not replace bullets. If the virtual firearm does not meet the second firing condition, such as the virtual firearm has no remaining bullets or the virtual firearm is changing bullets, the virtual object cannot control the virtual firearm to fire.
  • the aforementioned second firing condition can be set in advance, which is not limited in the embodiment of the present application.
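As a rough illustration, the two-stage firing-condition check described above can be sketched as follows. This is a minimal sketch; the class and function names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    alive: bool = True
    driving_vehicle: bool = False
    in_water: bool = False

@dataclass
class VirtualFirearm:
    remaining_bullets: int = 30
    reloading: bool = False

def meets_first_firing_condition(obj: VirtualObject) -> bool:
    # First firing condition (set for the virtual object): the object must be
    # alive, not driving a vehicle, and not in water.
    return obj.alive and not obj.driving_vehicle and not obj.in_water

def meets_second_firing_condition(gun: VirtualFirearm) -> bool:
    # Second firing condition (set for the virtual firearm): the firearm must
    # have remaining bullets and must not be mid-reload.
    return gun.remaining_bullets > 0 and not gun.reloading

def firing_condition_satisfied(obj: VirtualObject, gun: VirtualFirearm) -> bool:
    # Check the virtual object first; only if it passes, check the firearm.
    return meets_first_firing_condition(obj) and meets_second_firing_condition(gun)
```

As described above, the two checks could equally be run simultaneously; the sequential form mirrors the order given in the text.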
  • Step 304: If the firing condition is met, control the virtual firearm to shoot while the virtual firearm is in the scope-on state.
  • If the current state meets the firing condition, then while keeping the virtual firearm in the scope-on state, the virtual firearm is simultaneously controlled to shoot.
  • In this way, the user can trigger both the scope-on operation and the firing operation by tapping a single button (that is, the first fire button described above) with one finger.
  • In addition, if the current state does not meet the firing condition, the process can end, and the mobile terminal does not control the virtual firearm to perform the firing operation.
  • It should be noted that the related art provides a separate scope button and fire button, and some users can open the scope and fire quickly through multi-finger operations, for example, tapping the scope button with one finger and the fire button with another.
  • However, the scope button and the fire button are usually arranged on the same side of the user interface, for example, both on the right side, while a joystick control for moving the virtual object is arranged on the left side; the user therefore has to tap the scope button and the fire button with two fingers of the same hand, which is relatively difficult to operate.
  • Even if the scope button and the fire button are arranged on opposite sides of the interface, a multi-finger operation is still required: one finger of the left hand controls movement, one finger of the right hand taps the scope button, and another finger of the left hand taps the fire button.
  • In the technical solution provided by the embodiments of this application, the user can trigger both the scope-on and firing operations by tapping one button with one finger, which greatly lowers the user's operation threshold and improves operation efficiency.
  • To some extent, this also removes the advantage that multi-finger players (such as three-finger and four-finger players) have over two-finger players, so that two-finger players can also get a better gaming experience.
  • In addition, if the virtual firearm supports continuous shooting, the virtual firearm is controlled to perform a firing operation at every preset time interval within the duration of the trigger signal.
  • The preset time interval (that is, the firing interval) may be set in advance, and the intervals corresponding to different virtual firearms may be the same or different.
  • If the virtual firearm does not support continuous shooting, the virtual firearm is controlled to perform a single firing operation within the duration of the trigger signal.
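The firing-rate behavior described above can be sketched as a small helper that counts how many firing operations occur while the trigger signal lasts. This is a simplified model under the assumption that a continuous-fire weapon fires once immediately and then once per elapsed interval; the function name and parameters are illustrative.

```python
def shots_fired(duration_ms: int, fire_interval_ms: int, continuous: bool) -> int:
    """Number of firing operations performed while the trigger signal lasts.

    A continuous-fire weapon fires once immediately, then once each time the
    preset firing interval elapses; a single-shot weapon fires exactly once
    per trigger signal, regardless of how long the signal lasts.
    """
    if not continuous:
        return 1
    return 1 + duration_ms // fire_interval_ms
```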
  • In an exemplary embodiment, when the virtual firearm is in the scope-on state, the user can also move the finger to adjust the touch position corresponding to the trigger signal.
  • the mobile terminal obtains the touch position corresponding to the trigger signal, and adjusts the shooting direction of the virtual firearm according to the touch position corresponding to the trigger signal.
  • Optionally, the mobile terminal determines the relative position relationship between the touch position corresponding to the trigger signal and the center point of the first fire button, and determines the shooting direction of the virtual firearm based on that relative position relationship.
  • For example, the up, down, left, and right directions in the relative position relationship correspond respectively to north, south, west, and east in the virtual environment where the virtual object is located.
  • After determining the shooting direction of the virtual firearm, the mobile terminal can first control the virtual object to face the shooting direction, display in the user interface the virtual environment within the frontal field of view of the virtual object, and then control the virtual firearm to shoot in that direction. That is, through the foregoing manner, the three operations of opening the scope, aiming, and firing can be completed with one button, which greatly improves operation efficiency.
  • If the solution provided by the related art is adopted and the user wants to quickly complete the three operations of opening the scope, aiming, and firing through multi-finger operations, the user needs one finger of the left hand to control the virtual object's movement, one finger of the right hand to tap the scope button, another finger of the right hand to rotate the viewing angle to aim at the enemy virtual object, and another finger of the left hand to tap the fire button, which places high demands on the user's operation.
  • In the technical solution provided by the embodiments of this application, a joystick function is configured for the first fire button. After tapping the first fire button, the user can adjust the shooting direction of the virtual firearm by moving the finger without lifting it from the screen, so that the three operations of opening the scope, aiming, and firing can be completed with one finger, which greatly reduces the user's operation difficulty.
  • In summary, in the technical solution provided by the embodiments of this application, when a trigger signal corresponding to the first fire button is received, the virtual firearm held by the virtual object is controlled to enter the scope-on state, and while the virtual firearm is in the scope-on state, the virtual firearm is simultaneously controlled to shoot. This realizes the ability to open the scope and fire with one tap: only one operation is required to open the scope and fire, instead of two, which greatly improves operation efficiency.
  • In addition, within the duration of the trigger signal, the shooting direction of the virtual firearm can be adjusted adaptively according to changes in the touch position corresponding to the trigger signal, so that the three operations of opening the scope, aiming, and firing are completed with one button, further improving operation efficiency.
  • In an example, with reference to FIG. 6, the virtual object control method provided by the embodiments of this application may include the following steps:
  • Step 601: Determine whether the virtual weapon held by the virtual object supports scoped fire; if yes, execute step 602; if not, execute step 604.
  • Step 602: Determine whether the setting item for the virtual weapon is scoped fire; if yes, execute step 603; if not, execute step 604.
  • Step 603: Determine that the firing mode is scoped fire.
  • Step 604: Determine that the firing mode is hip fire.
  • Step 605: Determine whether a trigger signal corresponding to the first fire button is received; if yes, execute step 606; if not, execute step 605 again.
  • Step 606: Control the virtual weapon to enter the scope-on state.
  • Step 607: Determine whether the virtual object meets the first firing condition; if yes, execute step 608; if not, end the process.
  • Step 608: Determine whether the virtual weapon meets the second firing condition; if yes, execute step 609; if not, execute step 607.
  • Step 609: Control the virtual weapon to perform the first firing operation.
  • Step 610: Determine whether the interval since the last firing operation has reached the preset duration; if yes, execute step 611; if not, continue to execute step 610.
  • Step 611: Determine whether the trigger signal has disappeared; if not, execute step 612 and then execute step 610 again; if yes, execute step 613.
  • Step 612: Control the virtual weapon to perform one firing operation.
  • Step 613: Control the virtual weapon to stop performing firing operations.
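The firing-mode decision in steps 601-604 can be sketched as a small function. This is an illustrative reduction of the flowchart, not code from the patent; the names are assumptions.

```python
def select_firing_mode(weapon_supports_scoped_fire: bool,
                       setting_is_scoped_fire: bool) -> str:
    # Steps 601-604: scoped fire is chosen only when the weapon supports it
    # AND the setting item for its firearm category selects it; in every
    # other case the firing mode falls back to hip fire.
    if weapon_supports_scoped_fire and setting_is_scoped_fire:
        return "scoped_fire"
    return "hip_fire"
```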
  • In an exemplary embodiment, in addition to the first fire button, the user interface also includes a second fire button.
  • The second fire button is an operation control used to trigger hip fire.
  • When a trigger signal corresponding to the second fire button is received, the mobile terminal controls the virtual firearm held by the virtual object to perform hip fire.
  • the user interface 40 includes a first fire button 41 and a second fire button 42.
  • the second fire button 42 may be arranged beside the first fire button 41, for example, on the left side of the first fire button 41.
  • the size of the second fire button 42 may be appropriately smaller than the size of the first fire button 41.
  • When the mobile terminal receives a touch operation signal, it can obtain the touch position of the touch operation signal and then obtain the distance between that touch position and the center position of the second fire button. If the distance is less than a second threshold, it is determined that a trigger signal corresponding to the second fire button is received.
  • When the second fire button is circular, the second threshold may be the same as the radius of the second fire button, or different from it, for example, slightly larger or slightly smaller than the radius.
  • When it is detected that the distance between the touch position of the touch operation signal and the center position of the second fire button is less than the second threshold, it is considered that the user's finger has tapped the second fire button, that is, it is determined that a trigger signal corresponding to the second fire button is received.
  • Similarly, the coordinates of the center position of the second fire button and the second threshold may also be configured in the configuration file, which is not limited in the embodiments of this application.
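The distance-based hit test described above can be sketched as follows. This is a minimal illustration; the function name and parameters are assumptions, and the same check applies to either fire button with its own center and threshold.

```python
import math

def hits_button(touch_pos, button_center, threshold):
    # A trigger signal for a button is registered when the Euclidean distance
    # between the touch position and the button's center is below the
    # threshold (for a circular button, typically close to its radius).
    dx = touch_pos[0] - button_center[0]
    dy = touch_pos[1] - button_center[1]
    return math.hypot(dx, dy) < threshold
```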
  • In the embodiments of this application, the first fire button and the second fire button are provided in the user interface at the same time, giving the user a more diversified choice of firing modes: tapping the first fire button achieves scoped fire, and tapping the second fire button achieves hip fire, which better meets the user's operation needs.
  • the method for controlling virtual objects may include the following steps:
  • Step 801: Determine whether the current mode is configured to display dual fire buttons; if yes, execute step 802; if not, end the process.
  • Step 802: Display the first fire button and the second fire button in the user interface.
  • Step 803: Determine whether a trigger signal corresponding to one of the fire buttons is received; if yes, execute step 804; if not, execute step 803 again.
  • Step 804: Determine whether the first fire button has been tapped; if yes, execute step 805 and then step 807; if not, execute step 806.
  • Step 805: Trigger the first fire button.
  • Step 806: Trigger the second fire button.
  • Step 807: Control the virtual weapon held by the virtual object to enter the scope-on state.
  • Step 808: Determine whether the firing condition is met; if yes, execute step 809; if not, end the process.
  • Step 809: Control the virtual weapon to perform firing operations according to the selected firing mode.
  • Step 810: When it is detected that the trigger signal has disappeared, control the virtual weapon to stop performing firing operations.
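The dual-button dispatch in the flow above can be sketched as a small handler: the first fire button opens the scope before firing, while the second fires directly from the hip. The action strings and function name are illustrative assumptions.

```python
def handle_trigger(button: str, firing_condition_met: bool) -> list:
    # Steps 804-810: tapping the first fire button enters the scope-on state
    # before firing; tapping the second fire button skips the scope and fires
    # from the hip. Firing only happens if the firing condition is met.
    actions = []
    if button == "first":
        actions.append("enter_scope_state")
    if firing_condition_met:
        actions.append("fire")
    return actions
```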
  • FIG. 9 shows a block diagram of a virtual object control device provided by an embodiment of the present application.
  • the device has the function of realizing the foregoing method example, and the function can be realized by hardware, or by hardware executing corresponding software.
  • the device can be a mobile terminal, or can be set in the mobile terminal.
  • The device 900 may include: an interface display module 901, a scope control module 902, a condition detection module 903, and a shooting control module 904.
  • The interface display module 901 is configured to display a user interface, where the user interface includes a first fire button, and the first fire button is an operation control used to trigger scope-on and firing.
  • The scope control module 902 is configured to control the virtual firearm held by the virtual object to enter the scope-on state when a trigger signal corresponding to the first fire button is received, where the scope-on state refers to a state of observing the virtual environment through the virtual sight equipped on the virtual firearm.
  • The condition detection module 903 is configured to detect whether a firing condition is met.
  • The shooting control module 904 is configured to, if the firing condition is met, control the virtual firearm to shoot while the virtual firearm is in the scope-on state.
  • In an exemplary embodiment, the condition detection module 903 is configured to: detect whether the virtual object meets a first firing condition; if the virtual object meets the first firing condition, detect whether the virtual firearm meets a second firing condition; and if the virtual firearm meets the second firing condition, determine that the firing condition is met.
  • the shooting control module 904 is configured to control the virtual firearm to perform a shooting operation every preset time interval within the duration of the trigger signal.
  • the device 900 further includes: a first position acquisition module 905, a distance acquisition module 906, and a button trigger module 907.
  • the first position obtaining module 905 is configured to obtain the touch position of the touch operation signal when the touch operation signal is received.
  • the distance obtaining module 906 is configured to obtain the distance between the touch position of the touch operation signal and the center position of the first fire button.
  • the button trigger module 907 is configured to determine that a trigger signal corresponding to the first fire button is received if the distance is less than a first threshold.
  • the device 900 further includes: a second position acquisition module 908 and a direction adjustment module 909.
  • the second position acquiring module 908 is configured to acquire the touch position corresponding to the trigger signal within the duration of the trigger signal.
  • the direction adjustment module 909 is configured to adjust the shooting direction of the virtual firearm according to the touch position corresponding to the trigger signal.
  • In an exemplary embodiment, the user interface further includes a second fire button, and the second fire button is an operation control for triggering hip fire.
  • Correspondingly, the shooting control module 904 is further configured to control the virtual firearm held by the virtual object to perform hip fire when a trigger signal corresponding to the second fire button is received.
  • the device further includes: a fire detection module 910.
  • The fire detection module 910 is configured to detect whether the virtual firearm held by the virtual object supports scoped fire.
  • The scope control module 902 is further configured to: if the virtual firearm supports scoped fire, when a trigger signal corresponding to the first fire button is received, execute the step of controlling the virtual firearm held by the virtual object to enter the scope-on state.
  • In an exemplary embodiment, the fire detection module 910 is configured to: acquire the firearm category to which the virtual firearm belongs; acquire the setting item corresponding to the firearm category, where the setting item is used to set the firing mode, and the firing mode includes scoped fire and hip fire; and if the setting item corresponding to the firearm category is scoped fire, determine that the virtual firearm supports scoped fire.
  • In summary, in the technical solution provided by the embodiments of this application, when a trigger signal corresponding to the first fire button is received, the virtual firearm held by the virtual object is controlled to enter the scope-on state, and while the virtual firearm is in the scope-on state, the virtual firearm is simultaneously controlled to shoot. This realizes the ability to open the scope and fire with one tap: only one operation is required instead of two, which greatly improves operation efficiency.
  • It should be noted that, when the device provided in the foregoing embodiments implements its functions, the division into the foregoing functional modules is merely used as an example for description. In practical applications, the foregoing functions can be allocated to different functional modules as needed; that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
  • In addition, the apparatus embodiments and the method embodiments provided in the foregoing embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not repeated here.
  • FIG. 11 shows a structural block diagram of a mobile terminal 1100 according to an embodiment of the present application.
  • the mobile terminal 1100 may be a portable electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, and a wearable device.
  • the mobile terminal is used to implement the virtual object control method provided in the foregoing embodiment.
  • the mobile terminal may be the mobile terminal 10 in the implementation environment shown in FIG. 1. Specifically:
  • the mobile terminal 1100 includes a processor 1101 and a memory 1102.
  • the processor 1101 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • The processor 1101 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1101 may also include a main processor and a coprocessor.
  • The main processor is a processor for processing data in the wake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state.
  • The processor 1101 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen.
  • In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1102 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1102 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • The non-transitory computer-readable storage medium in the memory 1102 is used to store at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by one or more processors to implement the foregoing virtual object control method.
  • the mobile terminal 1100 may optionally further include: a peripheral device interface 1103 and at least one peripheral device.
  • the processor 1101, the memory 1102, and the peripheral device interface 1103 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1103 through a bus, a signal line, or a circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1104, a touch display screen 1105, a camera 1106, an audio circuit 1107, a positioning component 1108, and a power supply 1109.
  • The structure shown in FIG. 11 does not constitute a limitation on the mobile terminal 1100, which may include more or fewer components than shown in the figure, combine some components, or adopt a different component arrangement.
  • In an exemplary embodiment, a computer-readable storage medium is further provided. The storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is executed by a processor to implement the foregoing virtual object control method.
  • the computer-readable storage medium may include: Read Only Memory (ROM), Random Access Memory (RAM), Solid State Drives (SSD, Solid State Drives), or optical discs.
  • random access memory may include resistive random access memory (ReRAM, Resistance Random Access Memory) and dynamic random access memory (DRAM, Dynamic Random Access Memory).
  • a computer program product is also provided, which is used to implement the above-mentioned control method for virtual objects when the computer program product is executed by a processor.
  • the "plurality” mentioned herein refers to two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
  • the character "/” generally indicates that the associated objects are in an "or” relationship.
  • In addition, the step numbers described herein merely show, by way of example, one possible execution order among the steps. In some other embodiments, the steps may also be executed out of numerical order, for example, two steps with different numbers may be executed at the same time, or in an order opposite to that shown in the figure, which is not limited in the embodiments of this application.


Abstract

A virtual object control method and apparatus, a terminal, and a storage medium. The method includes: displaying a user interface that includes a first fire button, the first fire button being an operation control used to trigger scope-on and firing (301); when a trigger signal corresponding to the first fire button is received, controlling a virtual firearm held by a virtual object to enter a scope-on state (302); detecting whether a firing condition is met (303); and if the firing condition is met, controlling the virtual firearm to shoot while the virtual firearm is in the scope-on state (304). The method realizes the ability to open the scope and fire with one tap: only one operation is required instead of two, which greatly improves operation efficiency.

Description

Virtual object control method and apparatus, terminal, and storage medium
This application claims priority to Chinese Patent Application No. 201910707805.2, filed on August 1, 2019 and entitled "Virtual object control method and apparatus, terminal, and storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the field of computer and Internet technologies, and in particular to a virtual object control method and apparatus, a terminal, and a storage medium.
Background
At present, in some mobile shooting games, a player can control a virtual firearm held by a virtual object to shoot in the game scene provided by a game match, and win the match by killing enemy virtual objects.
The operation manner of controlling a virtual object to shoot provided by the related art is relatively complex and inefficient.
Summary
The embodiments of this application provide a virtual object control method and apparatus, a terminal, and a storage medium. The technical solutions are as follows:
In one aspect, the embodiments of this application provide a virtual object control method, applied to a mobile terminal, the method including:
displaying a user interface, where the user interface includes a first fire button, and the first fire button is an operation control used to trigger scope-on and firing;
when a trigger signal corresponding to the first fire button is received, controlling a virtual firearm held by a virtual object to enter a scope-on state, where the scope-on state refers to a state of observing a virtual environment through a virtual sight equipped on the virtual firearm;
detecting whether a firing condition is met; and
if the firing condition is met, controlling the virtual firearm to shoot while the virtual firearm is in the scope-on state.
In another aspect, the embodiments of this application provide a virtual object control apparatus, the apparatus including:
an interface display module, configured to display a user interface, where the user interface includes a first fire button, and the first fire button is an operation control used to trigger scope-on and firing;
a scope control module, configured to control a virtual firearm held by a virtual object to enter a scope-on state when a trigger signal corresponding to the first fire button is received, where the scope-on state refers to a state of observing a virtual environment through a virtual sight equipped on the virtual firearm;
a condition detection module, configured to detect whether a firing condition is met; and
a shooting control module, configured to, if the firing condition is met, control the virtual firearm to shoot while the virtual firearm is in the scope-on state.
In still another aspect, the embodiments of this application provide a mobile terminal, the mobile terminal including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the foregoing virtual object control method.
In yet another aspect, the embodiments of this application provide a computer-readable storage medium, the storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the foregoing virtual object control method.
In a further aspect, the embodiments of this application provide a computer program product that, when run on a mobile terminal, causes the mobile terminal to execute the foregoing virtual object control method.
In the technical solutions provided by the embodiments of this application, a first fire button for triggering scope-on and firing is designed in the user interface. When a trigger signal corresponding to the first fire button is received, the virtual firearm held by the virtual object is controlled to enter the scope-on state, and while the virtual firearm is in the scope-on state, the virtual firearm is simultaneously controlled to shoot. This realizes the ability to open the scope and fire with one tap: only one operation is required to open the scope and fire, instead of two, which greatly improves operation efficiency.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. Obviously, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of this application;
FIG. 2 is a schematic structural diagram of a mobile terminal provided by an embodiment of this application;
FIG. 3 is a flowchart of a virtual object control method provided by an embodiment of this application;
FIG. 4 exemplarily shows a schematic diagram of a first fire button;
FIG. 5 exemplarily shows a schematic diagram of a settings interface;
FIG. 6 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 7 exemplarily shows a schematic diagram of a second fire button;
FIG. 8 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 9 is a block diagram of a virtual object control apparatus provided by an embodiment of this application;
FIG. 10 is a block diagram of a virtual object control apparatus provided by another embodiment of this application;
FIG. 11 is a structural block diagram of a mobile terminal provided by an embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the implementations of this application are further described in detail below with reference to the accompanying drawings.
Before the embodiments of this application are introduced, the relevant terms involved in this application are first explained.
1. Virtual object
A virtual object refers to a virtual character controlled by a user account in an application. Taking a game application as an example, a virtual object refers to a game character controlled by a user account in the game application. A virtual object may be in human form, or in animal, cartoon, or other forms, which is not limited in the embodiments of this application. A virtual object may be presented in three-dimensional or two-dimensional form, which is not limited in the embodiments of this application either.
In different game applications, the operations that a user account can control a virtual object to perform may also differ. For example, in a shooting game application, the user account can control the virtual object to shoot, run, jump, pick up firearms, switch firearms, reload firearms, and so on.
Of course, besides game applications, other types of applications can also present virtual objects to users and provide them with corresponding functions, for example, AR (Augmented Reality) applications, social applications, and interactive entertainment applications, which is not limited in the embodiments of this application. In addition, the form and corresponding functions of the virtual objects provided by different applications also differ and can be preconfigured according to actual requirements, which is not limited in the embodiments of this application.
2. Virtual firearm
A virtual firearm refers to a virtual item that can simulate a real firearm for shooting. A virtual firearm may be a three-dimensional model of a real firearm; a virtual object can carry a virtual firearm and control it to shoot at a target.
Virtual firearms may include multiple different firearm categories, such as rifles, submachine guns, machine guns, shotguns, and pistols. Firearm categories may be divided according to actual requirements; for example, rifles may be further subdivided into assault rifles and sniper rifles, and machine guns into light machine guns and heavy machine guns.
3. Virtual sight
A virtual sight refers to a virtual item equipped on a virtual firearm for assisting observation of the virtual environment. A virtual sight may be a three-dimensional model of a real sight. After a virtual firearm is equipped with a virtual sight, it can enter a scope-on state, which refers to a state of observing the virtual environment through the virtual sight.
Virtual sights may include magnifying scopes of different magnifications (such as 2x, 3x, 4x, 6x, and 8x scopes), laser sights, red dot sights, holographic sights, and the like.
Please refer to FIG. 1, which shows a schematic diagram of an implementation environment provided by an embodiment of this application. The implementation environment may include a mobile terminal 10 and a server 20.
The mobile terminal 10 may be a portable electronic device such as a mobile phone, tablet computer, game console, e-book reader, multimedia playback device, or wearable device. A client of a game application, such as a shooting game application, may be installed in the mobile terminal 10.
The server 20 is used to provide background services for the client of the application (such as a game application) in the mobile terminal 10. For example, the server 20 may be the background server of the foregoing application (such as a game application). The server 20 may be one server, a server cluster composed of multiple servers, or a cloud computing service center.
The mobile terminal 10 and the server 20 can communicate with each other through a network 30. The network 30 may be a wired or wireless network.
In the method embodiments of this application, each step may be executed by a mobile terminal. Please refer to FIG. 2, which shows a schematic structural diagram of a mobile terminal provided by an embodiment of this application. The mobile terminal 10 may include a mainboard 110, an external output/input device 120, a memory 130, an external interface 140, a touch system 150, and a power supply 160.
Processing elements such as a processor and a controller are integrated in the mainboard 110.
The external output/input device 120 may include a display component (such as a display screen), a sound playback component (such as a speaker), a sound collection component (such as a microphone), and various keys.
The memory 130 stores program code and data.
The external interface 140 may include an earphone interface, a charging interface, a data interface, and the like.
The touch system 150 may be integrated in the display component or the keys of the external output/input device 120, and is used to detect touch operations performed by the user on the display component or the keys.
The power supply 160 is used to supply power to the other components in the mobile terminal 10.
In the embodiments of this application, the processor in the mainboard 110 can generate a user interface (such as a game interface) by executing or calling the program code and data stored in the memory, and present the generated user interface (such as a game interface) through the external output/input device 120. During presentation of the user interface, touch operations performed when the user interacts with the user interface can be detected through the touch system 150 and responded to.
In the related art, a scope button and a fire button are provided in the user interface of a shooting game. After tapping the scope button, the user can control the virtual firearm held by the virtual object to enter the scope-on state, in which the player can observe the virtual environment through the virtual sight equipped on the virtual firearm. After tapping the fire button, the user can control the virtual firearm held by the virtual object to shoot.
In practical applications, players need to shoot in the scope-on state so as to aim at targets more accurately. However, in the related art, the user must first tap the scope button and then tap the fire button to shoot in the scope-on state, which is complex and inefficient.
In the technical solution provided by the embodiments of this application, a first fire button for triggering scope-on and firing is designed in the user interface. When a trigger signal corresponding to the first fire button is received, the virtual firearm held by the virtual object is controlled to enter the scope-on state, and while the virtual firearm is in the scope-on state, the virtual firearm is simultaneously controlled to shoot. This realizes the ability to open the scope and fire with one tap: only one operation is required instead of two, which greatly improves operation efficiency.
Please refer to FIG. 3, which shows a flowchart of a virtual object control method provided by an embodiment of this application. The method can be applied to the mobile terminal introduced above, for example, to the client of an application (such as a shooting game application) of the mobile terminal. The method may include the following steps:
Step 301: Display a user interface, where the user interface includes a first fire button, and the first fire button is an operation control used to trigger scope-on and firing.
Taking a shooting game application as an example, the user interface may be the display interface of a game match, used to present the virtual environment of the match to the user; for example, the user interface may include elements in the virtual environment such as virtual buildings, virtual props, and virtual objects. Optionally, the user interface also includes some operation controls, such as buttons, sliders, and icons, for the user to operate.
In the embodiments of this application, as shown in FIG. 4, the user interface 40 includes a first fire button 41, which is an operation control used to trigger scope-on and firing. That is, the scope-on function and the firing function are integrated into the same button rather than implemented through two different buttons, thereby realizing the ability to open the scope and fire with one tap.
Step 302: When a trigger signal corresponding to the first fire button is received, control the virtual firearm held by the virtual object to enter the scope-on state.
The scope-on state refers to a state of observing the virtual environment through the virtual sight equipped on the virtual firearm. For example, in the scope-on state, the user interface may display the virtual sight equipped on the virtual firearm and the virtual environment seen through it.
In an exemplary embodiment, the mobile terminal determines whether a trigger signal corresponding to the first fire button is received in the following way: when a touch operation signal is received, the touch position of the touch operation signal is obtained, and then the distance between the touch position and the center position of the first fire button is obtained; if the distance is less than a first threshold, it is determined that a trigger signal corresponding to the first fire button is received.
When the first fire button is circular, the first threshold may be the same as the radius of the first fire button, or different from it, for example, slightly larger or slightly smaller than the radius.
When it is detected that the distance between the touch position of the touch operation signal and the center position of the first fire button is less than the first threshold, it is considered that the user's finger has tapped the first fire button, that is, it is determined that a trigger signal corresponding to the first fire button is received.
Exemplarily, the coordinates of the center position of the first fire button and the first threshold are configured in a configuration file. When a touch operation signal is received, its touch position is obtained and compared with the center position of the first fire button configured in the configuration file to calculate the distance between the two; the distance is then compared with the first threshold to determine whether a trigger signal corresponding to the first fire button is received. The configuration file may be in JSON (JavaScript Object Notation) format. By configuring the center coordinates of the first fire button and the first threshold in the configuration file, a technician can modify the first fire button simply by changing the configuration file, without modifying the code of the entire application, which brings greater convenience to project version iteration.
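The configuration-file approach described above might look roughly as follows. The JSON key names and values are hypothetical, chosen only for illustration; the patent does not specify the file's schema.

```python
import json

# Hypothetical JSON configuration for the first fire button; the key names
# and values are illustrative assumptions, not taken from the patent.
CONFIG_TEXT = '''
{
  "first_fire_button": {"center": [880, 540], "trigger_threshold": 48}
}
'''

config = json.loads(CONFIG_TEXT)
button_center = tuple(config["first_fire_button"]["center"])
first_threshold = config["first_fire_button"]["trigger_threshold"]
```

Keeping the center coordinates and threshold in such a file lets a technician retune the button without touching application code, as the text notes.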
In addition, the mobile terminal can also detect whether the virtual firearm held by the virtual object supports scoped fire. If the virtual firearm supports scoped fire, then when a trigger signal corresponding to the first fire button is received, the foregoing step of controlling the virtual firearm held by the virtual object to enter the scope-on state is executed; otherwise, if the virtual firearm does not support scoped fire, then when a trigger signal corresponding to the first fire button is received, the virtual firearm can be controlled to perform hip fire. Hip fire refers to a firing mode in which the shooting operation is performed directly, without performing the scope-on operation.
Optionally, the mobile terminal detects whether the virtual firearm held by the virtual object supports scoped fire through the following steps:
1. Acquire the firearm category to which the virtual firearm belongs;
2. Acquire the setting item corresponding to the firearm category, where the setting item is used to set the firing mode, and the firing mode includes scoped fire and hip fire;
3. If the setting item corresponding to the firearm category is scoped fire, determine that the virtual firearm supports scoped fire.
In the embodiments of this application, virtual firearms may include multiple different firearm categories, such as rifles, submachine guns, machine guns, shotguns, and pistols. For each firearm category, the user can set a corresponding firing mode according to actual needs. For example, for rifles the user sets the firing mode to scoped fire, and for submachine guns to hip fire.
Exemplarily, as shown in FIG. 5, the firing mode setting area 51 of the settings interface 50 provides multiple different firearm categories, such as assault rifles, submachine guns, shotguns, and sniper rifles. For each firearm category, two setting items are provided: scoped fire and hip fire. The user can flexibly set the firing mode corresponding to each firearm category according to actual needs. For example, as shown in FIG. 5, the firing mode for assault rifles is set to scoped fire, for submachine guns to scoped fire, for shotguns to hip fire, and so on.
Of course, the user can also set the firing mode of all firearm categories with one tap, for example, set the firing mode of all categories to scoped fire, or set the firing mode of all categories to hip fire.
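The per-category settings and the one-tap "apply to all" option described above can be sketched as a simple mapping. The category names and default values are illustrative assumptions, not the actual data structure of the application.

```python
# Hypothetical per-category firing-mode settings mirroring the settings
# screen described above; category names and defaults are illustrative.
FIRE_MODE_SETTINGS = {
    "assault_rifle": "scoped_fire",
    "submachine_gun": "scoped_fire",
    "shotgun": "hip_fire",
    "sniper_rifle": "scoped_fire",
}

def set_all_categories(mode: str) -> dict:
    # One-tap setting: apply the same firing mode to every firearm category.
    return {category: mode for category in FIRE_MODE_SETTINGS}
```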
Step 303: Detect whether a firing condition is met.
After receiving the trigger signal corresponding to the first fire button, the mobile terminal also detects whether the current state meets the firing condition. The firing condition refers to a preset condition that permits the virtual firearm held by the virtual object to fire.
Optionally, the firing condition may include a first firing condition set for the virtual object and a second firing condition set for the virtual firearm. The mobile terminal can detect whether the virtual object meets the first firing condition; if the virtual object meets the first firing condition, detect whether the virtual firearm meets the second firing condition; and if the virtual firearm meets the second firing condition, determine that the firing condition is met. In addition, if the virtual object does not meet the first firing condition, or the virtual firearm does not meet the second firing condition, the mobile terminal determines that the firing condition is not met. Of course, the two steps of detecting whether the virtual object meets the first firing condition and detecting whether the virtual firearm meets the second firing condition may be executed sequentially or simultaneously, which is not limited in the embodiments of this application.
Optionally, detecting whether the virtual object meets the first firing condition includes: acquiring state information of the virtual object, and detecting, according to the state information, whether the virtual object meets the first firing condition. The first firing condition includes, but is not limited to, at least one of the following: the virtual object is alive, the virtual object is not driving a vehicle, and the virtual object is not in water. If the virtual object does not meet the first firing condition, for example, the virtual object has died or is driving a vehicle, the virtual object cannot control the virtual firearm to fire. The first firing condition may be set in advance, which is not limited in the embodiments of this application.
Optionally, detecting whether the virtual firearm meets the second firing condition includes: acquiring state information of the virtual firearm, and detecting, according to the state information, whether the virtual firearm meets the second firing condition. The second firing condition includes, but is not limited to, at least one of the following: the virtual firearm has remaining bullets, and the virtual firearm is not reloading. If the virtual firearm does not meet the second firing condition, for example, it has no remaining bullets or is reloading, the virtual object cannot control the virtual firearm to fire. The second firing condition may be set in advance, which is not limited in the embodiments of this application.
Step 304: If the firing condition is met, control the virtual firearm to shoot while the virtual firearm is in the scope-on state.
If the current state meets the firing condition, then while keeping the virtual firearm in the scope-on state, the virtual firearm is simultaneously controlled to shoot. In this way, by tapping one button (that is, the first fire button described above) with one finger, the user can trigger both the scope-on and firing operations, realizing a "one-tap scoped fire" function. In addition, if the current state does not meet the firing condition, the process can end, and the mobile terminal does not control the virtual firearm to perform the firing operation.
It should be noted that the related art provides a scope button and a fire button, and some users can open the scope and fire quickly through multi-finger operations, for example, one finger taps the scope button and another taps the fire button. However, since the scope button and the fire button are usually arranged on the same side of the user interface, for example, both on the right side, while a joystick control for moving the virtual object is arranged on the left side, the user has to tap the scope button and the fire button with two fingers of the same hand, which is quite difficult to operate. Even if the scope button and the fire button are arranged on opposite sides of the interface, multi-finger operation is still required: for example, one finger of the left hand controls movement, one finger of the right hand taps the scope button, and another finger of the left hand taps the fire button. With the technical solution provided by the embodiments of this application, the user can trigger both the scope-on and firing operations by tapping one button with one finger, which greatly lowers the operation threshold and improves operation efficiency. To some extent, it also removes the advantage that multi-finger players (such as three-finger and four-finger players) have over two-finger players, so that two-finger players can also get a better gaming experience.
In addition, if the virtual firearm supports continuous shooting, the virtual firearm is controlled to perform a firing operation at every preset time interval within the duration of the trigger signal. The preset time interval (that is, the firing interval) may be set in advance; the intervals corresponding to different virtual firearms may be the same or different. If the virtual firearm does not support continuous shooting, the virtual firearm is controlled to perform a single firing operation within the duration of the trigger signal.
In an exemplary embodiment, while the virtual firearm is in the scope-on state, the user can also move the finger to adjust the touch position corresponding to the trigger signal. Correspondingly, within the duration of the trigger signal, the mobile terminal obtains the touch position corresponding to the trigger signal and adjusts the shooting direction of the virtual firearm according to it. Optionally, the mobile terminal determines the relative position relationship between the touch position corresponding to the trigger signal and the center point of the first fire button, and determines the shooting direction of the virtual firearm based on this relative position relationship. For example, the up, down, left, and right directions in the relative position relationship correspond respectively to north, south, west, and east in the virtual environment where the virtual object is located. Thus, if the touch position corresponding to the trigger signal is at 45° to the upper right of the center point of the first fire button, the shooting direction of the virtual firearm is determined to be 45° east of north; if the touch position is at 30° to the lower left of the center point, the shooting direction is determined to be 30° west of south. After determining the shooting direction of the virtual firearm, the mobile terminal can first control the virtual object to face the shooting direction, display in the user interface the virtual environment within the frontal field of view of the virtual object, and then control the virtual firearm to shoot in that direction. That is, through the foregoing manner, the three operations of opening the scope, aiming, and firing are completed with one button, which greatly improves operation efficiency.
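The touch-offset-to-direction mapping described above can be sketched as a bearing computation. This is an illustrative sketch under the stated mapping (up/down/left/right to north/south/west/east); the function name and the screen-coordinate convention (y grows downward) are assumptions.

```python
import math

def shooting_bearing(touch_pos, button_center):
    """Map the offset between the touch position and the button center to a
    compass bearing in degrees (0 = north, 90 = east), following the mapping
    above: up/down/left/right correspond to north/south/west/east."""
    dx = touch_pos[0] - button_center[0]   # screen x grows to the right (east)
    dy = button_center[1] - touch_pos[1]   # screen y grows downward, so invert
    return math.degrees(math.atan2(dx, dy)) % 360.0
```

For a touch 45° to the upper right of the center this yields a bearing of 45° (45° east of north), and for a touch 30° to the lower left it yields 210° (30° west of south), matching the worked example in the text.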
If the solution provided by the related art is adopted and the user wants to quickly complete the three operations of opening the scope, aiming, and firing through multi-finger operations, the user needs one finger of the left hand to control the virtual object's movement, one finger of the right hand to tap the scope button, another finger of the right hand to rotate the viewing angle to aim at the enemy virtual object, and another finger of the left hand to tap the fire button, which places high demands on the user's operation.
In the technical solution provided by the embodiments of this application, a joystick function is configured for the first fire button. After tapping the first fire button, the user can adjust the shooting direction of the virtual firearm by moving the finger without lifting it from the screen, so that the three operations of opening the scope, aiming, and firing can be completed with one finger, which greatly reduces the user's operation difficulty.
In summary, in the technical solution provided by the embodiments of this application, a first fire button for triggering scope-on and firing is designed in the user interface. When a trigger signal corresponding to the first fire button is received, the virtual firearm held by the virtual object is controlled to enter the scope-on state, and while the virtual firearm is in the scope-on state, the virtual firearm is simultaneously controlled to shoot. This realizes the ability to open the scope and fire with one tap: only one operation is required instead of two, which greatly improves operation efficiency.
In addition, within the duration of the trigger signal corresponding to the first fire button, the shooting direction of the virtual firearm can be adaptively adjusted according to changes in the touch position corresponding to the trigger signal, so that the three operations of opening the scope, aiming, and firing are completed with one button, further improving operation efficiency.
In an example, with reference to FIG. 6, the virtual object control method provided by the embodiments of this application may include the following steps:
Step 601: Determine whether the virtual weapon held by the virtual object supports scoped fire; if yes, execute step 602; if not, execute step 604.
Step 602: Determine whether the setting item for the virtual weapon is scoped fire; if yes, execute step 603; if not, execute step 604.
Step 603: Determine that the firing mode is scoped fire.
Step 604: Determine that the firing mode is hip fire.
Step 605: Determine whether a trigger signal corresponding to the first fire button is received; if yes, execute step 606; if not, execute step 605 again.
Step 606: Control the virtual weapon to enter the scope-on state.
Step 607: Determine whether the virtual object meets the first firing condition; if yes, execute step 608; if not, end the process.
Step 608: Determine whether the virtual weapon meets the second firing condition; if yes, execute step 609; if not, execute step 607.
Step 609: Control the virtual weapon to perform the first firing operation.
Step 610: Determine whether the interval since the last firing operation has reached the preset duration; if yes, execute step 611; if not, continue to execute step 610.
Step 611: Determine whether the trigger signal has disappeared; if not, execute step 612 and then execute step 610 again; if yes, execute step 613.
Step 612: Control the virtual weapon to perform one firing operation.
Step 613: Control the virtual weapon to stop performing firing operations.
In an exemplary embodiment, in addition to the first fire button, the user interface also includes a second fire button. The second fire button is an operation control used to trigger hip fire. When a trigger signal corresponding to the second fire button is received, the mobile terminal controls the virtual firearm held by the virtual object to perform hip fire.
Exemplarily, as shown in FIG. 7, the user interface 40 includes a first fire button 41 and a second fire button 42. The second fire button 42 may be arranged beside the first fire button 41, for example, on its left side; moreover, the size of the second fire button 42 may be appropriately smaller than that of the first fire button 41.
When the mobile terminal receives a touch operation signal, it can obtain the touch position of the touch operation signal and then obtain the distance between that touch position and the center position of the second fire button; if the distance is less than a second threshold, it is determined that a trigger signal corresponding to the second fire button is received. When the second fire button is circular, the second threshold may be the same as the radius of the second fire button, or different from it, for example, slightly larger or slightly smaller than the radius. When it is detected that the distance between the touch position of the touch operation signal and the center position of the second fire button is less than the second threshold, it is considered that the user's finger has tapped the second fire button, that is, it is determined that a trigger signal corresponding to the second fire button is received. Similarly, the coordinates of the center position of the second fire button and the second threshold may also be configured in the configuration file, which is not limited in the embodiments of this application.
In the embodiments of this application, the first fire button and the second fire button are provided in the user interface at the same time, giving the user a more diversified choice of firing modes: tapping the first fire button achieves scoped fire, and tapping the second fire button achieves hip fire, which better meets the user's operation needs.
One-tap scoped fire is convenient for quick scoped combat, but in close-range combat the virtual sight displayed in the user interface blocks the user's line of sight, making it difficult to aim and fire. Arranging a second fire button beside the first fire button remedies this shortcoming well. For example, in close-range combat, the user can tap the second fire button to fire directly from the hip, avoiding the problems caused by opening the scope; in medium- and long-range combat, the user can tap the first fire button to perform scoped fire, giving the user a more convenient and free choice of firing.
In addition, whether only the first fire button is displayed in the user interface, or both the first fire button and the second fire button are displayed, can be set by the user according to actual needs, which is not limited in the embodiments of this application.
In an example, with reference to FIG. 8, the virtual object control method provided by the embodiments of this application may include the following steps:
Step 801: Determine whether the current mode is configured to display dual fire buttons; if yes, execute step 802; if not, end the process.
Step 802: Display the first fire button and the second fire button in the user interface.
Step 803: Determine whether a trigger signal corresponding to one of the fire buttons is received; if yes, execute step 804; if not, execute step 803 again.
Step 804: Determine whether the first fire button has been tapped; if yes, execute step 805 and then step 807; if not, execute step 806.
Step 805: Trigger the first fire button.
Step 806: Trigger the second fire button.
Step 807: Control the virtual weapon held by the virtual object to enter the scope-on state.
Step 808: Determine whether the firing condition is met; if yes, execute step 809; if not, end the process.
Step 809: Control the virtual weapon to perform firing operations according to the selected firing mode.
Step 810: When it is detected that the trigger signal has disappeared, control the virtual weapon to stop performing firing operations.
The following are apparatus embodiments of this application, which can be used to execute the method embodiments of this application. For details not disclosed in the apparatus embodiments, refer to the method embodiments of this application.
请参考图9,其示出了本申请一个实施例提供的虚拟对象的控制装置的框图。该装置具有实现上述方法示例的功能,所述功能可以由硬件实现,也可以由硬件执行相应的软件实现。该装置可以是移动终端,也可以设置在移动终端中。该装置900可以包括:界面显示模块901、开镜控制模块902、条件检测模块903和射击控制模块904。
界面显示模块901,用于显示用户界面,所述用户界面中包括第一开火按钮,所述第一开火按钮是用于触发开镜并开火的操作控件;
开镜控制模块902,用于当接收到对应于所述第一开火按钮的触发信号时,控制虚拟对象持有的虚拟枪械进入开镜状态;其中,所述开镜状态是指通过所述虚拟枪械配备的虚拟瞄准镜对虚拟环境进行观察的状态;
条件检测模块903,用于检测是否满足开火条件;
射击控制模块904,用于若满足所述开火条件,则在所述虚拟枪械处于所述开镜状态的情况下,控制所述虚拟枪械进行射击。
在示例性实施例中,所述条件检测模块903,用于:检测所述虚拟对象是否满足第一开火条件;若所述虚拟对象满足所述第一开火条件,则检测所述虚拟枪械是否满足第二开火条件;若所述虚拟枪械满足所述第二开火条件,则确定满足所述开火条件。
在示例性实施例中,所述射击控制模块904,用于在所述触发信号的持续时间内,控制所述虚拟枪械每隔预设时间间隔执行一次射击操作。
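The per-interval gating performed by the shooting control module can be sketched as a simple timestamp comparison; names and units (seconds) are assumed for this illustration:

```python
def should_fire(now, last_shot_time, preset_interval):
    """True when at least one preset interval has elapsed since the
    previous shooting operation, so automatic fire emits exactly one
    shot per interval while the trigger signal persists."""
    return (now - last_shot_time) >= preset_interval
```

In an engine loop, `now` would typically come from a monotonic clock, and `last_shot_time` would be updated to `now` each time a shot is emitted.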
In an exemplary embodiment, as shown in FIG. 10, the apparatus 900 further includes: a first position obtaining module 905, a distance obtaining module 906, and a button triggering module 907.
The first position obtaining module 905 is configured to, when a touch operation signal is received, obtain the touch position of the touch operation signal.
The distance obtaining module 906 is configured to obtain the distance between the touch position of the touch operation signal and the center position of the first fire button.
The button triggering module 907 is configured to, if the distance is less than a first threshold, determine that a trigger signal corresponding to the first fire button has been received.
In an exemplary embodiment, as shown in FIG. 10, the apparatus 900 further includes: a second position obtaining module 908 and a direction adjustment module 909.
The second position obtaining module 908 is configured to obtain, for the duration of the trigger signal, the touch position corresponding to the trigger signal.
The direction adjustment module 909 is configured to adjust the shooting direction of the virtual firearm according to the touch position corresponding to the trigger signal.
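One common way such a direction adjustment module maps finger movement to a shooting direction is a sensitivity-scaled delta applied to yaw and pitch. The sketch below is an assumption for illustration (the embodiment does not specify the mapping); the sensitivity factor, wrap, and clamp values are all assumed:

```python
def adjust_direction(yaw, pitch, touch_delta, sensitivity=0.1):
    """Map the drag of the pressed finger (touch_delta in pixels) to a
    new shooting direction, in degrees. Yaw wraps at 360; pitch is
    clamped to [-90, 90] so the view cannot flip over."""
    dx, dy = touch_delta
    new_yaw = (yaw + dx * sensitivity) % 360.0
    new_pitch = max(-90.0, min(90.0, pitch - dy * sensitivity))
    return new_yaw, new_pitch
```

Because the touch position is sampled for the whole duration of the trigger signal, the user can keep firing while dragging to track a moving target.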
In an exemplary embodiment, the user interface further includes a second fire button, the second fire button being an operation control used to trigger hip fire.
Correspondingly, the shooting control module 904 is further configured to, when a trigger signal corresponding to the second fire button is received, control the virtual firearm held by the virtual object to perform hip fire.
In an exemplary embodiment, as shown in FIG. 10, the apparatus further includes: a fire detection module 910.
The fire detection module 910 is configured to detect whether the virtual firearm held by the virtual object supports scoped firing.
The scope control module 902 is further configured to, if the virtual firearm supports scoped firing, perform the step of controlling the virtual firearm held by the virtual object to enter the scoped state when a trigger signal corresponding to the first fire button is received.
In an exemplary embodiment, the fire detection module 910 is configured to:
obtain the firearm category to which the virtual firearm belongs;
obtain the setting item corresponding to the firearm category, the setting item being used to set a firing mode, the firing mode including scoped firing and hip fire; and
if the setting item corresponding to the firearm category is scoped firing, determine that the virtual firearm supports scoped firing.
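The category-based lookup above can be sketched as a dictionary check; the mode labels and function name are assumed for illustration:

```python
def supports_ads_fire(weapon_category, fire_mode_settings):
    """fire_mode_settings maps a firearm category to its configured
    firing mode, 'ads_fire' or 'hip_fire' (labels assumed). The weapon
    supports scope-and-fire only when its category is set to 'ads_fire';
    an unconfigured category defaults to not supporting it."""
    return fire_mode_settings.get(weapon_category) == "ads_fire"
```

For example, sniper rifles and designated marksman rifles would typically be configured with the scoped-fire setting, while shotguns might be configured for hip fire only.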
In summary, in the technical solutions provided by the embodiments of this application, a first fire button used to trigger scoping and firing is designed in the user interface; when a trigger signal corresponding to the first fire button is received, the virtual firearm held by the virtual object is controlled to enter the scoped state, and while the virtual firearm is in the scoped state, the virtual firearm is controlled to shoot. This achieves one-tap scope-and-fire: scoping and firing are completed in a single operation rather than two, substantially improving operation efficiency.
It should be noted that, when the apparatus provided in the foregoing embodiments implements its functions, the division into the above functional modules is only an example. In practical applications, the above functions may be assigned to different functional modules as needed; that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus provided in the foregoing embodiments belongs to the same concept as the method embodiments; for the specific implementation process, refer to the method embodiments, and details are not repeated here.
Please refer to FIG. 11, which shows a structural block diagram of a mobile terminal 1100 provided in an embodiment of this application. The mobile terminal 1100 may be a portable electronic device such as a mobile phone, tablet computer, game console, e-book reader, multimedia playback device, or wearable device. The mobile terminal is configured to implement the virtual object control method provided in the foregoing embodiments, and may be the mobile terminal 10 in the implementation environment shown in FIG. 1. Specifically:
Generally, the mobile terminal 1100 includes a processor 1101 and a memory 1102.
The processor 1101 may include one or more processing cores, for example, a 4-core processor or an 8-core processor. The processor 1101 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit), the GPU being responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1102 may include one or more computer-readable storage media, which may be non-transitory. The memory 1102 may further include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1102 is used to store at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by one or more processors to implement the foregoing virtual object control method.
In some embodiments, the mobile terminal 1100 may optionally further include: a peripheral device interface 1103 and at least one peripheral device. The processor 1101, the memory 1102, and the peripheral device interface 1103 may be connected through a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1103 through a bus, signal line, or circuit board. Specifically, the peripheral devices include at least one of: a radio frequency circuit 1104, a touch display screen 1105, a camera 1106, an audio circuit 1107, a positioning component 1108, and a power supply 1109.
Persons skilled in the art will understand that the structure shown in FIG. 11 does not constitute a limitation on the mobile terminal 1100, which may include more or fewer components than illustrated, combine certain components, or adopt a different component arrangement.
In an exemplary embodiment, a computer-readable storage medium is further provided, the storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which, when executed by a processor, implements the foregoing virtual object control method.
Optionally, the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid-state drive (SSD), an optical disc, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM).
In an exemplary embodiment, a computer program product is further provided which, when executed by a processor, implements the foregoing virtual object control method.
It should be understood that "a plurality of" mentioned herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: A alone, both A and B, and B alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it. In addition, the step numbers described herein merely show, by way of example, one possible execution order of the steps; in some other embodiments, the steps may be performed out of numerical order, for example, two differently numbered steps may be performed simultaneously, or in an order opposite to that illustrated, which is not limited in the embodiments of this application.
The above are merely exemplary embodiments of this application and are not intended to limit this application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall fall within the protection scope of this application.

Claims (13)

  1. A virtual object control method, applied to a mobile terminal, the method comprising:
    displaying a user interface, the user interface comprising a first fire button, the first fire button being an operation control used to trigger scoping and firing;
    when a trigger signal corresponding to the first fire button is received, controlling a virtual firearm held by a virtual object to enter a scoped state, wherein the scoped state is a state in which a virtual environment is observed through a virtual scope equipped on the virtual firearm;
    detecting whether a firing condition is satisfied; and
    if the firing condition is satisfied, controlling the virtual firearm to shoot while the virtual firearm is in the scoped state.
  2. The method according to claim 1, wherein the detecting whether a firing condition is satisfied comprises:
    detecting whether the virtual object satisfies a first firing condition;
    if the virtual object satisfies the first firing condition, detecting whether the virtual firearm satisfies a second firing condition; and
    if the virtual firearm satisfies the second firing condition, determining that the firing condition is satisfied.
  3. The method according to claim 2, wherein the detecting whether the virtual object satisfies a first firing condition comprises:
    obtaining state information of the virtual object; and
    detecting, according to the state information of the virtual object, whether the virtual object satisfies the first firing condition, wherein the first firing condition comprises at least one of the following: the virtual object is alive, the virtual object is not driving a vehicle, and the virtual object is not in water.
  4. The method according to claim 2, wherein the detecting whether the virtual firearm satisfies a second firing condition comprises:
    obtaining state information of the virtual firearm; and
    detecting, according to the state information of the virtual firearm, whether the virtual firearm satisfies the second firing condition, wherein the second firing condition comprises at least one of the following: the virtual firearm has remaining ammunition, and the virtual firearm is not reloading.
  5. The method according to claim 1, wherein the controlling the virtual firearm to shoot comprises:
    controlling the virtual firearm to perform one shooting operation at each preset time interval for the duration of the trigger signal.
  6. The method according to claim 1, further comprising:
    obtaining, when a touch operation signal is received, a touch position of the touch operation signal;
    obtaining a distance between the touch position of the touch operation signal and a center position of the first fire button; and
    if the distance is less than a first threshold, determining that the trigger signal corresponding to the first fire button is received.
  7. The method according to claim 1, wherein after the controlling a virtual firearm held by a virtual object to enter a scoped state, the method further comprises:
    obtaining, for the duration of the trigger signal, a touch position corresponding to the trigger signal; and
    adjusting a shooting direction of the virtual firearm according to the touch position corresponding to the trigger signal.
  8. The method according to any one of claims 1 to 7, wherein the user interface further comprises a second fire button, the second fire button being an operation control used to trigger hip fire; and
    the method further comprises:
    when a trigger signal corresponding to the second fire button is received, controlling the virtual firearm held by the virtual object to perform hip fire.
  9. The method according to any one of claims 1 to 7, further comprising:
    detecting whether the virtual firearm held by the virtual object supports scoped firing; and
    if the virtual firearm supports scoped firing, performing, when the trigger signal corresponding to the first fire button is received, the step of controlling the virtual firearm held by the virtual object to enter the scoped state.
  10. The method according to claim 9, wherein the detecting whether the virtual firearm held by the virtual object supports scoped firing comprises:
    obtaining a firearm category to which the virtual firearm belongs;
    obtaining a setting item corresponding to the firearm category, the setting item being used to set a firing mode, the firing mode comprising the scoped firing and hip fire; and
    if the setting item corresponding to the firearm category is the scoped firing, determining that the virtual firearm supports the scoped firing.
  11. A virtual object control apparatus, the apparatus comprising:
    an interface display module, configured to display a user interface, the user interface comprising a first fire button, the first fire button being an operation control used to trigger scoping and firing;
    a scope control module, configured to, when a trigger signal corresponding to the first fire button is received, control a virtual firearm held by a virtual object to enter a scoped state, wherein the scoped state is a state in which a virtual environment is observed through a virtual scope equipped on the virtual firearm;
    a condition detection module, configured to detect whether a firing condition is satisfied; and
    a shooting control module, configured to, if the firing condition is satisfied, control the virtual firearm to shoot while the virtual firearm is in the scoped state.
  12. A mobile terminal, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the virtual object control method according to any one of claims 1 to 10.
  13. A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the virtual object control method according to any one of claims 1 to 10.

