WO2020207206A1 - Operation control method and apparatus, electronic device, and storage medium - Google Patents

Operation control method and apparatus, electronic device, and storage medium

Info

Publication number
WO2020207206A1
WO2020207206A1 (PCT/CN2020/079706; CN2020079706W)
Authority
WO
WIPO (PCT)
Prior art keywords
target
function
shooting
mode
display mode
Prior art date
Application number
PCT/CN2020/079706
Other languages
English (en)
French (fr)
Inventor
杨槿
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority to KR1020217017252A (KR102578242B1)
Priority to AU2020256524A (AU2020256524A1)
Priority to SG11202104911TA
Priority to CA3132897A (CA3132897A1)
Priority to BR112021019455A (BR112021019455A2)
Priority to JP2021531040A (JP7231737B2)
Publication of WO2020207206A1
Priority to US17/317,853 (US20210260479A1)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/218: Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/219: Input arrangements for video game devices characterised by their sensors, purposes or types for aiming at specific areas on the display, e.g. light-guns
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/22: Setup operations, e.g. calibration, key configuration or button assignment
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/422: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525: Changing parameters of virtual cameras
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076: Shooting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons

Definitions

  • This application relates to the field of computer technology, and in particular to an operation control method and apparatus, an electronic device, and a storage medium.
  • In the related art, the corresponding function is usually implemented according to the user's operation; for example, when an attack operation is received, the virtual object is controlled to attack.
  • In such schemes the operation process is cumbersome and complex, and the operation control efficiency is low; an operation control method is therefore urgently needed to solve these problems.
  • According to embodiments of this application, an operation control method and apparatus, an electronic device, and a storage medium are provided.
  • An operation control method is provided, executed by an electronic device, and the method includes:
  • displaying a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function; when a touch operation on the target button is detected, determining, according to the type of the virtual item controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function, and the target mode corresponding to the target function; and
  • executing, based on the target mode, the target function in the graphical user interface.
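  • The claimed flow can be pictured with a short sketch. The following TypeScript is illustrative only; the function names and the per-type rules are assumptions drawn from the embodiments described later, not from the claim itself:

```typescript
// Minimal sketch of the single-button dispatch: on a touch of the target
// button, the target function (display mode switching and/or shooting) and
// its target mode are chosen from the type of the virtual item that the
// current virtual object controls, then executed.
type ItemType = "first" | "second" | "third";

function executeDisplayModeSwitch(mode: string): void {
  console.log(`display mode switching, mode=${mode}`);
}

function executeShooting(mode: string): void {
  console.log(`shooting, mode=${mode}`);
}

function onTargetButtonTouch(itemType: ItemType): void {
  if (itemType === "third") {
    executeShooting("onPress");            // shooting only, no mode switch
  } else {
    executeDisplayModeSwitch("enterOnPress-exitOnRelease");
    executeShooting(itemType === "first" ? "continuous" : "onRelease");
  }
}

onTargetButtonTouch("first");
```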
  • An operation control method is provided, executed by an electronic device, and the method includes:
  • displaying a target button in a graphical user interface, the target button corresponding to multiple control functions, the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
  • when a touch operation on the target button is detected, determining, according to at least one of the type of virtual prop controlled by the current virtual object, the motion state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function; and
  • executing, based on the target mode, the target function in the graphical user interface.
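  • A sketch of this broader variant follows; the context fields and the selection rules are invented for illustration and are not prescribed by the text:

```typescript
// The target function is determined from at least one of: the virtual prop's
// type, the virtual object's motion state, or its environment in the scene.
type ControlFunction = "displayModeSwitch" | "shoot" | "actionControl" | "viewAngleAdjust";

interface TouchContext {
  propType: "firearm" | "melee" | "throwable";
  motionState: "standing" | "running" | "swimming";
  environment: "land" | "water" | "air";
}

function selectTargetFunctions(ctx: TouchContext): ControlFunction[] {
  if (ctx.environment === "water" || ctx.motionState === "swimming") {
    return ["actionControl"];              // e.g. dive or float instead of firing
  }
  if (ctx.propType === "firearm") {
    return ["displayModeSwitch", "shoot"]; // scope in and fire
  }
  return ["shoot", "viewAngleAdjust"];     // melee/throwable: strike while turning
}

console.log(selectTargetFunctions({ propType: "firearm", motionState: "standing", environment: "land" }));
```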
  • An operation control device is provided, and the device includes:
  • a display module, configured to display a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function;
  • a determining module, configured to, when a touch operation on the target button is detected, determine, according to the type of the virtual item controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function, and the target mode corresponding to the target function; and
  • an execution module, configured to execute the target function in the graphical user interface based on the target mode.
  • An operation control device is provided, and the device includes:
  • a display module, configured to display a target button in a graphical user interface, the target button corresponding to multiple control functions, the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
  • a determining module, configured to, when a touch operation on the target button is detected, determine, according to at least one of the type of virtual prop controlled by the current virtual object, the motion state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function; and
  • an execution module, configured to execute the target function in the graphical user interface based on the target mode.
  • An electronic device is provided, including a memory and a processor; the memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the steps of the above operation control method.
  • One or more non-volatile storage media storing computer-readable instructions are provided; when the computer-readable instructions are executed by one or more processors, the one or more processors perform the steps of the above operation control method.
  • FIG. 1 is a schematic diagram of a display mode of a virtual scene provided by an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a display mode of a virtual scene provided by an embodiment of the present application;
  • FIG. 3 is a flowchart of an operation control method provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a display interface of a target button provided by an embodiment of the present application;
  • FIG. 5 is a schematic diagram of a configuration interface provided by an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a function configuration interface provided by an embodiment of the present application;
  • FIG. 7 is a schematic diagram of an interface during execution of a target function when the type of a virtual prop is the first type, provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of an interface after execution of a target function when the type of a virtual prop is the second type, provided by an embodiment of the present application;
  • FIG. 9 is a schematic diagram of an interface during execution of a target function when the type of a virtual prop is the third type, provided by an embodiment of the present application;
  • FIG. 10 is a flowchart of an operation control method provided by an embodiment of the present application;
  • FIG. 11 is a flowchart of an operation control method provided by an embodiment of the present application;
  • FIG. 12 is a schematic structural diagram of an operation control device provided by an embodiment of the present application;
  • FIG. 13 is a schematic structural diagram of an operation control device provided by an embodiment of the present application;
  • FIG. 14 is a schematic structural diagram of a terminal provided by an embodiment of the present application;
  • FIG. 15 is a schematic structural diagram of a server provided by an embodiment of the present application.
  • The embodiments of this application mainly relate to electronic games or simulated training scenarios.
  • Taking an electronic game as an example, the user can perform operations on the terminal in advance; after the terminal detects the user's operation, it can download the game configuration file of the electronic game.
  • The configuration file may include the application program of the electronic game, interface display data, virtual scene data, and the like, so that the user can call the game configuration file when logging in to the electronic game on the terminal, to render and display the electronic game interface.
  • The user can perform a touch operation on the terminal; after the terminal detects the touch operation, it can determine the game data corresponding to the touch operation, and render and display the game data.
  • The game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
  • the virtual scene involved in this application can be used to simulate a three-dimensional virtual space or a two-dimensional virtual space.
  • the three-dimensional virtual space or the two-dimensional virtual space can be an open space.
  • the virtual scene can be used to simulate the real environment in reality.
  • The virtual scene can include the sky, land, sea, and the like.
  • The land can include environmental elements such as deserts and cities, and the user can control virtual objects to perform activities in the virtual scene.
  • the virtual object may be a virtual avatar used to represent the user, or a virtual avatar used to represent a creature associated with the user, such as a pet.
  • the virtual image can be in any form, such as a human or an animal, which is not limited in this application.
  • the virtual scene may also include other virtual objects, that is, the virtual scene may include multiple virtual objects.
  • Each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
  • The user can control the virtual object to free fall, glide, or open a parachute to fall in the sky of the virtual scene, to run, jump, crawl, or bend forward on the land, and can also control the virtual object to swim, float, or dive in the ocean.
  • the user can also control the virtual object to take a vehicle to move in the virtual scene.
  • the virtual props may be cold weapons or hot weapons, which is not specifically limited in the embodiment of the present application.
  • the terminal screen displays the view of the virtual object controlled by the current terminal.
  • the terminal screen may also display the aiming point of the virtual object controlled by the current terminal.
  • The aiming point can be used to mark the aiming target of the virtual object controlled by the current terminal in the viewing angle picture, and the position of the aiming point in the virtual scene can be used as the attack point of the virtual object controlled by the current terminal.
  • the aiming point may be displayed in the center position of the terminal screen.
  • the aiming point may also be displayed in other positions, which is not specifically limited in the embodiment of the present application.
  • The display style of the aiming point can include multiple types; it can be the system default, or it can be adjusted according to the user's setting.
  • When the user sees the aiming point displayed on the terminal, the user can determine whether the position of the virtual scene corresponding to the current aiming point is the area the user wants to aim at; if not, the user can adjust the viewing angle of the virtual scene through a viewing angle adjustment operation to adjust the aimed area.
  • Users usually want to quickly and accurately adjust the aiming point onto other virtual objects in the virtual scene, so that they can shoot, slap, or punch those virtual objects.
  • the viewing angle adjustment operation may include multiple operation modes.
  • For example, the viewing angle adjustment operation may be a sliding operation; when the terminal detects the sliding operation, it may determine, based on the sliding direction, sliding distance, and sliding speed of the sliding operation, the rotation direction, rotation angle, and rotation speed of the viewing angle corresponding to the sliding operation.
  • the sliding direction of the sliding operation may correspond to the rotation direction of the viewing angle
  • the sliding distance of the sliding operation may be positively related to the rotation angle of the viewing angle.
  • The sliding speed of the sliding operation may also be positively related to the rotation speed of the viewing angle.
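  • The mapping just described can be sketched as follows; only the positive correlations come from the text, while the gain constants are assumptions:

```typescript
// Slide direction -> rotation direction; slide distance -> rotation angle;
// slide speed -> rotation speed (all positively correlated).
const DEGREES_PER_PIXEL = 0.25; // assumed sensitivity
const SPEED_GAIN = 0.8;         // assumed speed coupling

function slideToViewRotation(dx: number, dy: number, slideSpeed: number) {
  return {
    yawDeg: dx * DEGREES_PER_PIXEL,         // horizontal slide turns left/right
    pitchDeg: -dy * DEGREES_PER_PIXEL,      // vertical slide looks up/down
    rotationSpeed: slideSpeed * SPEED_GAIN, // faster slide, faster turn
  };
}

console.log(slideToViewRotation(120, -40, 300));
```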
  • The viewing angle adjustment operation may also be a rotation operation on the terminal.
  • When an angular velocity sensor (for example, a gyroscope) detects the rotation operation, the terminal may determine the rotation direction, rotation angle, and rotation speed of the viewing angle according to the rotation direction, rotation angle, and rotation speed of the rotation operation.
  • the rotation direction of the rotation operation may be the rotation direction of the viewing angle
  • the rotation angle of the rotation operation may be positively related to the rotation angle of the viewing angle
  • the rotation speed of the rotation operation may be positively related to the rotation speed of the viewing angle.
  • the viewing angle adjustment operation may also be a key operation, a drag operation on the virtual joystick area, or a toggle operation on the real joystick device, etc., which is not specifically limited in this application.
  • the viewing angle adjustment operation may also include other methods, such as gesture operation, which is not limited in the embodiment of the present application.
  • different control effects can also be achieved through the combination of the above-mentioned viewing angle adjustment operations.
  • For example, when the user's viewing angle adjustment operation is a sliding operation, the terminal may detect the pressing force of the operation during the sliding process, and determine whether the pressing force is greater than a preset pressing force, so as to determine whether to perform shooting.
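  • That combined gesture can be sketched as follows; the threshold value and the sampling interface are assumptions:

```typescript
// While a slide adjusts the view, the pressing force is sampled; shooting is
// active only while the force exceeds a preset pressing force.
const PRESET_PRESSING_FORCE = 0.6; // normalized 0..1 (assumed)

function onSlideSample(force: number, firing: { active: boolean }): void {
  if (force > PRESET_PRESSING_FORCE && !firing.active) {
    firing.active = true;   // press harder while sliding: start shooting
  } else if (force <= PRESET_PRESSING_FORCE && firing.active) {
    firing.active = false;  // ease off: stop shooting, keep aiming
  }
}

const firing = { active: false };
onSlideSample(0.8, firing);
console.log(firing.active); // true: force above threshold, so shooting
```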
  • virtual objects can usually control virtual props to fight with other virtual objects.
  • Some firearm-type props can also be equipped with sights, so as to observe the virtual scene based on the sight.
  • the sights can be mechanical sights, which refer to the observation equipment that is originally equipped on the firearms.
  • the sight may also be a sight that is subsequently equipped on the firearm, for example, a scope.
  • the sight can have a magnification, and the magnification can be 1, or a value greater than 1.
  • For example, the scope can be a red dot scope, a holographic scope, a double scope, a quadruple scope, or an eightfold scope, where the red dot scope and the holographic scope have a magnification of 1, and the double, quadruple, and eightfold scopes have magnifications greater than 1.
  • The magnification of the scope can also be other values; for example, the scope can also be a triple, sixfold, or fifteenfold scope. The magnification of the scope is not specifically limited in the embodiments of this application.
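  • The text does not state how magnification is realized; a common approach, shown here purely as an assumption, is to narrow the camera's field of view in proportion to the scope power:

```typescript
// 1x sights (red dot, holographic) leave the field of view unchanged; higher
// magnifications narrow it proportionally.
function scopedFovDeg(baseFovDeg: number, magnification: number): number {
  return baseFovDeg / magnification;
}

console.log(scopedFovDeg(80, 1)); // red dot: 80 degrees
console.log(scopedFovDeg(80, 4)); // quadruple scope: 20 degrees
console.log(scopedFovDeg(80, 8)); // eightfold scope: 10 degrees
```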
  • The sight is used to assist the virtual object in aiming and shooting; therefore, when the virtual object controls a virtual prop to aim or shoot, the display mode of the virtual scene can be switched to the sight-based display mode, making it more convenient to aim and shoot at enemy virtual objects accurately.
  • Generally, the virtual scene is not in the sight-based display mode, but in the first display mode.
  • When the user wants to control the virtual object to accurately shoot other virtual objects that appear in the virtual scene, the display mode of the virtual scene can be switched to the sight-based display mode, that is, the virtual object is controlled to observe the virtual scene through the sight on the virtual prop.
  • FIG. 3 is a flowchart of an operation control method provided by an embodiment of the present application.
  • the method can be applied to an electronic device.
  • the electronic device can be provided as a terminal or a server, which is not limited in the embodiment of the present application.
  • In the embodiments of this application, the operation control method being executed by the terminal is taken as an example for description. Referring to FIG. 3, the method may include the following steps:
  • the terminal displays a target button in a graphical user interface.
  • the target button corresponds to a display mode switching function and a shooting function.
  • the terminal may provide an operation control function, and the terminal may provide a target button.
  • the target button may correspond to multiple control functions. The user can perform a single touch operation on the target button to achieve multiple control functions.
  • the target button may correspond to a display mode switching function and a shooting function.
  • the method for the terminal to provide the target button may be: the terminal may display the target button in a graphical user interface, so that the user can perform a touch operation on the target button. Specifically, the terminal can implement at least one of the following steps 1 to 3 when displaying the target button:
  • Step 1: The terminal displays the target button at the target position in the graphical user interface.
  • Step 2: The terminal displays the target button in the graphical user interface according to the target size.
  • Step 3: The terminal displays the target button in the graphical user interface according to the target transparency.
  • The display of the target button may involve at least one display parameter, and the at least one display parameter may include at least one of display position, size, or transparency, where the display position of the target button is the target position, the size is the target size, and the transparency is the target transparency.
  • the target position may be determined according to the contact position of the user's hand with the terminal screen when the user is holding the terminal, and may be within a preset range of the contact position.
  • the target position can be set to a position that is easier for the user to click to reduce the complexity of user operations.
  • the target position where the target button 401 is located can be in the lower right corner area of the graphical user interface. When the user holds the terminal, the finger is just at this position.
  • The target size can be made smaller and the target transparency larger, so that the target button does not occlude too much of the graphical user interface. That is, the target size can be smaller than the normal size of a button, and the target transparency can be greater than the normal transparency, where the normal size is the average size of buttons, and the normal transparency is the general transparency of buttons or may be 0. These can be set by relevant technical personnel according to requirements, which is not limited in the embodiment of the present application.
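  • The three display parameters can be pictured as a small record; the concrete values below are assumptions consistent with the text (lower-right position, smaller than a normal button, more transparent than normal):

```typescript
interface TargetButtonDisplay {
  x: number;       // target position, normalized screen coordinates
  y: number;
  sizePx: number;  // target size, below the normal button size
  opacity: number; // inverse of transparency: higher transparency, lower opacity
}

const targetButton: TargetButtonDisplay = {
  x: 0.85,         // lower-right corner area, near the resting finger
  y: 0.80,
  sizePx: 48,      // smaller than an assumed 64 px normal size
  opacity: 0.5,    // i.e. 50% transparency
};

console.log(targetButton);
```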
  • the target button can also display target prompt information.
  • the target prompt information is used to prompt that the target button has a display mode switching function and a shooting control function.
  • For example, the target button can display an aiming point and bullet pattern, to suggest that the target button can provide the display mode switching function and the shooting control function at the same time.
  • At least one of the target position, target size, and target transparency can also be customized by the user according to their own usage habits.
  • For example, the terminal can provide a configuration interface, and the user can customize settings in the configuration interface to change the display of the target button.
  • the setting process of this display situation can be realized through at least one of the following steps 1 to 3:
  • Step 1: The terminal obtains the position adjustment information of the target button based on the configuration interface, and obtains the target position of the target button based on the position adjustment information.
  • Step 2: The terminal obtains the size adjustment information of the target button based on the configuration interface, and obtains the target size of the target button based on the size adjustment information.
  • Step 3: The terminal obtains the transparency adjustment information of the target button based on the configuration interface, and obtains the target transparency of the target button based on the transparency adjustment information.
  • Display parameter adjustment options for the target button can be provided in the configuration interface.
  • For example, as shown in FIG. 5, the display parameter adjustment options include the button size, the transparency, and the position of the target button.
  • the terminal can obtain corresponding adjustment information according to the operation, and adjust the display parameters of the target button based on the adjustment information.
  • For example, the user can drag the button-size adjustment bar, and the terminal can adjust the size of the displayed target button based on the user's adjustment operation on the bar, so as to provide an adjustment preview effect; "179%" in FIG. 5 means that the size of the target button is 179% of the default size.
  • The user can also drag the transparency adjustment bar; for example, "100%" in FIG. 5 means that the transparency of the target button is 100%.
  • the terminal can adjust the position of the target button based on the dragging operation of the target button.
  • the position of the target button can change as the position of the drag operation changes.
  • In FIG. 5, the dashed-line position is the original position of the target button.
  • In a possible implementation, the terminal may display the target button in the configuration interface, and when the user selects the target button, the terminal may display the display parameter adjustment options of the target button.
  • Of course, other buttons may also be provided in the configuration interface, so that the user can also set the display parameters of other buttons, which is not limited in the embodiment of the present application.
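  • The three adjustment paths of the configuration interface can be sketched as follows; the handler names are invented, and the "179%" and "100%" values echo the FIG. 5 example:

```typescript
// Position, size, and transparency adjustment information each update the
// stored display parameters of the target button.
const display = { x: 0.85, y: 0.8, sizePercent: 100, transparencyPercent: 0 };

function onSizeSlider(percent: number): void {
  display.sizePercent = percent;         // e.g. 179 => 179% of the default size
}

function onTransparencySlider(percent: number): void {
  display.transparencyPercent = percent; // e.g. 100 => fully transparent
}

function onDragEnd(x: number, y: number): void {
  display.x = x;                         // button follows the drag position
  display.y = y;
}

onSizeSlider(179);
onTransparencySlider(100);
onDragEnd(0.9, 0.75);
console.log(display);
```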
  • The state of the operation control function may include an on state and an off state, and the user can set the state of the operation control function to decide whether the terminal is required to provide the operation control function. Specifically, when determining, according to the configuration information, that the operation control function is in the on state, the terminal may execute the step of displaying the target button in the graphical user interface; that is, the terminal provides the operation control function when determining that it is turned on. When determining, according to the configuration information, that the operation control function is turned off, the terminal may not perform step 301.
  • the terminal can set the status of the operation control function through the function configuration interface. Specifically, the terminal can set the status of the operation control function based on the function configuration interface and the status setting operation of the operation control function. That is, based on the function configuration interface, the configuration information of the operation control function is determined to determine the state of the operation control function.
  • the status setting options of the operation control function can be provided in the function configuration interface, for example, the opening option and the closing option.
  • The user can perform touch operations on the status setting options to change the status of the operation control function.
  • For example, as shown in FIG. 6, the target button function can be called the "one-key open-mirror fire button", and an on option ("on") and an off option ("off") can be displayed beside the target button.
  • The user can select the on or off option to change the use state of the target button, that is, to change the state of the operation control function: if the user selects the on option, the operation control function is in the on state, and if the user selects the off option, the operation control function is in the off state.
  • The terminal obtains the type of the virtual item controlled by the current virtual object. If the type is the first type, step 303 and step 304 are performed; if the type is the second type, step 305 and step 306 are performed; if the type is the third type, step 307 and step 308 are performed.
  • the terminal can obtain the type of the virtual item controlled by the current virtual object, so as to further determine the operation control function that needs to be provided according to the type.
  • the type of the virtual item can be set by relevant technicians according to requirements, or can be determined based on the name of the virtual item, of course, it can also be set by the user according to their own usage habits, which is not limited in the embodiment of the application.
  • the types of virtual props may include the first type, the second type, and the third type.
  • For the three types, the terminal may respectively perform the corresponding two of the following steps 303 to 308. That is, when the types of the virtual items are different, the terminal can provide different control functions, and the modes of the provided control functions can also differ; the mode is used to indicate how to perform the corresponding function based on the touch operation.
  • the type of the virtual item includes three types as an example for description.
  • the virtual item may also include a fourth type. When the type of the virtual item is the fourth type, the terminal may also perform other operation control functions.
  • the embodiment of the present application does not limit the type of the virtual item.
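  • The per-type dispatch that steps 303 to 308 spell out can be condensed into a single lookup table; this is a sketch, with labels paraphrasing the steps below, and further types left open as the text notes:

```typescript
type PropType = "first" | "second" | "third";

interface TargetPlan {
  targetFunctions: string[]; // which functions the touch triggers
  shootingMode: string;      // at which stage of the touch shooting happens
}

const plans: Record<PropType, TargetPlan> = {
  first:  { targetFunctions: ["displayModeSwitch", "shoot"], shootingMode: "continuously while the touch lasts" },
  second: { targetFunctions: ["displayModeSwitch", "shoot"], shootingMode: "once, when the touch ends" },
  third:  { targetFunctions: ["shoot"],                      shootingMode: "when the touch starts" },
};

console.log(plans["second"]);
```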
  • The terminal determines the display mode switching function and the shooting function as the target functions to be triggered, determines the target display mode switching mode as the target mode corresponding to the display mode switching function, and determines the first shooting mode as the target mode corresponding to the shooting function.
  • If the type is the first type, the terminal may execute this step 303, and execute the following step 304 based on the target function determined in step 303 and the corresponding target mode.
  • the terminal may determine to trigger both the display mode switching function and the shooting function.
  • the display mode of the virtual scene may include a first display mode and a second display mode.
  • the second display mode may be a display mode based on a sight
  • the first display mode may be a display mode other than the second display mode.
  • the display mode switching function refers to switching the display mode of the virtual scene.
  • the terminal can also determine the target mode of the two respectively.
  • the terminal determines that the target mode corresponding to the display mode switching function is the target display mode switching mode, and the target mode corresponding to the shooting function is the first shooting mode.
  • the target modes of the two are used to indicate how to execute the corresponding target function at different stages of the touch operation.
  • the touch operation may include multiple stages: when the touch operation starts, a period during which the touch operation continues, and when the touch operation ends.
  • the first type of virtual props may include multiple shooting types.
  • the first type of virtual props may include a first shooting type and a second shooting type.
  • the first shooting type may be an automatic shooting type
  • the second shooting type may be a single shot shooting type.
  • Of course, the first type of virtual prop may also include only one shooting type; for example, the first type of virtual prop may include the first shooting type, that is, the automatic shooting type.
  • For details of the automatic shooting type, refer to step 304 below; the embodiments of the present application do not describe it in detail here.
  • Based on the target display mode switching mode and the first shooting mode, the terminal switches the display mode of the virtual scene from the first display mode to the second display mode when detecting the start of the touch operation, continuously executes the shooting function while the touch operation lasts, and switches the display mode of the virtual scene from the second display mode back to the first display mode when detecting the end of the touch operation.
  • After the terminal determines the target function and the corresponding target mode, it can execute the corresponding target function based on the target mode and the touch operation.
  • The target display mode switching mode is used to indicate that the display mode of the virtual scene is switched from the first display mode to the second display mode when the touch operation starts, and is switched from the second display mode back to the first display mode when the touch operation ends.
  • The first shooting mode is used to indicate that the shooting function is continuously executed for the duration of the touch operation. Therefore, the terminal can execute this step 304 to realize the display mode switching function and the shooting function.
  • Take the touch operation being a long-press operation as an example.
  • The user can perform a long-press operation on the target button.
  • When the long-press operation starts, the terminal can switch the display mode of the virtual scene from the first display mode to the second display mode (the sight-based display mode); as shown in FIG. 7, the display mode of the virtual scene in FIG. 7 is the second display mode, that is, the sight-based display mode.
  • During the long-press operation, the terminal can continuously perform the shooting function.
  • When the long-press operation ends, the terminal can stop performing the shooting function and switch the display mode of the virtual scene from the second display mode to the first display mode.
  • The process of switching the display mode from the first display mode to the second display mode is called "opening the mirror";
  • the process of switching the display mode from the second display mode to the first display mode is called "closing the mirror".
  • In this step 304, the terminal opens the mirror when detecting the start of the touch operation and shoots continuously, and stops shooting and closes the mirror when the touch operation ends.
  • In this way, the user only needs to press the target button once to complete the "open mirror - shoot - close mirror" operation process, which greatly speeds up aiming and shooting in emergency situations.
  • the display mode switching (aiming) and shooting process can be quickly completed by operating the target button once, and because the button operation is simple and convenient, the game experience of the player can be effectively improved.
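  • This press-hold-release lifecycle for a first-type (automatic) prop can be sketched as follows; setInterval stands in for the game's fire-rate loop, and all helper names are assumptions:

```typescript
let fireTimer: ReturnType<typeof setInterval> | null = null;

function switchDisplayMode(mode: "scoped" | "unscoped"): void {
  console.log(`display mode: ${mode}`);
}

function fireOneShot(): void {
  console.log("shot fired");
}

function onTouchStart(): void {
  switchDisplayMode("scoped");               // first -> second display mode
  fireTimer = setInterval(fireOneShot, 100); // continuous fire while held
}

function onTouchEnd(): void {
  if (fireTimer !== null) {
    clearInterval(fireTimer);                // stop shooting
    fireTimer = null;
  }
  switchDisplayMode("unscoped");             // second -> first display mode
}

onTouchStart();
setTimeout(onTouchEnd, 350);                 // simulate a 350 ms press: ~3 shots
```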
  • the first type of virtual props may include multiple shooting types, and the modes of operation control functions that the terminal needs to perform in different shooting types may also be different. Therefore, if the type of the virtual item is the first type, the terminal can obtain the current shooting type of the virtual item, and determine how to perform the operation control function further based on the shooting type.
  • the following uses two shooting types as examples:
  • In a possible implementation, if the shooting type is the first shooting type, the terminal may execute the above step 303, that is, determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine the first shooting mode as the target mode corresponding to the shooting function.
  • Based on the determined target functions and target modes, the terminal may further perform step 304.
  • the first shooting type may be used to indicate that the virtual prop can be continuously shot based on one operation.
  • the first shooting type may be called an automatic shooting type.
  • the terminal may execute step 303 and step 304.
  • If the shooting type is the second shooting type, the terminal may determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine the second shooting mode as the target mode corresponding to the shooting function.
  • The second shooting type can be used to indicate that the virtual prop fires once based on one operation.
  • The second shooting type may be referred to as a single-shot type.
  • That is, when the first type of virtual prop is of the single-shot type, the terminal can determine that the target mode corresponding to the shooting function is the second shooting mode, so that in step 304 the terminal performs the shooting function at a different stage.
  • In this case, the mirror is opened when the touch operation starts, and the shot is fired and the mirror closed when the touch operation ends.
  • In another possible implementation, the terminal may not acquire the shooting type of the virtual prop, and may perform steps 303 and 304 regardless of whether the shooting type is the first shooting type or the second shooting type, that is, without taking the shooting type of the virtual prop as a reference factor for the shooting mode. The embodiment of the present application does not limit which method is used.
  • The above steps 303 and 304 are the operation control steps that the terminal needs to perform when the type of the virtual item is the first type. If the type is the second type, the terminal can perform the following steps 305 and 306; if the type is the third type, the terminal can perform the following steps 307 and 308.
  • The terminal determines the display mode switching function and the shooting function as the target functions to be triggered, determines the target display mode switching mode as the target mode corresponding to the display mode switching function, and determines the second shooting mode as the target mode corresponding to the shooting function.
  • In step 302, after the terminal obtains the type of the virtual item, if the type is the second type, the terminal can perform this step 305, and based on the target function determined in step 305 and the corresponding target mode, perform the following step 306.
  • the terminal may determine to trigger both the display mode switching function and the shooting function.
  • The terminal can also determine the target modes of the two respectively: the target mode corresponding to the display mode switching function is the target display mode switching mode, and the target mode corresponding to the shooting function is the second shooting mode.
  • the target modes of the two are used to indicate how to execute the corresponding target function at different stages of the touch operation.
  • The second type of virtual prop may include the second shooting type, which is used to indicate that the virtual prop fires once based on one operation.
  • The second shooting mode of the second type of virtual prop can be the same as the second shooting mode used when the first type of virtual prop is of the second shooting type in step 304, which is called the single-shot type.
  • Based on the target display mode switching mode and the second shooting mode, the terminal switches the display mode of the virtual scene from the first display mode to the second display mode when detecting the start of the touch operation; when detecting the end of the touch operation, the terminal executes the shooting function and switches the display mode of the virtual scene from the second display mode back to the first display mode.
  • After the terminal determines the target function and the corresponding target mode, it can execute the corresponding target function based on the target mode and the touch operation.
  • The target display mode switching mode is the same as that in step 304, and the second shooting mode is used to indicate that the shooting function is executed when the touch operation ends. Therefore, the terminal executes this step 306 to realize the display mode switching function and the shooting function.
  • Take the touch operation being a long-press operation as an example.
  • The user can perform a long-press operation on the target button.
  • When the long-press operation starts, the terminal can switch the display mode of the virtual scene from the first display mode to the second display mode (the sight-based display mode).
  • When the long-press operation ends, the terminal can execute the shooting function and switch the display mode of the virtual scene from the second display mode to the first display mode.
  • Similarly, the process of switching to the sight-based display mode is called "opening the mirror", and the process of switching from the sight-based display mode back to the first display mode is called "closing the mirror". As shown in FIG. 8, in this step 306, the terminal opens the mirror when detecting the start of the touch operation, and shoots and closes the mirror when the touch operation ends.
  • FIG. 8 only shows the virtual scene after the mirror is closed at the end of the touch operation; other stages are not shown. For the sight-based display mode of the virtual scene, refer to FIG. 7.
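  • The second-type (single-shot) lifecycle differs only at the release edge, as the following sketch with assumed helper names shows:

```typescript
function switchDisplayMode(mode: "scoped" | "unscoped"): void {
  console.log(`display mode: ${mode}`);
}

function fireOneShot(): void {
  console.log("shot fired");
}

function onTouchStart(): void {
  switchDisplayMode("scoped");   // open the mirror on press
}

function onTouchEnd(): void {
  fireOneShot();                 // exactly one shot per touch, on release
  switchDisplayMode("unscoped"); // close the mirror
}

onTouchStart();
onTouchEnd();
```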
  • steps 305 and 306 are operation control steps that the terminal needs to perform when the type of the virtual item is the second type. If the type is the third type, the terminal can perform the following steps 307 and 308.
  • the terminal determines the shooting function as the target function to be triggered, and determines the third shooting mode as the target mode corresponding to the shooting function.
  • In step 302, after the terminal obtains the type of the virtual item, if the type is the third type, the terminal can perform this step 307, and based on the target function determined in step 307 and the corresponding target mode, perform the following step 308.
  • the terminal can trigger the shooting function without triggering the display mode switching function. Therefore, the terminal can also determine the target mode of the shooting function, that is, the third shooting mode.
  • the third shooting mode is used to indicate at which stage of the touch operation the shooting function is executed.
  • the terminal executes the shooting function when detecting the start of the touch operation.
  • After the terminal determines the target function and the corresponding target mode, it can execute the corresponding target function based on the target mode and the touch operation. In this step 308, the terminal may execute the shooting function based on the third shooting mode. In a possible implementation manner, the third shooting mode is used to indicate that the shooting function is executed when the touch operation starts; therefore, the terminal executes this step 308 to realize the shooting function.
  • Take the touch operation being a click operation as an example: when the click is detected, the terminal can perform the shooting function.
  • If the touch operation is a long-press operation, the terminal executes the shooting function when the touch operation starts, and does not execute the shooting function again until the touch operation ends. As shown in FIG. 9, in this step 308, the terminal can shoot when it detects the start of the touch operation; during this touch operation, the terminal does not perform the above-mentioned mirror opening and closing processes.
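  • For the third type, the handler reduces to a press-edge shot with no display mode switch; the guard below models the no-repeat-until-release behavior just described:

```typescript
let pressed = false;

function onTouchStart(): void {
  if (!pressed) {
    pressed = true;
    console.log("shot fired"); // single shot at touch start, no scope
  }
}

function onTouchEnd(): void {
  pressed = false;             // the next press may fire again
}

onTouchStart(); // fires
onTouchStart(); // still held: no second shot
onTouchEnd();
onTouchStart(); // fires again
```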
  • The above step 302, together with step 303, step 305, and step 307, implements the process of, when a touch operation on the target button is detected, determining, according to the type of virtual item controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function.
  • Step 304, step 306, and step 308 implement the process of executing the target function in the graphical user interface based on the target mode.
  • When the type of the virtual prop differs, the target function to be triggered and the corresponding target mode determined by the terminal may differ, and the operation control steps that the terminal needs to perform may also differ.
  • The above process classifies virtual props into types and provides a different operation control flow for each type of virtual prop, rather than applying one uniform flow. In this way, the shooting feel of most firearms does not become uncomfortable, engagements are not adversely affected as a result, and a small convenience is not gained at a larger cost.
  • The operation control flow is flexible and convenient and can meet the needs of different shooting scenarios. Moreover, a variety of operation control flows can be realized through a single target button, providing a smooth operating experience that does not impose operating pressure on the user but instead simplifies user operation; the user can also customize the settings according to personal usage habits to adjust the button's function to what best suits his or her feel, improving the user experience.
  • The above describes only the case where the types of virtual props include the first type, the second type, and the third type.
  • In a possible implementation, the virtual props may also include other types, and the above settings can be made by relevant technical personnel according to requirements.
  • Of course, the settings can also be made by the user according to personal usage habits.
  • the terminal may provide an operation control function configuration interface.
  • The operation control function configuration interface may include control function options and mode options. The user can perform a selection operation among the control function options and a setting operation among the mode options, so that the terminal can update, based on the selection operation or setting operation, the control function corresponding to the target button and the target mode corresponding to the control function.
  • The operation control function configuration interface may also provide type setting options for virtual props. In these options, the user can set the types of virtual props and also set which virtual props belong to each type. All of the above can be custom-configured by relevant technical personnel according to their own usage habits, which is not limited in the embodiments of the present application.
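  • As a rough illustration of what such a configuration interface might persist, the sketch below models a mapping from the target button to its control functions, their modes, and user-defined prop types. Every field name here is a hypothetical choice for the example, not a structure defined by the embodiments.

```typescript
// Hypothetical shape of the operation-control configuration described above.
interface OperationControlConfig {
  enabled: boolean;                                   // operation control function on/off
  buttonFunctions: Array<"displayModeSwitch" | "shoot" | "viewAdjust" | "actionControl">;
  functionModes: Record<string, string>;              // e.g. { shoot: "secondShootingMode" }
  propTypes: Record<string, "first" | "second" | "third">; // which props belong to each type
}

const exampleConfig: OperationControlConfig = {
  enabled: true,
  buttonFunctions: ["displayModeSwitch", "shoot"],
  functionModes: { displayModeSwitch: "targetSwitchMode", shoot: "secondShootingMode" },
  propTypes: { sniperRifle: "second", marksmanRifle: "first" },
};
```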
  • the viewing angle adjustment function may also be provided through the target button.
  • When the user performs a touch operation on the target button, the terminal may also adjust the viewing angle of the virtual scene based on the touch operation.
  • the target button may also correspond to the viewing angle adjustment function.
  • In steps 302, 303, 305, and 307 above, when a touch operation on the target button is detected, the terminal may also determine the viewing angle adjustment function as the target function to be triggered, and the target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function.
  • In steps 304, 306, and 308 above, based on the target viewing angle adjustment mode and while the touch operation continues, the terminal may also adjust the viewing angle of the virtual scene based on the operation direction and operation speed of the touch operation.
  • In a specific embodiment, during the above viewing angle adjustment, the target rotation speed of the viewing angle of the virtual scene may be obtained based on the operation direction and operation speed of the touch operation. The target rotation speed includes a direction and a magnitude, and the terminal may adjust the viewing angle of the virtual scene based on the target rotation speed.
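  • As one way to picture the computation just described, the sketch below derives a target rotation speed (direction plus magnitude) from the operation direction and operation speed of the touch; the sensitivity constant and vector type are assumptions for illustration.

```typescript
// Hypothetical mapping from a drag on the target button to a rotation speed.
interface Vec2 { x: number; y: number; }

const SENSITIVITY = 0.1; // assumed degrees of rotation per unit of drag speed

function targetRotationSpeed(dragDirection: Vec2, dragSpeed: number): Vec2 {
  // Normalize the direction, then scale by the (positively correlated) speed.
  const len = Math.hypot(dragDirection.x, dragDirection.y) || 1;
  return {
    x: (dragDirection.x / len) * dragSpeed * SENSITIVITY, // yaw, degrees/second
    y: (dragDirection.y / len) * dragSpeed * SENSITIVITY, // pitch, degrees/second
  };
}
// Each frame, the camera's yaw/pitch would then advance by speed * deltaTime.
```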
  • For example, taking the touch operation as a combination of a long-press operation and a slide operation: if the type of the virtual prop is the first type, when the touch operation starts, the terminal can switch the display mode of the virtual scene from the first display mode to the second display mode and continuously perform the shooting function while the touch operation continues.
  • When the user wants to adjust the viewing angle, he or she can press and slide the target button, or drag the target button, so that the terminal can adjust the viewing angle based on the operation direction and operation speed.
  • As the viewing angle of the virtual scene is adjusted, the position in the virtual scene aimed at by the aiming point also changes, so that during the viewing angle adjustment the user can adjust the aimed position and continue shooting until the touch operation ends, at which point the terminal stops shooting and switches back to the first display mode.
  • If the type of the virtual prop is the second type, when the touch operation starts, the terminal switches the display mode of the virtual scene from the first display mode to the second display mode.
  • The user can then adjust the viewing angle to change the aimed position; after the adjustment, the user ends the touch operation, at which point the terminal can shoot at the position aimed at by the current aiming point and switch back to the first display mode.
  • If the type of the virtual prop is the third type, the terminal can shoot when the touch operation starts and, while the touch operation continues, no longer shoots but adjusts the viewing angle based on the touch operation.
  • the target button may also be used to provide an action control function.
  • When the user performs a touch operation on the target button, the terminal may also control the virtual object to make a corresponding action based on the touch operation.
  • the target button may also correspond to an action control function.
  • When a touch operation on the target button is detected, the terminal may also determine the action control function as the target function to be triggered, and the target action control mode as the target mode corresponding to the action control function.
  • the terminal may also perform any of the following:
  • the terminal controls the current virtual object to perform the target action.
  • the terminal controls the current virtual object to perform the target action; when the end of the touch operation is detected, the terminal controls the current virtual object to revert to the action it was performing before the target action.
  • the target action may be a sideways (lean) action, a kneeling action, a squatting action, or a jumping action.
  • For example, taking the target action as a sideways action: when the touch operation starts, the terminal can control the virtual object to turn sideways while executing the other actions in steps 304, 306, or 308, and may also perform the above-mentioned viewing angle adjustment step.
  • When the end of the touch operation is detected, the terminal may control the virtual object to return from the sideways state to its previous state.
  • the above-mentioned operation control method can also provide action control functions for multiple actions.
  • When the user performs a touch operation on the target button, the terminal can also control the virtual object to make corresponding actions.
  • the target button also corresponds to an action control function.
  • the action control function may include multiple action control modes, and each action control mode may correspond to an action of the virtual object, or may correspond to a control situation of an action of the virtual object.
  • When a touch operation on the target button is detected, the terminal may also obtain at least one of the motion state of the current virtual object or the environment of the current virtual object in the virtual scene as a basis for judgment.
  • the terminal may determine the action control function as the target function to be triggered and, according to at least one of the motion state of the current virtual object or the environment of the current virtual object in the virtual scene, determine the target action control mode among the multiple action control modes as the target mode corresponding to the action control function.
  • the terminal may also perform any of the following:
  • the terminal controls the current virtual object to execute the target action corresponding to the target action control mode.
  • the terminal controls the current virtual object to execute the target action corresponding to the target action control mode; when the end of the touch operation is detected, the terminal controls the current virtual object to revert to the action it was performing before the target action.
  • That is, the action control function can correspond to action control modes for multiple actions, and the terminal can determine, according to the motion state of the current virtual object, according to the environment of the current virtual object in the virtual scene, or according to both, which action is the target action to be performed by the virtual object this time and in which action control mode the virtual object should perform the target action, so as to subsequently control the virtual object to make the corresponding action.
  • For example, the multiple actions may include three actions: leaning sideways, squatting down, and lying prone.
  • In one situation, the terminal can determine that the target action control mode is the action control mode of the sideways action, and then control the virtual object to make a sideways movement.
  • In another situation, the terminal may determine that the target action control mode is the action control mode of a squatting action, and then control the virtual object to make a squatting movement.
  • The multiple action control modes, as well as the correspondence between the action control modes and the motion state of the virtual object or its environment in the virtual scene, can all be set by relevant technical personnel according to requirements, or customized by the user according to personal usage habits, which is not limited in the embodiments of the present application.
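  • The selection logic described above might look something like the following sketch, which picks a target action from the motion state and environment; the particular states, environments, and mapping are illustrative assumptions only, since the embodiments leave the correspondence to technicians or users.

```typescript
// Hypothetical judgment basis -> target action mapping.
type MotionState = "standing" | "running" | "swimming";
type Environment = "openGround" | "grass" | "water";
type TargetAction = "leanSideways" | "squatDown" | "lieProne" | "none";

function selectTargetAction(state: MotionState, env: Environment): TargetAction {
  if (state === "swimming" || env === "water") return "none"; // e.g. no prone in water
  if (state === "running") return "squatDown";                // slow down and stabilize
  if (env === "grass") return "lieProne";                     // hide in cover
  if (state === "standing") return "leanSideways";            // peek around cover
  return "none";
}
```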
  • In a specific example, the flow of the above operation control method can be as shown in FIG. 10, where the operation control function is referred to as the "one-key scope-and-fire function". The terminal can detect whether the one-key scope-and-fire function is turned on; if it is turned on, the terminal can read the position, size, and transparency data of the one-key scope-and-fire button from the custom panel (configuration interface) and apply them. That is, the terminal can judge the state of the operation control function, and if the operation control function is on, the terminal can display a target button in the graphical user interface, the target button having three display parameters that can be configured in the configuration interface.
  • The terminal can then detect the type of the firearm the player is holding, that is, obtain the type of the virtual prop controlled by the current virtual object. If the type is the first type, for example a marksman rifle with an automatic shooting type, the terminal can open the scope when the user presses the target button, shoot while the press is held, and close the scope when the user lets go. If the type is the second type, for example a sniper rifle without an automatic shooting type, the terminal can open the scope when the user presses the target button, and close the scope and shoot when the user releases it. If the type is the third type, for example a designated marksman rifle without an automatic shooting type, the terminal can shoot when the user presses. At this point, the terminal has completed one pass of the one-key scope-and-fire function and proceeds to detect the player's next operation. If the one-key scope-and-fire function is turned off, the terminal can continue to detect in each subsequent frame whether the player has enabled the function.
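  • The per-frame flow around FIG. 10 could be sketched as follows. The helper functions are assumed stand-ins for the engine's actual facilities (reading the custom panel, detecting the held weapon), not a real API.

```typescript
// Hypothetical per-frame loop for the "one-key scope-and-fire" function.
interface ButtonConfig { x: number; y: number; size: number; alpha: number; }

declare function isOneKeyScopeFireEnabled(): boolean;   // assumed settings query
declare function readCustomPanel(): ButtonConfig;       // assumed: custom panel data
declare function applyButtonConfig(c: ButtonConfig): void;
declare function handleButtonTouches(): void;           // steps 302-308 dispatch

function onFrame(): void {
  // If the function is off, keep checking in each subsequent frame.
  if (!isOneKeyScopeFireEnabled()) return;
  // Read and apply position, size, and transparency from the custom panel.
  applyButtonConfig(readCustomPanel());
  // Detect touches on the target button and dispatch by the held weapon's type.
  handleButtonTouches();
}
```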
  • In the method provided by the embodiments of the present application, the target button corresponding to the display mode switching function and the shooting function is displayed in the graphical user interface.
  • When a touch operation on the target button is detected, the type of the virtual prop controlled by the current virtual object can be determined, and from it the target function or functions to be triggered by this touch operation and the mode in which the corresponding function should be executed this time.
  • By corresponding the display mode switching function and the shooting function to the same button, one operation on one button can realize multiple operation control functions; the operation process is simple and convenient, and the operation control efficiency is high.
  • FIG. 11 is a flowchart of an operation control method provided by an embodiment of the present application. Referring to FIG. 11, the method may include the following steps:
  • 1101. The terminal displays a target button in a graphical user interface.
  • the target button corresponds to multiple control functions, and the multiple control functions include at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function.
  • This step 1101 is the same as step 301 above, and the display process of the target button may also include at least one of the following steps 1 to 3:
  • Step 1 The terminal displays the target button at the target position in the graphical user interface.
  • Step 2 The terminal displays the target button in the graphical user interface according to the target size.
  • Step 3 The terminal displays the target button in the graphical user interface according to the target transparency.
  • At least one of the target position, target size, and target transparency can be set by relevant technicians according to their needs, or they can be set by the user according to their own habits.
  • In a possible implementation, the user-configured setting process may include at least one of the following steps 1 to 3:
  • Step 1 The terminal obtains the position adjustment information of the target button based on the configuration interface, and obtains the target position of the target button based on the position adjustment information.
  • Step 2 The terminal obtains the size adjustment information of the target button based on the configuration interface, and obtains the target size of the target button based on the size adjustment information.
  • Step 3 The terminal obtains the transparency adjustment information of the target button based on the configuration interface, and obtains the target transparency of the target button based on the transparency adjustment information.
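  • A minimal sketch of how steps 1 to 3 could combine into the button's display parameters is shown below, assuming normalized screen coordinates and a percentage-style scale as in FIG. 5; the defaults and field names are invented for the example.

```typescript
// Hypothetical display parameters of the target button.
interface TargetButtonDisplay {
  position: { x: number; y: number }; // target position (normalized screen coords)
  scale: number;                      // target size, e.g. 1.79 for "179%"
  alpha: number;                      // target transparency, e.g. 1.0 for "100%"
}

const DEFAULTS: TargetButtonDisplay = { position: { x: 0.85, y: 0.8 }, scale: 1, alpha: 1 };

// Each adjustment step overrides one parameter; untouched parameters keep defaults.
function applyAdjustments(adjusted: Partial<TargetButtonDisplay>): TargetButtonDisplay {
  return { ...DEFAULTS, ...adjusted };
}

// e.g. the user dragged the button and lowered its transparency:
const display = applyAdjustments({ position: { x: 0.9, y: 0.75 }, alpha: 0.6 });
```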
  • For the above content, refer to step 301 above; the embodiments of the present application do not repeat it here.
  • 1102. When a touch operation on the target button is detected, the terminal determines, according to at least one of the type of the virtual prop controlled by the current virtual object, the motion state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function.
  • This step 1102 is the same as the content shown in steps 302 and 303, steps 302 and 305, or steps 302 and 307 above, except that in those steps the target function among the multiple control functions corresponding to the target button is determined based only on the type of the virtual prop controlled by the current virtual object.
  • In step 1102, the terminal may also determine the target function and the target mode corresponding to the target function based on the motion state of the current virtual object, the environment of the current virtual object in the virtual scene, or any combination of the above three influencing factors.
  • The process of determining the target function and the target mode according to the motion state or the environment is similar to the processes shown above, and the embodiments of the present application do not enumerate every case here.
  • The correspondence between the influencing factors and the control functions, as well as the target mode corresponding to the target function, can be set by relevant technical personnel according to requirements, or customized by the user according to personal usage habits, which is not limited in the embodiments of the present application.
  • 1103. The terminal executes the target function in the graphical user interface based on the target mode.
  • This step 1103 is the same as the above-mentioned steps 304, 306 and step 308, and will not be repeated here in the embodiment of the present application.
  • The three types of examples and the possible implementations in the embodiment shown in FIG. 3 apply equally to this embodiment, and the embodiments of the present application do not repeat them here.
  • In the method provided by the embodiments of the present application, the target button corresponding to multiple control functions is displayed in the graphical user interface.
  • When a touch operation on the target button is detected, the target functions to be triggered by this operation, and the modes in which these target functions should be executed, can be determined according to multiple influencing factors, with multiple control functions associated with the same button. By performing a single touch operation on one button, multiple operation control functions can be realized, depending on the influencing factors.
  • One operation on the target button thus realizes different operation control functions; the operation process is simple and convenient, and the operation control efficiency is high.
  • It should be understood that the steps in the embodiments of the present application are not necessarily executed in the order indicated by the step numbers. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be executed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or stages, which are not necessarily executed at the same moment but may be executed at different moments; their execution order is not necessarily sequential, and they may be executed in turn or alternately with at least part of the other steps, or with sub-steps or stages of the other steps.
  • FIG. 12 is a schematic structural diagram of an operation control device provided by an embodiment of the present application.
  • the device may include:
  • the first display module 1201 is configured to display a target button in a graphical user interface, the target button corresponding to the display mode switching function and the shooting function;
  • the first determining module 1202 is configured to determine, when a touch operation on the target button is detected and according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function, and the target mode corresponding to the target function; and
  • the first execution module 1203 is configured to execute the target function in the graphical user interface based on the target mode.
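  • Read as code, the FIG. 12 module split might be expressed as in the sketch below; the interface names and signatures are assumptions chosen for illustration.

```typescript
// Hypothetical interfaces mirroring modules 1201-1203.
type ControlFunction = "displayModeSwitch" | "shoot";

interface FirstDisplayModule {          // module 1201
  displayTargetButton(): void;
}
interface FirstDeterminingModule {      // module 1202
  determineTarget(propType: "first" | "second" | "third"): {
    targetFunctions: ControlFunction[];
    targetModes: string[];              // e.g. ["targetSwitchMode", "firstShootingMode"]
  };
}
interface FirstExecutionModule {        // module 1203
  executeTarget(functions: ControlFunction[], modes: string[]): void;
}
```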
  • the first determining module 1202 is configured to perform any one of the following:
  • the display mode switching function and the shooting function are determined as the target functions to be triggered, the target display mode switching mode is determined as the target mode corresponding to the display mode switching function, and the first shooting mode is determined as the target mode corresponding to the shooting function;
  • the display mode switching function and the shooting function are determined as the target functions to be triggered, the target display mode switching mode is determined as the target mode corresponding to the display mode switching function, and the second shooting mode is determined as the target mode corresponding to the shooting function; or
  • the shooting function is determined as the target function to be triggered, and the third shooting mode is determined as the target mode corresponding to the shooting function.
  • the first execution module 1203 is configured to execute any one of the following:
  • when the type of the virtual prop is the first type, based on the target display mode switching mode and the first shooting mode, the display mode of the virtual scene is switched from the first display mode to the second display mode when the start of the touch operation is detected, the shooting function is performed continuously during the duration of the touch operation, and the display mode of the virtual scene is switched from the second display mode to the first display mode when the end of the touch operation is detected;
  • when the type of the virtual prop is the second type, based on the target display mode switching mode and the second shooting mode, the display mode of the virtual scene is switched from the first display mode to the second display mode when the start of the touch operation is detected, and, when the end of the touch operation is detected, the shooting function is executed and the display mode of the virtual scene is switched from the second display mode to the first display mode; or
  • when the type of the virtual prop is the third type, based on the third shooting mode, the shooting function is executed when the start of the touch operation is detected.
  • the first determining module 1202 is used to:
  • obtain the shooting type of the virtual prop when the type of the virtual prop is the first type;
  • when the shooting type is the first shooting type, perform the step of determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining the first shooting mode as the target mode corresponding to the shooting function; and
  • when the shooting type is the second shooting type, determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine the second shooting mode as the target mode corresponding to the shooting function.
  • the first display module 1201 is configured to perform at least one of the following:
  • the target button is displayed at the target position in the graphical user interface;
  • the target button is displayed in the graphical user interface according to the target size; or
  • the target button is displayed in the graphical user interface according to the target transparency.
  • the device further includes an acquisition module configured to perform at least one of the following:
  • the position adjustment information of the target button is obtained based on the configuration interface, and the target position of the target button is obtained based on the position adjustment information;
  • the size adjustment information of the target button is obtained based on the configuration interface, and the target size of the target button is obtained based on the size adjustment information; or
  • the transparency adjustment information of the target button is obtained based on the configuration interface, and the target transparency of the target button is obtained based on the transparency adjustment information.
  • In a possible implementation, the target button also corresponds to a viewing angle adjustment function;
  • the first determining module 1202 is further configured to, when a touch operation on the target button is detected, determine the viewing angle adjustment function as the target function to be triggered and determine the target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function; and
  • the first execution module 1203 is further configured to, based on the target viewing angle adjustment mode and during the duration of the touch operation, adjust the viewing angle of the virtual scene based on the operation direction and operation speed of the touch operation.
  • In a possible implementation, the target button also corresponds to an action control function;
  • the first determining module 1202 is further configured to, when a touch operation on the target button is detected, determine the action control function as the target function to be triggered and determine the target action control mode as the target mode corresponding to the action control function; and
  • the first execution module 1203 is also used to execute any one of the following:
  • the current virtual object is controlled to perform the target action; or
  • the current virtual object is controlled to perform the target action, and when the end of the touch operation is detected, the action of the current virtual object is controlled to revert to the action before the target action was performed.
  • In a possible implementation, the target button also corresponds to an action control function, and the action control function includes multiple action control modes;
  • the first determining module 1202 is further configured to, when a touch operation on the target button is detected, determine the action control function as the target function to be triggered and determine, according to at least one of the motion state of the current virtual object or the environment of the current virtual object in the virtual scene, the target action control mode among the multiple action control modes as the target mode corresponding to the action control function; and
  • the first execution module 1203 is also used to execute any one of the following:
  • the current virtual object is controlled to execute the target action corresponding to the target action control mode; or
  • the current virtual object is controlled to execute the target action corresponding to the target action control mode, and when the end of the touch operation is detected, the action of the current virtual object is controlled to revert to the action before the target action was performed.
  • the first display module 1201 is configured to execute the step of displaying the target button in the graphical user interface when it is determined that the operation control function is in the on state according to the configuration information.
  • The device provided by the embodiments of the present application displays, in the graphical user interface, the target button corresponding to the display mode switching function and the shooting function.
  • When a touch operation on the target button is detected, the target function to be triggered and its corresponding target mode can be determined based on the type of the virtual prop controlled by the current virtual object, and the target function is executed in the graphical user interface, so that one operation on one button can realize multiple operation control functions.
  • It should be noted that, when the operation control device provided in the above embodiments realizes operation control, the division into the above functional modules is used only as an example for illustration.
  • In practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above.
  • In addition, the operation control device provided in the above embodiments and the operation control method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which are not repeated here.
  • FIG. 13 is a schematic structural diagram of an operation control device provided by an embodiment of the present application.
  • the device may include:
  • the second display module 1301 is configured to display a target button in a graphical user interface, the target button corresponding to multiple control functions, and the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
  • the second determining module 1302 is configured to, when a touch operation on the target button is detected, determine, according to at least one of the type of the virtual prop controlled by the current virtual object, the motion state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function; and
  • the second execution module 1303 is configured to execute the target function in the graphical user interface based on the target mode.
  • the second determining module 1302 is configured to perform any one of the following:
  • the display mode switching function and the shooting function are determined as the target functions to be triggered, the target display mode switching mode is determined as the target mode corresponding to the display mode switching function, and the first shooting mode is determined as the target mode corresponding to the shooting function;
  • the display mode switching function and the shooting function are determined as the target functions to be triggered, the target display mode switching mode is determined as the target mode corresponding to the display mode switching function, and the second shooting mode is determined as the target mode corresponding to the shooting function; or
  • the shooting function is determined as the target function to be triggered, and the third shooting mode is determined as the target mode corresponding to the shooting function.
  • the second execution module 1303 is configured to execute any one of the following:
  • when the type of the virtual prop is the first type, based on the target display mode switching mode and the first shooting mode, the display mode of the virtual scene is switched from the first display mode to the second display mode when the start of the touch operation is detected, the shooting function is performed continuously during the duration of the touch operation, and the display mode of the virtual scene is switched from the second display mode to the first display mode when the end of the touch operation is detected;
  • when the type of the virtual prop is the second type, based on the target display mode switching mode and the second shooting mode, the display mode of the virtual scene is switched from the first display mode to the second display mode when the start of the touch operation is detected, and, when the end of the touch operation is detected, the shooting function is executed and the display mode of the virtual scene is switched from the second display mode to the first display mode; or
  • when the type of the virtual prop is the third type, based on the third shooting mode, the shooting function is executed when the start of the touch operation is detected.
  • the second determining module 1302 is used to:
  • obtain the shooting type of the virtual prop when the type of the virtual prop is the first type;
  • when the shooting type is the first shooting type, perform the step of determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining the first shooting mode as the target mode corresponding to the shooting function; and
  • when the shooting type is the second shooting type, determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine the second shooting mode as the target mode corresponding to the shooting function.
  • the second display module 1301 is configured to perform at least one of the following:
  • the target button is displayed at the target position in the graphical user interface;
  • the target button is displayed in the graphical user interface according to the target size; or
  • the target button is displayed in the graphical user interface according to the target transparency.
  • the device further includes an acquisition module configured to perform at least one of the following:
  • the position adjustment information of the target button is obtained based on the configuration interface, and the target position of the target button is obtained based on the position adjustment information;
  • the size adjustment information of the target button is obtained based on the configuration interface, and the target size of the target button is obtained based on the size adjustment information; or
  • the transparency adjustment information of the target button is obtained based on the configuration interface, and the target transparency of the target button is obtained based on the transparency adjustment information.
  • In a possible implementation, the second determining module 1302 is further configured to, when a touch operation on the target button is detected and the target button also corresponds to a viewing angle adjustment function, determine the viewing angle adjustment function as the target function to be triggered and determine the target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function; and
  • the second execution module 1303 is configured to, based on the target viewing angle adjustment mode and during the duration of the touch operation, adjust the viewing angle of the virtual scene based on the operation direction and operation speed of the touch operation.
  • In a possible implementation, the second determining module 1302 is configured to, when a touch operation on the target button is detected and the target button also corresponds to an action control function, determine the action control function as the target function to be triggered and determine the target action control mode as the target mode corresponding to the action control function; and
  • the second execution module 1303 is configured to perform any one of the following: controlling the current virtual object to perform the target action; or controlling the current virtual object to perform the target action and, when the end of the touch operation is detected, controlling the action of the current virtual object to revert to the action before the target action was performed.
  • In a possible implementation, the second determining module 1302 is configured to, when a touch operation on the target button is detected and the target button also corresponds to an action control function that includes multiple action control modes, determine the action control function as the target function to be triggered and determine, according to at least one of the motion state of the current virtual object or the environment of the current virtual object in the virtual scene, the target action control mode among the multiple action control modes as the target mode corresponding to the action control function; and
  • the second execution module 1303 is configured to perform any one of the following: controlling the current virtual object to execute the target action corresponding to the target action control mode; or controlling the current virtual object to execute the target action corresponding to the target action control mode and, when the end of the touch operation is detected, controlling the action of the current virtual object to revert to the action before the target action was performed.
  • the second display module 1301 is configured to execute the step of displaying the target button in the graphical user interface when it is determined according to the configuration information that the operation control function is in the on state.
  • The device provided by the embodiments of the present application can determine, according to multiple influencing factors, which target functions are to be triggered by an operation and in which modes these target functions should be executed, by associating multiple control functions with the same button.
  • Depending on the influencing factors, one operation on the target button realizes different operation control functions; the operation process is simple and convenient, and the operation control efficiency is high.
  • It should be noted that, when the operation control device provided in the above embodiments realizes operation control, the division into the above functional modules is used only as an example for illustration.
  • In practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above.
  • In addition, the operation control device provided in the above embodiments and the operation control method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which are not repeated here.
  • the above-mentioned electronic device may be provided as the terminal shown in FIG. 14 below, or may be provided as the server shown in FIG. 15 below, which is not limited in the embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
  • the terminal 1400 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer.
  • the terminal 1400 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
  • the terminal 1400 includes: one or more processors 1401 and one or more memories 1402.
  • the processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1401 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1401 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is used to render and draw the content that needs to be displayed on the display screen.
  • In some embodiments, the processor 1401 may further include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
  • the memory 1402 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, and the at least one instruction is executed by the processor 1401 to implement the operation control method provided by the method embodiments of the present application.
  • the terminal 1400 optionally further includes: a peripheral device interface 1403 and at least one peripheral device.
  • the processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1403 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1404, a display screen 1405, a camera 1406, an audio circuit 1407, a positioning component 1408, and a power supply 1409.
  • the peripheral device interface 1403 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402.
  • In some embodiments, the processor 1401, the memory 1402, and the peripheral device interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1404 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1404 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1404 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 1404 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: metropolitan area network, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area network and/or WiFi (Wireless Fidelity, wireless fidelity) network.
  • the radio frequency circuit 1404 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
  • the display screen 1405 is used for displaying UI (User Interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1405 also has the ability to collect touch signals on or above the surface of the display screen 1405.
  • the touch signal may be input to the processor 1401 as a control signal for processing.
  • the display screen 1405 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • In some embodiments, there may be one display screen 1405, provided on the front panel of the terminal 1400; in other embodiments, there may be at least two display screens 1405, respectively arranged on different surfaces of the terminal 1400 or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display screen arranged on a curved or folding surface of the terminal 1400. Moreover, the display screen 1405 may also be set in a non-rectangular irregular pattern, that is, a special-shaped screen.
  • the display screen 1405 can be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera assembly 1406 is used to capture images or videos.
  • the camera assembly 1406 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1406 may also include a flash.
  • the flash can be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, which can be used for light compensation under different color temperatures.
  • the audio circuit 1407 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals and input them to the processor 1401 for processing, or input to the radio frequency circuit 1404 to implement voice communication. For the purpose of stereo collection or noise reduction, there may be multiple microphones, which are respectively set in different parts of the terminal 1400.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is used to convert the electrical signal from the processor 1401 or the radio frequency circuit 1404 into sound waves.
  • the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
  • When the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1407 may also include a headphone jack.
  • the positioning component 1408 is used to locate the current geographic location of the terminal 1400 to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1409 is used to supply power to various components in the terminal 1400.
  • the power source 1409 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
  • the rechargeable battery may support wired charging or wireless charging.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 1400 further includes one or more sensors 1410.
  • the one or more sensors 1410 include, but are not limited to: an acceleration sensor 1411, a gyroscope sensor 1412, a pressure sensor 1413, a fingerprint sensor 1414, an optical sensor 1415, and a proximity sensor 1416.
  • the acceleration sensor 1411 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1400.
  • the acceleration sensor 1411 can be used to detect the components of the gravitational acceleration on three coordinate axes.
  • the processor 1401 may control the display screen 1405 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1411.
  • the acceleration sensor 1411 may also be used for the collection of game or user motion data.
  • the gyroscope sensor 1412 can detect the body direction and rotation angle of the terminal 1400, and the gyroscope sensor 1412 can cooperate with the acceleration sensor 1411 to collect the user's 3D actions on the terminal 1400. Based on the data collected by the gyroscope sensor 1412, the processor 1401 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1413 may be arranged on the side frame of the terminal 1400 and/or the lower layer of the display screen 1405.
  • the processor 1401 performs left and right hand recognition or quick operation according to the holding signal collected by the pressure sensor 1413.
  • the processor 1401 controls the operability controls on the UI interface according to the user's pressure operation on the display screen 1405.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1414 is used to collect the user's fingerprint.
  • the processor 1401 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user's identity according to the collected fingerprint.
  • When the user's identity is identified as a trusted identity, the processor 1401 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1414 may be provided on the front, back or side of the terminal 1400. When a physical button or a manufacturer logo is provided on the terminal 1400, the fingerprint sensor 1414 can be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1415 is used to collect the ambient light intensity.
  • the processor 1401 may control the display brightness of the display screen 1405 according to the ambient light intensity collected by the optical sensor 1415. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1405 is increased; when the ambient light intensity is low, the display brightness of the display screen 1405 is decreased.
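  • As a toy illustration of the brightness rule just described (brighter ambient light raises display brightness, dimmer light lowers it), with all thresholds assumed:

```typescript
// Map ambient light intensity (lux, clamped) to a display brightness in [0.2, 1.0].
function displayBrightness(ambientLux: number): number {
  const MIN = 0.2, MAX = 1.0, FULL_SCALE = 10000;
  const clamped = Math.min(Math.max(ambientLux, 0), FULL_SCALE);
  return MIN + (MAX - MIN) * (clamped / FULL_SCALE);
}
```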
  • the processor 1401 may also dynamically adjust the shooting parameters of the camera assembly 1406 according to the ambient light intensity collected by the optical sensor 1415.
  • the proximity sensor 1416, also called a distance sensor, is usually arranged on the front panel of the terminal 1400.
  • the proximity sensor 1416 is used to collect the distance between the user and the front of the terminal 1400.
  • When the proximity sensor 1416 detects that the distance between the user and the front of the terminal 1400 gradually decreases, the processor 1401 controls the display screen 1405 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1416 detects that the distance between the user and the front of the terminal 1400 gradually increases, the processor 1401 controls the display screen 1405 to switch from the off-screen state to the bright-screen state.
  • Those skilled in the art can understand that the structure shown in FIG. 14 does not constitute a limitation on the terminal 1400, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
  • FIG. 15 is a schematic structural diagram of a server provided by an embodiment of the present application. The server 1500 may vary considerably depending on configuration or performance, and may include one or more processors (central processing units, CPUs) 1501 and one or more memories 1502, where at least one instruction is stored in the one or more memories 1502 and is loaded and executed by the one or more processors 1501 to implement the operation control method provided by the foregoing method embodiments.
  • the server 1500 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output.
  • the server 1500 may also include other components for implementing device functions, which will not be repeated here.
  • In an exemplary embodiment, a computer-readable storage medium is also provided, such as a memory including instructions, which may be executed by a processor to complete the operation control method in the foregoing embodiments.
  • the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.

Abstract

An operation control method and apparatus, an electronic device, and a storage medium. The method includes: a terminal displays a target button in a graphical user interface (301); when a touch operation on the target button is detected, the terminal obtains the type of the virtual prop controlled by the current virtual object (302), and determines the target function to be triggered among the display mode switching function and the shooting function as well as the target mode corresponding to the target function; and, based on the target mode, the target function is executed in the graphical user interface.

Description

Operation control method and apparatus, electronic device, and storage medium
This application claims priority to Chinese Patent Application No. 201910290727.0, filed with the Chinese Patent Office on April 11, 2019 and entitled "Operation control method and apparatus, electronic device, and storage medium", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of computer technology, and in particular to an operation control method and apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology and the diversification of terminal functions, more and more types of games can be played on terminals. For example, shooting games can be played on a terminal.
At present, operation control methods for games usually implement a corresponding function according to a user's operation; for example, when an attack operation is received, a virtual object is controlled to attack. However, the operation process is often tedious and complicated and operation control efficiency is low, so an operation control method is urgently needed to solve the above problems of tedious, complicated, and inefficient operation.
Summary
According to various embodiments provided in this application, an operation control method and apparatus, an electronic device, and a storage medium are provided.
In one aspect, an operation control method is provided, executed by an electronic device, the method including:
displaying a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function;
when a touch operation on the target button is detected, determining, according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function; and
executing the target function in the graphical user interface based on the target mode.
In one aspect, an operation control method is provided, executed by an electronic device, the method including:
displaying a target button in a graphical user interface, the target button corresponding to multiple control functions, the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
when a touch operation on the target button is detected, determining, according to at least one of the type of the virtual prop controlled by the current virtual object, the motion state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function; and
executing the target function in the graphical user interface based on the target mode.
In one aspect, an operation control apparatus is provided, the apparatus including:
a display module, configured to display a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function;
a determining module, configured to determine, when a touch operation on the target button is detected and according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function; and
an execution module, configured to execute the target function in the graphical user interface based on the target mode.
In one aspect, an operation control apparatus is provided, the apparatus including:
a display module, configured to display a target button in a graphical user interface, the target button corresponding to multiple control functions, the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
a determining module, configured to determine, when a touch operation on the target button is detected and according to at least one of the type of the virtual prop controlled by the current virtual object, the motion state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function; and
an execution module, configured to execute the target function in the graphical user interface based on the target mode.
In one aspect, an electronic device is provided, including a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of the above operation control method.
In one aspect, one or more non-volatile storage media storing computer-readable instructions are provided, the computer-readable instructions, when executed by one or more processors, causing the one or more processors to perform the steps of the above operation control method.
The details of one or more embodiments of this application are set forth in the accompanying drawings and the description below. Other features, objectives, and advantages of this application will become apparent from the specification, the accompanying drawings, and the claims.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a schematic diagram of a display mode of a virtual scene provided by an embodiment of this application;
FIG. 2 is a schematic diagram of a display mode of a virtual scene provided by an embodiment of this application;
FIG. 3 is a flowchart of an operation control method provided by an embodiment of this application;
FIG. 4 is a schematic diagram of a display interface of a target button provided by an embodiment of this application;
FIG. 5 is a schematic diagram of a configuration interface provided by an embodiment of this application;
FIG. 6 is a schematic diagram of a function configuration interface provided by an embodiment of this application;
FIG. 7 is a schematic diagram of an interface during execution of a target function when the type of a virtual prop is the first type, provided by an embodiment of this application;
FIG. 8 is a schematic diagram of an interface after execution of a target function when the type of a virtual prop is the second type, provided by an embodiment of this application;
FIG. 9 is a schematic diagram of an interface during execution of a target function when the type of a virtual prop is the third type, provided by an embodiment of this application;
FIG. 10 is a flowchart of an operation control method provided by an embodiment of this application;
FIG. 11 is a flowchart of an operation control method provided by an embodiment of this application;
FIG. 12 is a schematic structural diagram of an operation control apparatus provided by an embodiment of this application;
FIG. 13 is a schematic structural diagram of an operation control apparatus provided by an embodiment of this application;
FIG. 14 is a schematic structural diagram of a terminal provided by an embodiment of this application; and
FIG. 15 is a schematic structural diagram of a server provided by an embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the following further describes the implementations of this application in detail with reference to the accompanying drawings.
The embodiments of this application mainly relate to electronic game or simulated training scenarios. Taking an electronic game scenario as an example, the user may perform an operation on the terminal in advance; after detecting the user's operation, the terminal may download a game configuration file of the electronic game, where the game configuration file may include the application program, interface display data, or virtual scene data of the electronic game, so that the user can invoke the game configuration file when logging in to the electronic game on the terminal to render and display the electronic game interface. The user may perform a touch operation on the terminal; after detecting the touch operation, the terminal may determine the game data corresponding to the touch operation and render and display the game data, where the game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
The virtual scene involved in this application may be used to simulate a three-dimensional virtual space or a two-dimensional virtual space, and the three-dimensional or two-dimensional virtual space may be an open space. The virtual scene may be used to simulate a real environment; for example, the virtual scene may include sky, land, and ocean, and the land may include environmental elements such as deserts and cities. The user may control a virtual object to move in the virtual scene. The virtual object may be a virtual avatar representing the user, or a virtual avatar representing a creature associated with the user, for example, a pet. The avatar may take any form, for example, a human or an animal, which is not limited in this application.
The virtual scene may also include other virtual objects; that is, the virtual scene may include multiple virtual objects, each of which has its own shape and volume in the virtual scene and occupies part of the space of the virtual scene. Taking a shooting game as an example, the user may control a virtual object to fall freely, glide, or open a parachute to descend in the sky of the virtual scene, and to run, jump, crawl, or walk bent over on land; the user may also control a virtual object to swim, float, or dive in the ocean. Of course, the user may also control a virtual object to ride a vehicle to move in the virtual scene; the above scenarios are used here only as examples, which are not specifically limited in the embodiments of this application. The user may also control a virtual object to fight other virtual objects with virtual props, where the virtual props may be cold weapons or hot weapons, which is not specifically limited in the embodiments of this application.
Generally, the terminal screen displays the view picture of the virtual object controlled by the current terminal, and the terminal screen may also display the aiming point of the virtual object controlled by the current terminal. The aiming point may be used to mark the aiming target in the view picture of the virtual object controlled by the current terminal, and the position of the aiming point in the virtual scene may serve as the attack impact point of the virtual object controlled by the current terminal.
Specifically, the aiming point may be displayed at the center of the terminal screen; of course, the aiming point may also be displayed at other positions, which is not specifically limited in the embodiments of this application. The aiming point may have multiple display styles and may be displayed in the system default style or adjusted according to the user's settings. Seeing the aiming point displayed on the terminal, the user can determine whether the position of the virtual scene corresponding to the current aiming point is the region he or she wants to aim at; if not, the user can adjust the viewing angle of the virtual scene through a viewing angle adjustment operation to adjust the region aimed at by the aiming point. Of course, the user usually wants to adjust the aiming point quickly and accurately onto another virtual object in the virtual scene, so as to shoot, slap, or punch the other virtual object.
The viewing angle adjustment operation may include multiple operation manners. In a possible implementation, the viewing angle adjustment operation may be a slide operation; upon detecting the slide operation, the terminal may determine, based on the slide direction, slide distance, and slide speed of the slide operation, the rotation direction, rotation angle, and rotation speed of the viewing angle corresponding to the slide operation. For example, the slide direction of the slide operation may correspond to the rotation direction of the viewing angle, the slide distance of the slide operation may be positively correlated with the rotation angle of the viewing angle, and, of course, the slide speed of the slide operation may also be positively correlated with the rotation speed of the viewing angle.
In another possible implementation, the viewing angle adjustment operation may also be a rotation operation on the terminal; when an angular velocity sensor (for example, a gyroscope) in the terminal detects the rotation operation, the rotation direction, rotation angle, and rotation speed of the viewing angle may be determined according to the rotation direction, rotation angle, and rotation speed of the rotation operation. For example, the rotation direction of the rotation operation may be the rotation direction of the viewing angle, the rotation angle of the rotation operation may be positively correlated with the rotation angle of the viewing angle, and the rotation speed of the rotation operation may be positively correlated with the rotation speed of the viewing angle. Of course, the viewing angle adjustment operation may also be a key operation, a drag operation on a virtual joystick region, a toggle operation on a real joystick device, or the like, which is not specifically limited in this application.
Of course, the viewing angle adjustment operation may also include other manners, for example, a gesture operation, which is not limited in the embodiments of this application. When the user controls a virtual object, different control effects may also be achieved through a combination of the above viewing angle adjustment operations; for example, the viewing angle adjustment operation is a slide operation, and during the slide operation the terminal detects the pressing force of the operation and decides, based on whether the pressing force is greater than a preset pressing force, whether to shoot. The above is only an exemplary description; how the above viewing angle adjustment operations are combined in specific implementations and which control effects can be achieved are not specifically limited in this application.
In the above electronic game scenario, a virtual object can usually control virtual props to fight other virtual objects. Some firearm props may also be equipped with sights, so that the virtual scene is observed through the sight. The sight may be a mechanical sight, which refers to an observation device originally equipped on the firearm prop; the sight may also be a sight subsequently fitted on the firearm prop, for example, a scope. The scope may have a magnification, which may be 1 or a value greater than 1. For example, the scope may be a red dot sight, a holographic sight, a 2x scope, a 4x scope, or an 8x scope, where the magnification of the red dot sight and the holographic sight is 1, and the magnification of the 2x, 4x, and 8x scopes is greater than 1. Of course, the magnification of the scope may also be another value; for example, the scope may also be a 3x, 6x, or 15x scope, and the magnification of the scope is not specifically limited in the embodiments of this application.
Generally, the sight is used to assist the virtual object in aiming and shooting. Therefore, when the virtual object controls a virtual prop to aim or shoot, the display mode of the virtual scene can be switched to the sight-based display mode, which makes it easier to aim at and shoot enemy virtual objects more accurately. For example, as shown in FIG. 1, the virtual scene is not in the sight-based display mode but in the first display mode. As shown in FIG. 2, when the user wants to control the virtual object to accurately shoot another virtual object appearing in the virtual scene, the display mode of the virtual scene is switched to the sight-based display mode, and the virtual scene can be observed through the sight on the virtual prop controlled by the virtual object.
FIG. 3 is a flowchart of an operation control method provided by an embodiment of this application. The method may be applied to an electronic device, which may be provided as a terminal or as a server, which is not limited in the embodiments of this application. The following embodiments are described using only the case where the operation control method is executed by a terminal as an example. Referring to FIG. 3, the method may include the following steps:
301. The terminal displays a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function.
In the embodiments of this application, the terminal may provide an operation control function and may provide a target button, and the target button may correspond to multiple control functions, so that the user can realize multiple control functions through one touch operation on the target button. Specifically, the target button may correspond to the display mode switching function and the shooting function.
The terminal may provide the target button as follows: the terminal may display the target button in the graphical user interface, so that the user can perform a touch operation on the target button. Specifically, when displaying the target button, the terminal may implement the display through at least one of the following steps one to three:
Step one: the terminal displays the target button at a target position in the graphical user interface.
Step two: the terminal displays the target button in the graphical user interface according to a target size.
Step three: the terminal displays the target button in the graphical user interface according to a target transparency.
That is, the display of the target button may involve at least one display parameter, and the at least one display parameter may include at least one of display position, size, or transparency, where the display position of the target button is the target position, the size is the target size, and the transparency is the target transparency. Through the setting of the above at least one display parameter, the display of the target button can be changed flexibly to meet actual needs.
In a possible implementation, at least one of the target position, target size, and target transparency may be set by relevant technical personnel according to requirements, which is not limited in the embodiments of this application. For example, the target position may be determined according to the contact position between the user's hand and the terminal screen when the user holds the terminal, and may be within a preset range of the contact position. For example, the target position may be set to a position that is easy for the user to tap, to reduce the complexity of the user's operation; for instance, as shown in FIG. 4, the target position of the target button 401 may be a position in the lower right region of the graphical user interface, exactly where the user's finger rests when holding the terminal. The target size may be set smaller and the target transparency larger, so that the target button does not excessively occlude the graphical user interface. That is, the target size may be smaller than the normal button size, and the target transparency may be greater than the normal transparency, where the normal button size is the average size of buttons and the normal transparency is the general transparency of buttons, or the normal transparency may be 0; all of these may be set by relevant technical personnel according to requirements, which is not limited in the embodiments of this application.
Target prompt information may also be displayed on the target button, and the target prompt information is used to indicate that the target button has both the display mode switching function and the shooting control function. For example, the target button may display the styles of an aiming point and a bullet, indicating that the target button can provide both the display mode switching function and the shooting control function.
In another possible implementation, at least one of the target position, target size, and target transparency may also be customized by the user according to personal usage habits. Specifically, the terminal may provide a configuration interface, and the user may make custom settings in the configuration interface to change the display of the target button. The setting process of the display may be implemented through at least one of the following steps one to three:
Step one: based on the configuration interface, the terminal obtains position adjustment information of the target button and, based on the position adjustment information, obtains the target position of the target button.
Step two: based on the configuration interface, the terminal obtains size adjustment information of the target button and, based on the size adjustment information, obtains the target size of the target button.
Step three: based on the configuration interface, the terminal obtains transparency adjustment information of the target button and, based on the transparency adjustment information, obtains the target transparency of the target button.
For example, as shown in FIG. 5, display parameter adjustment options of the target button may be provided in the configuration interface, namely button size, transparency, and the target button itself. The user may operate on these display parameter adjustment options, and the terminal may obtain the corresponding adjustment information according to the operations and adjust the display parameters of the target button based on the adjustment information. Specifically, the user may adjust the button size adjustment bar, and the terminal may adjust the size of the displayed target button based on the user's adjustment of the bar to provide a preview of the adjustment; for example, "179%" in FIG. 5 indicates that the size of the target button is 179% of the default size. Similarly, the user may adjust the transparency adjustment bar; for example, "100%" in FIG. 5 indicates that the transparency of the target button is 100%. When adjusting the target position of the target button, the user may drag the target button, and the terminal may adjust the position of the target button based on the drag operation on the target button. For example, the position of the target button may change with the position of the drag operation; in FIG. 5, the dashed position is the original position of the target button, and when a drag operation on the target button is received, the target button starts from the original position and follows the drag position as it changes.
In a specific possible embodiment, the terminal may display the target button in the configuration interface, and the user may perform a selection operation on the target button; when the terminal detects the selection operation on the target button, it may display the display parameter adjustment options of the target button. In a possible implementation, other buttons may also be provided in the configuration interface, so that the user can also set the display parameters of the other buttons, which is not limited in the embodiments of this application.
In a possible implementation, the state of the operation control function may include an on state and an off state, and the user may set the state of the operation control function to determine whether the terminal needs to provide the operation control function. Specifically, when it is determined according to the configuration information that the operation control function is in the on state, the terminal may perform the step of displaying the target button in the graphical user interface. That is, the terminal provides the operation control function when it determines that the operation control function is on; when it is determined according to the configuration information that the operation control function is in the off state, the terminal may not perform step 301.
The state of the operation control function may be set through a function configuration interface. Specifically, the terminal may set the state of the operation control function based on the function configuration interface and a state setting operation on the operation control function, that is, determine the configuration information of the operation control function based on the function configuration interface to determine the state of the operation control function. State setting options of the operation control function, for example an on option and an off option, may be provided in the function configuration interface, and the user may perform a touch operation on the state setting options to change the state of the operation control function. For example, as shown in FIG. 6, in a specific example, the target button may be called a "one-key scope-and-fire button", and on ("on") and off ("off") options may be set around the target button. The user may select the on or off option to change the use state of the target button, that is, to change the state of the operation control function. If the user selects the on option, the operation control function is in the on state; if the user selects the off option, the operation control function is in the off state.
302. When a touch operation on the target button is detected, the terminal obtains the type of the virtual prop controlled by the current virtual object; if the type is the first type, steps 303 and 304 are performed; if the type is the second type, steps 305 and 306 are performed; and if the type is the third type, steps 307 and 308 are performed.
When the target button is displayed in the graphical user interface, the user may perform a touch operation on the target button, so that the terminal can provide the corresponding operation control function based on the user's touch operation. In the embodiments of this application, when the type of the virtual prop controlled by the current virtual object differs, the operation control function provided by the terminal may also differ. Therefore, in step 302, the terminal may obtain the type of the virtual prop controlled by the current virtual object, so as to further determine the operation control function that needs to be provided according to the type. The type of the virtual prop may be set by relevant technical personnel according to requirements, may be determined based on the name of the virtual prop, or, of course, may be set by the user according to personal usage habits, which is not limited in the embodiments of this application.
In the embodiments of this application, the types of virtual props may include the first type, the second type, and the third type; when the type of the virtual prop differs, the terminal may respectively perform two of the following steps 303 to 308. That is, when the type of the virtual prop differs, the terminal may provide different control functions, and the modes of the provided control functions may also differ, where a mode indicates how the corresponding function is executed based on the touch operation. Here, only the case where the types of virtual props include three types is used as an example for description. Of course, the virtual props may also include a fourth type, and when the type of the virtual prop is the fourth type, the terminal may also perform other operation control functions; the type of the virtual prop is not limited in the embodiments of this application.
303、终端将该显示模式切换功能和该射击功能确定为该待触发的目标功能,将目标显示模式切换模式确定为该显示模式切换功能所对应的目标模式,将第一射击模式确定为该射击功能所对应的目标模式。
终端获取到虚拟道具的类型后,如果该类型为第一类型,则终端可以执行该步骤303,并基于该步骤303确定的目标功能和对应的目标模式,执行下述步骤304。在该虚拟道具的类型为第一类型时,终端可以确定既触发显示模式切换功能,也触发射击功能。虚拟场景的显示模式可以包括第一显示模式和第二显示模式,其中,第二显示模式可以为基于瞄具的显示模式,第一显示模式可以为该第二显示模式之外的显示模式。该显示模式切换功能即是指对该虚拟场景的显示模式进行切换。
对于该显示模式切换功能和射击功能,终端还可以分别确定二者的目标模式,终端确定显示模式切换功能所对应的目标模式为目标显示模式切换模式,射击功能所对应的目标模式为第一射击模式。二者的目标模式用于表示在该触控操作的不同阶段时如何执行对应的目标功能。其中,该触控操作可以包括多个阶段:该触控操作开始时、触控操作持续时的期间以及该触控操作结束时。
在一种可能实现方式中,该第一类型的虚拟道具可以包括多种射击类型,具体地,该第一类型的虚拟道具可以包括第一射击类型和第二射击类型,例如,在一个具体示例中,该第一射击类型可以为自动射击类型,该第二射击类型可以为单发射击类型,当然,该第一类型的虚拟道具还可以包括一种射击模式,例如,该第一类型的虚拟道具可以包括第一射击类型,也即是,自动射击类型。具体可以参见下述步骤304,本申请实施例在此先不作过多说明。
304、终端基于该目标显示模式切换模式和该第一射击模式,当检测到该触控操作开始时,将虚拟场景的显示模式从第一显示模式切换至第二显示模式,在该触控操作持续的期间持续执行射击功能,当检测到该触控操作结束时,将虚拟场景的显示模式从第二显示模式切换至第一显示模式。
终端确定了目标功能和对应的目标模式后,即可基于目标模式和触控操作来执行相应的目标功能。具体地,该目标显示模式切换模式用于表示在触控操作开始时将虚拟场景的显示模式从第一显示模式切换至第二显示模式,在触控操作结束时将虚拟场景的显示模式从第二显示模式切换至第一显示模式。该第一射击模式用于表示在触控操作中持续的期间持续执行射击功能。因而,终端可以执行该步骤304,实现该显示模式切换功能和射击功能。
例如,在此以触控操作为长按操作为例进行说明,以该虚拟道具为具有自动射击类型的射手步枪为例,用户可以对目标按钮进行长按操作,在该长按操作开始时,终端可以将虚拟场景的显示模式从第一显示模式切换至第二显示模式(基于瞄具的显示模式),如图7所示,图7中该虚拟场景的显示模式即为第二显示模式,也即是基于瞄具的显示模式,在该长按操作持续的过程中,终端可以持续执行射击功能,在用户结束长按操作时,终端可以停止执行射击功能,将虚拟场景的显示模式从第二显示模式切换至第一显示模式。在此称将显示模式从第一显示模式切换至第二显示模式的过程为“开镜”,称将显示模式从第二显示模式切换至第一显示模式的过程为“关镜”,因而,该步骤304中,终端可以在检测到触控操作开始时开镜,并持续射击,在触控操作结束时停止 射击并关镜。用户仅需要按一个目标按钮就可以完成“开镜-射击-关镜”的操作流程,极大的加快了用户在紧急情况下的瞄准射击体验。
在电子游戏中,在紧急的交战情况下,玩家经常没有足够的时间来进行多次操作,既完成显示模式切换又完成射击,从而可能在交火上容易落后而导致失败。而在本申请实施例中,通过对目标按钮进行一次操作,即可快速完成显示模式切换(瞄准)和射击流程,而且因为按钮操作简单便捷,可以有效提高玩家的游戏体验。
In a possible implementation, a first-type virtual prop may support multiple shooting types, and the mode of the operation control function the terminal needs to perform may differ per shooting type. Therefore, if the virtual prop is of the first type, the terminal may obtain the prop's current shooting type and, based on it, determine how to further execute the operation control function. Two shooting types are used as examples below:
When the shooting type is the first shooting type, the terminal may perform step 303 above, that is, perform the step of determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining the first shooting mode as the target mode corresponding to the shooting function. The terminal may then also perform step 304. The first shooting type indicates that the prop can fire continuously from one operation and may, for example, be called the automatic shooting type. So when the prop's shooting type is the first shooting type, the terminal may perform steps 303 and 304.
When the shooting type is the second shooting type, the terminal may determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine a second shooting mode as the target mode corresponding to the shooting function. The second shooting type indicates that the prop fires once per operation and may, for example, be called the single-shot type. When the prop's shooting type is the second shooting type, the prop cannot keep firing while the touch operation lasts. Unlike in step 303, the terminal determines the second shooting mode as the shooting function's target mode, so the way the terminal executes the shooting function in step 304 also differs: it may open the scope when the touch operation starts, and fire and close the scope when the touch operation ends.
Of course, in another possible implementation, the terminal may not obtain the prop's shooting type at all and may perform steps 303 and 304 regardless of whether the shooting type is the first or the second; that is, the shooting type of the virtual prop is not used as a factor in choosing the shooting mode. The embodiments of this application do not limit which approach is used.
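The choice between these two variants can be expressed as a small lookup, sketched here with invented identifiers:

```python
def shooting_mode_for_first_type(prop, respect_shooting_type: bool = True) -> str:
    if not respect_shooting_type:
        # Variant just described: ignore the shooting type altogether.
        return "continuous_while_held"
    if prop.shooting_type == "automatic":     # first shooting type
        return "continuous_while_held"        # first shooting mode (steps 303-304)
    return "single_shot_on_release"           # second shooting mode for single-shot props
```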
Steps 303 and 304 above are the operation control steps the terminal needs to perform when the virtual prop is of the first type. If the type is the second type, the terminal may perform steps 305 and 306 below; if the type is the third type, the terminal may perform steps 307 and 308 below.
305. The terminal determines the display mode switching function and the shooting function as the target functions to be triggered, determines the target display mode switching mode as the target mode corresponding to the display mode switching function, and determines the second shooting mode as the target mode corresponding to the shooting function.
After step 302, once the terminal has obtained the type of the virtual prop, if the type is the second type, the terminal may perform step 305 and, based on the target functions and corresponding target modes determined in step 305, perform step 306 below.
When the virtual prop is of the second type, the terminal may determine that both the display mode switching function and the shooting function are triggered. For these two functions, the terminal may also determine their respective target modes: the target display mode switching mode for the display mode switching function, and the second shooting mode for the shooting function. These target modes indicate how the corresponding target function is executed in the different phases of the touch operation.
In a possible implementation, a second-type virtual prop may use the second shooting mode, which indicates that the prop fires once per operation. The second shooting mode of a second-type prop can thus work in the same way as the second shooting mode used when a first-type prop is in the second shooting type in step 304 above. It is referred to here as the single-shot type.
306. Based on the target display mode switching mode and the second shooting mode, when the terminal detects that the touch operation starts, it switches the display mode of the virtual scene from the first display mode to the second display mode; when it detects that the touch operation ends, it executes the shooting function and switches the display mode of the virtual scene from the second display mode back to the first display mode.
Having determined the target functions and their corresponding target modes, the terminal can execute the target functions based on those modes and the touch operation. Specifically, the target display mode switching mode works in the same way as in step 304 above, and the second shooting mode indicates executing the shooting function when the touch operation ends. The terminal therefore performs step 306 to implement the display mode switching function and the shooting function.
For example, take the touch operation to be a long press and the virtual prop controlled by the virtual object to be a sniper rifle. The user may long-press the target button; when the long press starts, the terminal may switch the display mode of the virtual scene from the first display mode to the second display mode (the scope-based display mode); when the user ends the long press, the terminal may execute the shooting function and switch the display mode of the virtual scene from the second display mode back to the first. Here, switching the display mode to the scope-based display mode is called "opening the scope", and switching it from the scope-based display mode back to the first display mode is called "closing the scope". As shown in Figure 8, in step 306 the terminal may open the scope when it detects the start of the touch operation, and fire and close the scope when the touch operation ends. Figure 8 shows only the virtual scene after the scope is closed at the end of the touch operation; the other stages are not shown, and the scope-based display mode of the virtual scene can be seen in Figure 7.
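Step 306 differs from step 304 only in when the shot happens; a sketch under the same hypothetical `game` facade:

```python
class ScopeFireSecondType:
    """Scope on at touch start; one shot plus scope off at touch end."""

    def __init__(self, game):
        self.game = game

    def on_touch_start(self):
        self.game.open_scope()    # first -> second (scope-based) display mode

    def on_touch_end(self):
        self.game.fire()          # the single shot is taken at release
        self.game.close_scope()   # second -> first display mode
```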
Steps 305 and 306 above are the operation control steps the terminal needs to perform when the virtual prop is of the second type. If the type is the third type, the terminal may perform steps 307 and 308 below.
307. The terminal determines the shooting function as the target function to be triggered, and determines a third shooting mode as the target mode corresponding to the shooting function.
After step 302, once the terminal has obtained the type of the virtual prop, if the type is the third type, the terminal may perform step 307 and, based on the target function and corresponding target mode determined in step 307, perform step 308 below.
When the virtual prop is of the third type, the terminal may trigger the shooting function without triggering the display mode switching function. The terminal may therefore also determine the target mode of the shooting function, namely the third shooting mode, which indicates in which phase of the touch operation the shooting function is executed.
308. Based on the third shooting mode, when the terminal detects that the touch operation starts, it executes the shooting function.
Having determined the target function and its corresponding target mode, the terminal can execute the target function based on that mode and the touch operation. In step 308, the terminal may execute the shooting function based on the third shooting mode. In a possible implementation, the third shooting mode indicates executing the shooting function when the touch operation starts; the terminal therefore performs step 308 to implement this shooting mode.
For example, take the touch operation to be a tap and the virtual prop controlled by the virtual object to be a designated marksman rifle without an automatic shooting type. The user may tap the target button, and when the tap starts, the terminal may execute the shooting function. If the touch operation is a long press, the terminal may execute the shooting function when the touch operation starts and not execute it again for the remainder of the operation. As shown in Figure 9, in step 308 the terminal may fire when it detects the start of the touch operation. During the touch operation, the terminal does not perform the scope-opening and scope-closing processes described above.
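For completeness, a sketch of step 308 with the same assumed facade:

```python
class FireThirdType:
    """One shot at touch start; the display mode is never switched."""

    def __init__(self, game):
        self.game = game

    def on_touch_start(self):
        self.game.fire()          # fire immediately when the touch begins

    def on_touch_end(self):
        pass                      # no scope to close, nothing to restore
```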
It should be noted that steps 302, 303, 305, and 307 above form the process of, when a touch operation on the target button is detected, determining from the type of the virtual prop controlled by the current virtual object the target function to be triggered among the display mode switching function and the shooting function, as well as the target mode corresponding to that target function; steps 304, 306, and 308 form the process of executing the target function in the graphical user interface based on the target mode. When the prop types differ, the target functions and corresponding target modes the terminal determines may differ, and the operation control steps the terminal needs to perform may accordingly differ as well.
The above process classifies virtual props by type and provides different operation control flows for different types of props rather than a single uniform flow. This avoids making the shooting feel of most firearms wrong, and so avoids hurting the player in a firefight and losing more than is gained. The operation control flow is flexible and convenient and can meet the needs of different shooting scenarios. Moreover, multiple operation control flows are achieved through a single target button, providing a smooth operating experience that imposes no operating pressure on the user but instead simplifies operation; and the user can customize the settings according to personal usage habits, tuning the button's functions to whatever feels best, thereby improving the user experience.
The above covers only the case where the virtual prop types include the first type, the second type, and the third type. In a possible implementation, the virtual props may also include other types. All of the above may be set by relevant technical personnel as required or, of course, set by users according to their own usage habits.
In a possible implementation, the terminal may provide an operation control function configuration interface, which may include control function options and mode options. The user may perform selection operations among the control function options and setting operations among the mode options, and the terminal may, based on those selection or setting operations, update the control functions corresponding to the target button and the target modes corresponding to those control functions. Of course, the operation control function configuration interface may also provide type-setting options for virtual props, in which the user may set the types of virtual props and which props belong to each type. All of the above may be customized by relevant technical personnel according to their own usage habits; this is not limited in the embodiments of this application.
In a possible implementation, the flow of the above operation control method may also provide a viewing angle adjustment function through the target button: when the user performs a touch operation on the target button, the terminal may also adjust the viewing angle of the virtual scene based on that operation. Specifically, the target button may also correspond to the viewing angle adjustment function. In steps 302, 303, 305, and 307 above, when a touch operation on the target button is detected, the terminal may additionally determine the viewing angle adjustment function as a target function to be triggered and determine a target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function. In steps 304, 306, and 308, based on the target viewing angle adjustment mode, while the touch operation lasts, the terminal may also adjust the viewing angle of the virtual scene based on the operation direction and operation speed of the touch operation.
In a specific embodiment, during the above viewing angle adjustment process, a target rotation speed of the virtual scene's viewing angle may be obtained based on the operation direction and operation speed of the touch operation; the target rotation speed has both a direction and a magnitude, and the terminal may adjust the viewing angle of the virtual scene based on it.
For example, take the touch operation to be a combination of a long press and a slide. If the virtual prop is of the first type, when the touch operation starts, the terminal may switch the display mode of the virtual scene from the first display mode to the second display mode and continuously execute the shooting function while the operation lasts. When the user wants to adjust the viewing angle, the user may slide while holding down the target button, or drag the target button, and the terminal may adjust the viewing angle based on the operation direction and speed. As the viewing angle of the virtual scene is adjusted, the position in the virtual scene targeted by the aiming point also changes, so during the adjustment the user can shift the aimed position while continuing to fire, until the touch operation ends, at which point the terminal stops firing and switches back to the first display mode.
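A sketch of turning the touch operation's direction and speed into a target rotation speed follows; the sensitivity constant and the camera interface are invented for the example:

```python
import math

def adjust_view_angle(camera, dx: float, dy: float, speed: float,
                      sensitivity: float = 0.1) -> None:
    """Map the drag's direction and speed to a rotation with direction and magnitude."""
    direction = math.atan2(dy, dx)          # operation direction
    magnitude = speed * sensitivity         # operation speed -> rotation speed
    camera.yaw += magnitude * math.cos(direction)
    camera.pitch += magnitude * math.sin(direction)
```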
For another example, if the virtual prop is of the second type, when the touch operation starts, the terminal switches the display mode of the virtual scene from the first display mode to the second display mode; while the operation lasts, the user may adjust the viewing angle to change the aimed position; when the user then stops the touch operation, the terminal may fire at the position currently targeted by the aiming point and switch back to the first display mode.
For yet another example, if the virtual prop is of the third type, the terminal may fire when the touch operation starts and, as the operation continues, no longer fire but instead adjust the viewing angle based on the touch operation.
In a possible implementation, the flow of the above operation control method may also provide an action control function through the target button: when the user performs a touch operation on the target button, the terminal may also control the virtual object to perform a corresponding action based on that operation.
Specifically, the target button may also correspond to the action control function. In steps 302, 303, 305, and 307 above, when a touch operation on the target button is detected, the terminal may additionally determine the action control function as a target function to be triggered and determine a target action control mode as the target mode corresponding to the action control function. In steps 304, 306, and 308, the terminal may additionally perform either of the following:
Based on the target action control mode, when detecting that the touch operation starts, the terminal controls the current virtual object to perform a target action.
Based on the target action control mode, when detecting that the touch operation starts, the terminal controls the current virtual object to perform the target action; when detecting that the touch operation ends, the terminal controls the current virtual object to revert to the action it was performing before the target action.
For example, the target action may be a lean, a prone action, a crouch, a jump, or the like. Taking the lean as an example, when the touch operation starts, the terminal may control the virtual object to lean while performing the other actions in steps 304, 306, or 308, and may also perform the viewing angle adjustment step above; when the touch operation ends, the terminal may control the virtual object to return from the leaning state to its previous state.
In another possible implementation, the flow of the above operation control method may also provide action control functions for multiple actions: when the user performs a touch operation on the target button, the terminal may control the virtual object to perform the corresponding action based on that operation.
Specifically, the target button also corresponds to the action control function, which may include action control modes for multiple actions; each action control mode may correspond to one action of the virtual object, or to one control case of one action of the virtual object. In step 302 above, when a touch operation on the target button is detected, the terminal may additionally obtain at least one of the movement state of the current virtual object and the environment of the current virtual object in the virtual scene, using it as the basis for judgment. In steps 303, 305, and 307, the terminal may determine the action control function as a target function to be triggered and, according to at least one of the movement state of the current virtual object and the environment of the current virtual object in the virtual scene, determine a target action control mode among the multiple action control modes as the target mode corresponding to the action control function. In steps 304, 306, and 308, the terminal may additionally perform either of the following:
Based on the target action control mode, when detecting that the touch operation starts, the terminal controls the current virtual object to perform the target action corresponding to the target action control mode.
Based on the target action control mode, when detecting that the touch operation starts, the terminal controls the current virtual object to perform the target action corresponding to the target action control mode; when detecting that the touch operation ends, the terminal controls the current virtual object to revert to the action it was performing before the target action.
In this implementation, the action control function may correspond to action control modes for multiple actions, and the terminal may use the current virtual object's movement state, or its environment in the virtual scene, or both, to determine which target action the virtual object is to perform this time and under which action mode to control it, and then control the virtual object to perform the corresponding action. For example, the multiple actions may include lean, crouch, and prone. When the virtual object is near a rock in the virtual scene, the terminal may determine the lean action's control mode as the target action control mode and subsequently control the virtual object to lean. When the virtual object is in a crawling state, the terminal may determine the crouch action's control mode as the target action control mode and subsequently control the virtual object to crouch.
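A sketch of such a selection rule follows; the mapping itself is illustrative only, as the next paragraph notes:

```python
def pick_action_control_mode(movement_state: str, environment: str) -> str:
    """Choose a target action control mode from state and/or surroundings."""
    if environment == "near_rock":
        return "lean"        # lean mode, as in the rock example above
    if movement_state == "crawling":
        return "crouch"      # crouch mode, as in the crawling example above
    return "lean"            # some configurable default
```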
Only two specific examples are given here. It should be noted that the multiple action control modes, and the correspondence between an action control mode and either the virtual object's movement state or its environment in the virtual scene, may all be set by relevant technical personnel as required, or customized by users according to their own usage habits; this is not limited in the embodiments of this application.
The specific flow of the above operation control method is explained below with a specific example through Figure 10. Referring to Figure 10, the operation control function is here called the "one-tap scope-and-fire function". The terminal may check whether the one-tap scope-and-fire function is enabled. If it is enabled, the terminal may read the position, size, and transparency data of the one-tap scope-and-fire button from the custom panel (configuration interface) and apply them; that is, the terminal may check the state of the operation control function and, if the function is enabled, display the target button in the graphical user interface, the target button having three display parameters that can be configured in the configuration interface. The terminal may detect the type of firearm the player is holding, that is, obtain the type of virtual prop controlled by the current virtual object. If the type is the first type, for example a marksman rifle with a fully automatic shooting type, the terminal may open the scope when the user presses the target button, fire while the button is held, and close the scope when the user releases it. If the type is the second type, for example a sniper rifle without an automatic shooting type, the terminal may open the scope when the user presses the button, and close the scope and fire when the user releases it. If the type is the third type, for example a designated marksman rifle without an automatic shooting type, the terminal may fire when the user presses the button. At this point the terminal has completed this round of the one-tap scope-and-fire function and watches for the player's next operation. If the one-tap scope-and-fire function is disabled, the terminal may check frame by frame afterwards whether the player enables the function.
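Figure 10's loop could be sketched as a per-frame check like the following; all names are hypothetical, and `touch` is assumed to expose the phase of the press:

```python
def one_tap_scope_fire_frame(player, touch) -> None:
    if not player.settings.get("one_tap_scope_fire_enabled", False):
        return                               # keep checking each frame until enabled
    weapon_type = player.current_weapon.type
    if weapon_type == "auto_marksman_rifle":     # first type
        if touch.just_started:
            player.open_scope()
        if touch.held:
            player.fire()
        if touch.just_ended:
            player.close_scope()
    elif weapon_type == "sniper_rifle":          # second type
        if touch.just_started:
            player.open_scope()
        if touch.just_ended:                     # close the scope and fire on release
            player.close_scope()
            player.fire()
    elif weapon_type == "marksman_rifle":        # third type
        if touch.just_started:
            player.fire()
```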
In the embodiments of this application, a target button corresponding to the display mode switching function and the shooting function is displayed in the graphical user interface. When a touch operation on the target button is detected, which function or functions the operation is to trigger, and which mode governs how the corresponding function is executed this time, can be determined from the type of virtual prop controlled by the virtual object. By mapping both the display mode switching function and the shooting function to the same button, a single touch operation on one button achieves multiple operation control functions, and one operation can achieve different operation control functions depending on the prop type. The operation process is simple and convenient, and operation control is efficient.
Figure 11 is a flowchart of an operation control method provided by an embodiment of this application. Referring to Figure 11, the method may include the following steps:
1101. The terminal displays a target button in a graphical user interface, the target button corresponding to multiple control functions, the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function.
Step 1101 works in the same way as step 301 above, and the display process of the target button may likewise include at least one of the following steps one to three:
Step one: the terminal displays the target button at a target position in the graphical user interface.
Step two: the terminal displays the target button at a target size in the graphical user interface.
Step three: the terminal displays the target button at a target transparency in the graphical user interface.
At least one of the target position, the target size, and the target transparency may be set by relevant technical personnel as required, or set by the user according to personal habit. Specifically, the user-configured setting may also include at least one of the following steps one to three:
Step one: based on a configuration interface, the terminal obtains position adjustment information of the target button and, based on the position adjustment information, obtains the target position of the target button.
Step two: based on a configuration interface, the terminal obtains size adjustment information of the target button and, based on the size adjustment information, obtains the target size of the target button.
Step three: based on a configuration interface, the terminal obtains transparency adjustment information of the target button and, based on the transparency adjustment information, obtains the target transparency of the target button.
For all of the above, refer to step 301; details are not repeated here in the embodiments of this application.
1102. When a touch operation on the target button is detected, the terminal determines, according to at least one of the type of virtual prop controlled by the current virtual object, the movement state of the current virtual object, or the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function.
Step 1102 works in the same way as the content shown in steps 302 and 303, or in steps 302 and 305, or in steps 302 and 307 above. The difference is that those steps only exemplify determining, from the type of virtual prop controlled by the current virtual object, the target function to be triggered among the button's multiple control functions and the target mode corresponding to it. The terminal may also determine the target function and its corresponding target mode based on the movement state of the current virtual object, or the environment of the current virtual object in the virtual scene, or any combination of these three influencing factors, for example the process described above of determining the target function and target mode according to the movement state or the environment; the embodiments of this application do not enumerate these cases one by one.
It should be noted that which one or more of the three influencing factors is used, the correspondence between influencing factors and control functions, and the target modes corresponding to the target functions may all be set by relevant technical personnel as required, or customized by users according to their own usage habits; this is not limited in the embodiments of this application.
1103. The terminal executes the target function in the graphical user interface based on the target mode.
Step 1103 works in the same way as steps 304, 306, and 308 above and is not elaborated here in the embodiments of this application. The examples of the three types in the embodiment shown in Figure 3, and every implementation there, apply equally in this embodiment and are not repeated here.
In the embodiments of this application, a target button corresponding to multiple control functions is displayed in the graphical user interface. When a touch operation on the target button is detected, which target functions the operation is to trigger, and in which modes those target functions are to be executed, can be determined from multiple influencing factors. By associating multiple control functions with the same button, a single touch operation on one button achieves multiple operation control functions, and depending on the influencing factors, one operation on the target button can achieve different operation control functions. The operation process is simple and convenient, and operation control is efficient.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of this application, which are not described one by one here.
It should be understood that the steps in the embodiments of this application are not necessarily performed in the order indicated by their step numbers. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Figure 12 is a schematic structural diagram of an operation control apparatus provided by an embodiment of this application. Referring to Figure 12, the apparatus may include:
a first display module 1201, configured to display a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function;
a first determining module 1202, configured to: when a touch operation on the target button is detected, determine, according to the type of virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function; and
a first execution module 1203, configured to execute the target function in the graphical user interface based on the target mode.
In a possible implementation, the first determining module 1202 is configured to perform any one of the following:
if the type of the virtual prop is a first type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining a target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a first shooting mode as the target mode corresponding to the shooting function;
if the type of the virtual prop is a second type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining a target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a second shooting mode as the target mode corresponding to the shooting function;
if the type of the virtual prop is a third type, determining the shooting function as the target function to be triggered and determining a third shooting mode as the target mode corresponding to the shooting function.
In a possible implementation, the first execution module 1203 is configured to perform any one of the following:
if the type of the virtual prop is the first type, based on the target display mode switching mode and the first shooting mode, switching the display mode of the virtual scene from a first display mode to a second display mode when detecting that the touch operation starts, continuously executing the shooting function while the touch operation lasts, and switching the display mode of the virtual scene from the second display mode back to the first display mode when detecting that the touch operation ends;
if the type of the virtual prop is the second type, based on the target display mode switching mode and the second shooting mode, switching the display mode of the virtual scene from the first display mode to the second display mode when detecting that the touch operation starts, and executing the shooting function and switching the display mode of the virtual scene from the second display mode back to the first display mode when detecting that the touch operation ends;
if the type of the virtual prop is the third type, based on the third shooting mode, executing the shooting function when detecting that the touch operation starts.
In a possible implementation, the first determining module 1202 is configured to:
if the type of the virtual prop is the first type, obtain the current shooting type of the virtual prop;
when the shooting type is a first shooting type, perform the step of determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining the first shooting mode as the target mode corresponding to the shooting function;
when the shooting type is a second shooting type, determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine the second shooting mode as the target mode corresponding to the shooting function.
In a possible implementation, the first display module 1201 is configured to perform at least one of the following:
displaying the target button at a target position in the graphical user interface;
displaying the target button at a target size in the graphical user interface;
displaying the target button at a target transparency in the graphical user interface.
In a possible implementation, the apparatus further includes an obtaining module, configured to perform at least one of the following:
obtaining, based on a configuration interface, position adjustment information of the target button, and obtaining, based on the position adjustment information, the target position of the target button;
obtaining, based on a configuration interface, size adjustment information of the target button, and obtaining, based on the size adjustment information, the target size of the target button;
obtaining, based on a configuration interface, transparency adjustment information of the target button, and obtaining, based on the transparency adjustment information, the target transparency of the target button.
In a possible implementation, the target button further corresponds to a viewing angle adjustment function, and the first determining module 1202 is further configured to: when a touch operation on the target button is detected, determine the viewing angle adjustment function as a target function to be triggered and determine a target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function;
the first execution module 1203 is further configured to adjust, based on the target viewing angle adjustment mode and while the touch operation lasts, the viewing angle of the virtual scene based on the operation direction and operation speed of the touch operation.
In a possible implementation, the target button further corresponds to an action control function, and the first determining module 1202 is further configured to: when a touch operation on the target button is detected, determine the action control function as a target function to be triggered and determine a target action control mode as the target mode corresponding to the action control function;
the first execution module 1203 is further configured to perform either of the following:
based on the target action control mode, when detecting that the touch operation starts, controlling the current virtual object to perform a target action;
based on the target action control mode, when detecting that the touch operation starts, controlling the current virtual object to perform the target action; and when detecting that the touch operation ends, controlling the current virtual object to revert to the action it was performing before the target action.
In a possible implementation, the target button further corresponds to an action control function, and the first determining module 1202 is further configured to: when a touch operation on the target button is detected, determine the action control function as a target function to be triggered and, according to at least one of the movement state of the current virtual object and the environment of the current virtual object in the virtual scene, determine a target action control mode among multiple action control modes as the target mode corresponding to the action control function;
the first execution module 1203 is further configured to perform either of the following:
based on the target action control mode, when detecting that the touch operation starts, controlling the current virtual object to perform the target action corresponding to the target action control mode;
based on the target action control mode, when detecting that the touch operation starts, controlling the current virtual object to perform the target action corresponding to the target action control mode; and when detecting that the touch operation ends, controlling the current virtual object to revert to the action it was performing before the target action.
In a possible implementation, the first display module 1201 is configured to perform the step of displaying the target button in the graphical user interface when it is determined from configuration information that the operation control function is enabled.
With the apparatus provided by the embodiments of this application, a target button corresponding to the display mode switching function and the shooting function is displayed in the graphical user interface. When a touch operation on the target button is detected, which function or functions the operation is to trigger, and which mode governs how the corresponding function is executed this time, can be determined from the type of virtual prop controlled by the virtual object. By mapping both the display mode switching function and the shooting function to the same button, a single touch operation on one button achieves multiple operation control functions, and one operation can achieve different operation control functions depending on the prop type. The operation process is simple and convenient, and operation control is efficient.
It should be noted that, when the operation control apparatus provided by the foregoing embodiments performs operation control, the division into the above functional modules is used merely as an example for description. In practical applications, the above functions may be assigned to different functional modules as needed; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above. In addition, the operation control apparatus provided by the foregoing embodiments and the operation control method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which are not repeated here.
Figure 13 is a schematic structural diagram of an operation control apparatus provided by an embodiment of this application. Referring to Figure 13, the apparatus may include:
a second display module 1301, configured to display a target button in a graphical user interface, the target button corresponding to multiple control functions, the multiple control functions including at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
a second determining module 1302, configured to: when a touch operation on the target button is detected, determine, according to at least one of the type of virtual prop controlled by the current virtual object, the movement state of the current virtual object, and the environment of the current virtual object in the virtual scene, the target function to be triggered among the multiple control functions and the target mode corresponding to the target function; and
a second execution module 1303, configured to execute the target function in the graphical user interface based on the target mode.
In a possible implementation, when the target button corresponds to the display mode switching function and the shooting function, the second determining module 1302 is configured to perform any one of the following:
if the type of the virtual prop is a first type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining a target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a first shooting mode as the target mode corresponding to the shooting function;
if the type of the virtual prop is a second type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining a target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a second shooting mode as the target mode corresponding to the shooting function;
if the type of the virtual prop is a third type, determining the shooting function as the target function to be triggered and determining a third shooting mode as the target mode corresponding to the shooting function.
In a possible implementation, the second execution module 1303 is configured to perform any one of the following:
if the type of the virtual prop is the first type, based on the target display mode switching mode and the first shooting mode, switching the display mode of the virtual scene from a first display mode to a second display mode when detecting that the touch operation starts, continuously executing the shooting function while the touch operation lasts, and switching the display mode of the virtual scene from the second display mode back to the first display mode when detecting that the touch operation ends;
if the type of the virtual prop is the second type, based on the target display mode switching mode and the second shooting mode, switching the display mode of the virtual scene from the first display mode to the second display mode when detecting that the touch operation starts, and executing the shooting function and switching the display mode of the virtual scene from the second display mode back to the first display mode when detecting that the touch operation ends;
if the type of the virtual prop is the third type, based on the third shooting mode, executing the shooting function when detecting that the touch operation starts.
In a possible implementation, the second determining module 1302 is configured to:
if the type of the virtual prop is the first type, obtain the current shooting type of the virtual prop;
when the shooting type is a first shooting type, perform the step of determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining the first shooting mode as the target mode corresponding to the shooting function;
when the shooting type is a second shooting type, determine the display mode switching function and the shooting function as the target functions to be triggered, determine the target display mode switching mode as the target mode corresponding to the display mode switching function, and determine the second shooting mode as the target mode corresponding to the shooting function.
In a possible implementation, the second display module 1301 is configured to perform at least one of the following:
displaying the target button at a target position in the graphical user interface;
displaying the target button at a target size in the graphical user interface;
displaying the target button at a target transparency in the graphical user interface.
In a possible implementation, the apparatus further includes an obtaining module, configured to perform at least one of the following:
obtaining, based on a configuration interface, position adjustment information of the target button, and obtaining, based on the position adjustment information, the target position of the target button;
obtaining, based on a configuration interface, size adjustment information of the target button, and obtaining, based on the size adjustment information, the target size of the target button;
obtaining, based on a configuration interface, transparency adjustment information of the target button, and obtaining, based on the transparency adjustment information, the target transparency of the target button.
In a possible implementation, the second determining module 1302 is further configured to: when a touch operation on the target button is detected and the target button corresponds to a viewing angle adjustment function, determine the viewing angle adjustment function as the target function to be triggered and determine a target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function;
the second execution module 1303 is configured to adjust, based on the target viewing angle adjustment mode and while the touch operation lasts, the viewing angle of the virtual scene based on the operation direction and operation speed of the touch operation.
In a possible implementation, the second determining module 1302 is configured to: when a touch operation on the target button is detected and the target button corresponds to an action control function, determine the action control function as the target function to be triggered and determine a target action control mode as the target mode corresponding to the action control function;
the second execution module 1303 is configured to perform at least one of the following:
based on the target action control mode, when detecting that the touch operation starts, controlling the current virtual object to perform a target action;
when detecting that the touch operation ends, controlling the current virtual object to revert to the action it was performing before the target action.
In a possible implementation, the second determining module 1302 is configured to: when a touch operation on the target button is detected and the target button corresponds to an action control function, determine the action control function as the target function to be triggered and, according to at least one of the movement state of the current virtual object and the environment of the current virtual object in the virtual scene, determine a target action control mode among multiple action control modes as the target mode corresponding to the action control function;
the second execution module 1303 is configured to perform at least one of the following:
based on the target action control mode, when detecting that the touch operation starts, controlling the current virtual object to perform the target action corresponding to the target action control mode;
when detecting that the touch operation ends, controlling the current virtual object to revert to the action it was performing before the target action.
In a possible implementation, the second display module 1301 is configured to perform the step of displaying the target button in the graphical user interface when it is determined from configuration information that the operation control function is enabled.
With the apparatus provided by the embodiments of this application, a target button corresponding to multiple control functions is displayed in the graphical user interface. When a touch operation on the target button is detected, which target functions the operation is to trigger, and in which modes those target functions are to be executed, can be determined from multiple influencing factors. By associating multiple control functions with the same button, a single touch operation on one button achieves multiple operation control functions, and depending on the influencing factors, one operation on the target button can achieve different operation control functions. The operation process is simple and convenient, and operation control is efficient.
It should be noted that, when the operation control apparatus provided by the foregoing embodiments performs operation control, the division into the above functional modules is used merely as an example for description. In practical applications, the above functions may be assigned to different functional modules as needed; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above. In addition, the operation control apparatus provided by the foregoing embodiments and the operation control method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which are not repeated here.
The above electronic device may be provided as the terminal described in Figure 14 below, or as the server shown in Figure 15 below; this is not limited in the embodiments of this application.
Figure 14 is a schematic structural diagram of a terminal provided by an embodiment of this application. The terminal 1400 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1400 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
Generally, the terminal 1400 includes one or more processors 1401 and one or more memories 1402.
The processor 1401 may include one or more processing cores, for example a 4-core processor or an 8-core processor. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1401 may also include a main processor and a coprocessor. The main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1401 may further include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 1402 may include one or more computer-readable storage media, which may be non-transitory. The memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, which is executed by the processor 1401 to implement the operation control method provided by the method embodiments of this application.
In some embodiments, the terminal 1400 may optionally further include a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by a bus or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 by a bus, a signal line, or a circuit board. Specifically, the peripheral devices include at least one of a radio frequency circuit 1404, a display screen 1405, a camera 1406, an audio circuit 1407, a positioning component 1408, and a power supply 1409.
The peripheral device interface 1403 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, the memory 1402, and the peripheral device interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1404 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1404 communicates with communication networks and other communication devices through electromagnetic signals, converting electrical signals into electromagnetic signals for transmission, or converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1404 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. The radio frequency circuit 1404 may communicate with other terminals through at least one wireless communication protocol, including but not limited to metropolitan area networks, the various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1405 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to collect touch signals on or above its surface, which may be input to the processor 1401 as control signals for processing. In this case, the display screen 1405 may also be used to provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1405, arranged on the front panel of the terminal 1400; in other embodiments, there may be at least two display screens 1405, arranged on different surfaces of the terminal 1400 or in a folding design; in still other embodiments, the display screen 1405 may be a flexible display arranged on a curved or folded surface of the terminal 1400. The display screen 1405 may even be set in a non-rectangular irregular shape, that is, a shaped screen. The display screen 1405 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera component 1406 is used to capture images or video. Optionally, the camera component 1406 includes a front camera and a rear camera. Generally, the front camera is arranged on the front panel of the terminal and the rear camera on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, or a telephoto camera, so that the main camera and the depth-of-field camera are fused for the background blur function, the main camera and the wide-angle camera are fused for panoramic shooting and VR (Virtual Reality) shooting functions, or other fused shooting functions are achieved. In some embodiments, the camera component 1406 may also include a flash, which may be a single color temperature flash or a dual color temperature flash; a dual color temperature flash is a combination of a warm flash and a cold flash and may be used for light compensation at different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment and convert them into electrical signals that are input to the processor 1401 for processing, or input to the radio frequency circuit 1404 for voice communication. For stereo capture or noise reduction, there may be multiple microphones arranged at different parts of the terminal 1400. The microphone may also be an array microphone or an omnidirectional collection microphone. The speaker is used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The speaker may be a conventional film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic position of the terminal 1400 for navigation or LBS (Location Based Service). The positioning component 1408 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1409 is used to supply power to the components of the terminal 1400. The power supply 1409 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1409 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast-charging technology.
In some embodiments, the terminal 1400 further includes one or more sensors 1410, including but not limited to an acceleration sensor 1411, a gyroscope sensor 1412, a pressure sensor 1413, a fingerprint sensor 1414, an optical sensor 1415, and a proximity sensor 1416.
The acceleration sensor 1411 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 1400; for example, the acceleration sensor 1411 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1401 may, based on the gravitational acceleration signals collected by the acceleration sensor 1411, control the display screen 1405 to display the user interface in landscape or portrait view. The acceleration sensor 1411 may also be used to collect motion data for games or for the user.
The gyroscope sensor 1412 can detect the body direction and rotation angle of the terminal 1400 and may cooperate with the acceleration sensor 1411 to collect the user's 3D actions on the terminal 1400. Based on the data collected by the gyroscope sensor 1412, the processor 1401 can implement functions such as motion sensing (for example, changing the UI according to the user's tilt operations), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1413 may be arranged on the side frame of the terminal 1400 and/or the lower layer of the display screen 1405. When the pressure sensor 1413 is arranged on the side frame of the terminal 1400, it can detect the user's grip signal on the terminal 1400, and the processor 1401 performs left/right hand recognition or quick operations based on the grip signal collected by the pressure sensor 1413. When the pressure sensor 1413 is arranged at the lower layer of the display screen 1405, the processor 1401 controls the operable controls on the UI according to the user's pressure operations on the display screen 1405. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1414 is used to collect the user's fingerprint, and the processor 1401 identifies the user from the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user from the collected fingerprint. When the user's identity is recognized as trusted, the processor 1401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1414 may be arranged on the front, back, or side of the terminal 1400. When a physical button or a manufacturer logo is provided on the terminal 1400, the fingerprint sensor 1414 may be integrated with the physical button or the manufacturer logo.
The optical sensor 1415 is used to collect the ambient light intensity. In one embodiment, the processor 1401 may control the display brightness of the display screen 1405 according to the ambient light intensity collected by the optical sensor 1415. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1405 is turned up; when the ambient light intensity is low, the display brightness of the display screen 1405 is turned down. In another embodiment, the processor 1401 may also dynamically adjust the shooting parameters of the camera component 1406 according to the ambient light intensity collected by the optical sensor 1415.
The proximity sensor 1416, also called a distance sensor, is usually arranged on the front panel of the terminal 1400 and is used to collect the distance between the user and the front of the terminal 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front of the terminal 1400 is gradually decreasing, the processor 1401 controls the display screen 1405 to switch from the screen-on state to the screen-off state; when the proximity sensor 1416 detects that this distance is gradually increasing, the processor 1401 controls the display screen 1405 to switch from the screen-off state back to the screen-on state.
Those skilled in the art will understand that the structure shown in Figure 14 does not limit the terminal 1400, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
Figure 15 is a schematic structural diagram of a server provided by an embodiment of this application. The server 1500 may vary greatly depending on configuration or performance, and may include one or more processors (central processing units, CPU) 1501 and one or more memories 1502, where the one or more memories 1502 store at least one instruction that is loaded and executed by the one or more processors 1501 to implement the operation control methods provided by the foregoing method embodiments. Of course, the server 1500 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and may further include other components for implementing device functions, which are not described here.
In an exemplary embodiment, a computer-readable storage medium is also provided, for example a memory including instructions, where the instructions can be executed by a processor to complete the operation control methods in the foregoing embodiments. For example, the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
Those of ordinary skill in the art will understand that all or part of the steps of the foregoing embodiments may be completed by hardware, or by a program instructing the relevant hardware, where the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. The foregoing are merely preferred embodiments of this application and are not intended to limit this application; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall be included within the protection scope of this application.

Claims (19)

  1. An operation control method, performed by an electronic device, the method comprising:
    displaying a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function;
    when a touch operation on the target button is detected, determining, according to a type of a virtual prop controlled by a current virtual object, a target function to be triggered among the display mode switching function and the shooting function and a target mode corresponding to the target function; and
    executing the target function in the graphical user interface based on the target mode.
  2. The method according to claim 1, wherein the determining, according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function comprises:
    if the type of the virtual prop is a first type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining a target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a first shooting mode as the target mode corresponding to the shooting function.
  3. The method according to claim 2, wherein the executing the target function in the graphical user interface based on the target mode comprises:
    if the type of the virtual prop is the first type, based on the target display mode switching mode and the first shooting mode, switching a display mode of a virtual scene from a first display mode to a second display mode when it is detected that the touch operation starts, continuously executing the shooting function while the touch operation lasts, and switching the display mode of the virtual scene from the second display mode to the first display mode when it is detected that the touch operation ends.
  4. The method according to claim 2, wherein the determining, according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function comprises:
    if the type of the virtual prop is the first type, obtaining a current shooting type of the virtual prop;
    when the shooting type is a first shooting type, performing the step of determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining the first shooting mode as the target mode corresponding to the shooting function; and
    when the shooting type is a second shooting type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining the target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a second shooting mode as the target mode corresponding to the shooting function.
  5. The method according to claim 1, wherein the determining, according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function comprises:
    if the type of the virtual prop is a second type, determining the display mode switching function and the shooting function as the target functions to be triggered, determining a target display mode switching mode as the target mode corresponding to the display mode switching function, and determining a second shooting mode as the target mode corresponding to the shooting function.
  6. The method according to claim 5, wherein the executing the target function in the graphical user interface based on the target mode comprises:
    if the type of the virtual prop is the second type, based on the target display mode switching mode and the second shooting mode, switching a display mode of a virtual scene from a first display mode to a second display mode when it is detected that the touch operation starts, and executing the shooting function and switching the display mode of the virtual scene from the second display mode to the first display mode when it is detected that the touch operation ends.
  7. The method according to claim 1, wherein the determining, according to the type of the virtual prop controlled by the current virtual object, the target function to be triggered among the display mode switching function and the shooting function and the target mode corresponding to the target function comprises:
    if the type of the virtual prop is a third type, determining the shooting function as the target function to be triggered, and determining a third shooting mode as the target mode corresponding to the shooting function.
  8. The method according to claim 7, wherein the executing the target function in the graphical user interface based on the target mode comprises:
    if the type of the virtual prop is the third type, based on the third shooting mode, executing the shooting function when it is detected that the touch operation starts.
  9. The method according to claim 1, wherein the displaying a target button in a graphical user interface comprises at least one of the following:
    displaying the target button at a target position in the graphical user interface;
    displaying the target button at a target size in the graphical user interface; or
    displaying the target button at a target transparency in the graphical user interface.
  10. The method according to claim 9, wherein the method further comprises at least one of the following:
    obtaining, based on a configuration interface, position adjustment information of the target button, and obtaining, based on the position adjustment information, the target position of the target button;
    obtaining, based on a configuration interface, size adjustment information of the target button, and obtaining, based on the size adjustment information, the target size of the target button; or
    obtaining, based on a configuration interface, transparency adjustment information of the target button, and obtaining, based on the transparency adjustment information, the target transparency of the target button.
  11. The method according to claim 1, wherein the target button further corresponds to a viewing angle adjustment function, and the method further comprises:
    when the touch operation on the target button is detected, determining the viewing angle adjustment function as the target function to be triggered, and determining a target viewing angle adjustment mode as the target mode corresponding to the viewing angle adjustment function;
    the executing the target function in the graphical user interface based on the target mode further comprises:
    adjusting, based on the target viewing angle adjustment mode and while the touch operation lasts, a viewing angle of a virtual scene based on an operation direction and an operation speed of the touch operation.
  12. The method according to claim 1, wherein the target button further corresponds to an action control function, and the method further comprises:
    when the touch operation on the target button is detected, determining the action control function as the target function to be triggered, and determining a target action control mode as the target mode corresponding to the action control function;
    the executing the target function in the graphical user interface based on the target mode further comprises at least one of the following:
    based on the target action control mode, controlling the current virtual object to perform a target action when it is detected that the touch operation starts; or
    controlling, when it is detected that the touch operation ends, the current virtual object to revert to the action performed before the target action.
  13. The method according to claim 1, wherein the target button further corresponds to an action control function, and the method further comprises:
    when the touch operation on the target button is detected, determining the action control function as the target function to be triggered, and determining, according to at least one of a movement state of the current virtual object or an environment of the current virtual object in a virtual scene, a target action control mode among a plurality of action control modes as the target mode corresponding to the action control function;
    the executing the target function in the graphical user interface based on the target mode further comprises at least one of the following:
    based on the target action control mode, controlling the current virtual object to perform a target action corresponding to the target action control mode when it is detected that the touch operation starts; or
    controlling, when it is detected that the touch operation ends, the current virtual object to revert to the action performed before the target action.
  14. The method according to claim 1, wherein the displaying a target button in a graphical user interface comprises:
    performing the step of displaying the target button in the graphical user interface when it is determined according to configuration information that an operation control function is enabled.
  15. An operation control method, performed by an electronic device, the method comprising:
    displaying a target button in a graphical user interface, the target button corresponding to a plurality of control functions, the plurality of control functions comprising at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
    when a touch operation on the target button is detected, determining, according to at least one of a type of a virtual prop controlled by a current virtual object, a movement state of the current virtual object, or an environment of the current virtual object in a virtual scene, a target function to be triggered among the plurality of control functions and a target mode corresponding to the target function; and
    executing the target function in the graphical user interface based on the target mode.
  16. An operation control apparatus, the apparatus comprising:
    a display module, configured to display a target button in a graphical user interface, the target button corresponding to a display mode switching function and a shooting function;
    a determining module, configured to: when a touch operation on the target button is detected, determine, according to a type of a virtual prop controlled by a current virtual object, a target function to be triggered among the display mode switching function and the shooting function and a target mode corresponding to the target function; and
    an execution module, configured to execute the target function in the graphical user interface based on the target mode.
  17. An operation control apparatus, the apparatus comprising:
    a display module, configured to display a target button in a graphical user interface, the target button corresponding to a plurality of control functions, the plurality of control functions comprising at least two of a display mode switching function, a shooting function, an action control function, or a viewing angle adjustment function;
    a determining module, configured to: when a touch operation on the target button is detected, determine, according to at least one of a type of a virtual prop controlled by a current virtual object, a movement state of the current virtual object, and an environment of the current virtual object in a virtual scene, a target function to be triggered among the plurality of control functions and a target mode corresponding to the target function; and
    an execution module, configured to execute the target function in the graphical user interface based on the target mode.
  18. An electronic device, comprising a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of the operation control method according to any one of claims 1 to 15.
  19. One or more non-volatile storage media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of the operation control method according to any one of claims 1 to 15.
PCT/CN2020/079706 2019-04-11 2020-03-17 操作控制方法、装置、电子设备及存储介质 WO2020207206A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
KR1020217017252A KR102578242B1 (ko) 2019-04-11 2020-03-17 조작 제어 방법 및 장치, 전자 기기, 및 저장 매체
AU2020256524A AU2020256524A1 (en) 2019-04-11 2020-03-17 Operation control method and apparatus, and electronic device and storage medium
SG11202104911TA SG11202104911TA (en) 2019-04-11 2020-03-17 Operation control method and apparatus, and electronic device and storage medium
CA3132897A CA3132897A1 (en) 2019-04-11 2020-03-17 Operation control method and apparatus, and electronic device and storage medium
BR112021019455A BR112021019455A2 (pt) 2019-04-11 2020-03-17 Método e aparelho de controle de operação, e dispositivo eletrônico e meio de armazenamento
JP2021531040A JP7231737B2 (ja) 2019-04-11 2020-03-17 動作制御方法、装置、電子機器およびプログラム
US17/317,853 US20210260479A1 (en) 2019-04-11 2021-05-11 Operation control method and apparatus, electronic device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910290727.0A CN110141869A (zh) 2019-04-11 2019-04-11 操作控制方法、装置、电子设备及存储介质
CN201910290727.0 2019-04-11

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/317,853 Continuation US20210260479A1 (en) 2019-04-11 2021-05-11 Operation control method and apparatus, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
WO2020207206A1 true WO2020207206A1 (zh) 2020-10-15

Family

ID=67588909

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/079706 WO2020207206A1 (zh) 2019-04-11 2020-03-17 操作控制方法、装置、电子设备及存储介质

Country Status (9)

Country Link
US (1) US20210260479A1 (zh)
JP (1) JP7231737B2 (zh)
KR (1) KR102578242B1 (zh)
CN (1) CN110141869A (zh)
AU (1) AU2020256524A1 (zh)
BR (1) BR112021019455A2 (zh)
CA (1) CA3132897A1 (zh)
SG (1) SG11202104911TA (zh)
WO (1) WO2020207206A1 (zh)


Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110141869A (zh) * 2019-04-11 2019-08-20 腾讯科技(深圳)有限公司 操作控制方法、装置、电子设备及存储介质
CN110339562B (zh) * 2019-08-01 2023-09-15 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、终端及存储介质
CN110448908B (zh) * 2019-08-22 2020-12-01 腾讯科技(深圳)有限公司 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质
CN110665229B (zh) * 2019-10-11 2021-08-20 腾讯科技(深圳)有限公司 一种射击游戏中元素交互的方法以及相关装置
CN110975289B (zh) * 2019-11-14 2021-10-15 腾讯科技(深圳)有限公司 射击模式切换控制方法和装置、存储介质及电子装置
CN111068330B (zh) * 2019-11-21 2021-07-27 腾讯科技(深圳)有限公司 虚拟攻击道具的处理方法和装置、存储介质及电子装置
CN111359214B (zh) * 2020-03-05 2021-05-11 腾讯科技(深圳)有限公司 虚拟道具控制方法和装置、存储介质及电子装置
CN111589114B (zh) * 2020-05-12 2023-03-10 腾讯科技(深圳)有限公司 虚拟对象的选择方法、装置、终端及存储介质
CN111632380A (zh) * 2020-05-28 2020-09-08 腾讯科技(深圳)有限公司 虚拟姿态切换方法、装置、存储介质及电子装置
EP3992766A4 (en) * 2020-09-11 2022-08-24 Tencent Technology (Shenzhen) Company Limited METHOD AND APPARATUS FOR ADJUSTING A CONTROL POSITION IN AN APPLICATION PROGRAM, DEVICE, AND RECORDING MEDIA
CN112316428A (zh) * 2020-10-27 2021-02-05 腾讯科技(深圳)有限公司 一种虚拟道具的处理方法、装置及计算机可读存储介质
US11534681B2 (en) * 2020-10-29 2022-12-27 Google Llc Virtual console gaming controller
CN112354181B (zh) * 2020-11-30 2022-12-30 腾讯科技(深圳)有限公司 开镜画面展示方法、装置、计算机设备及存储介质
CN112791398A (zh) * 2021-02-03 2021-05-14 网易(杭州)网络有限公司 游戏中虚拟倍镜的控制方法、装置、电子设备及存储介质
CN113318430A (zh) * 2021-05-28 2021-08-31 网易(杭州)网络有限公司 虚拟角色的姿态调整方法、装置、处理器及电子装置
CN113546425B (zh) * 2021-07-27 2024-03-01 网易(杭州)网络有限公司 游戏中虚拟物品处理方法、装置、终端和存储介质
CN113687761B (zh) * 2021-08-24 2023-08-08 网易(杭州)网络有限公司 游戏控制方法及装置、电子设备、存储介质
KR102354559B1 (ko) * 2021-08-24 2022-01-21 한국기술교육대학교 산학협력단 콘텐츠 제어용 다종 인터페이스 장치
KR102646689B1 (ko) * 2021-10-20 2024-03-12 (주)크래프톤 모바일 게임을 위한 사용자 인터페이스 제공 방법 및 이를 적용한 디바이스
CN113926181A (zh) * 2021-10-21 2022-01-14 腾讯科技(深圳)有限公司 虚拟场景的对象控制方法、装置及电子设备
CN113986079B (zh) * 2021-10-28 2023-07-14 腾讯科技(深圳)有限公司 虚拟按钮的设置方法和装置、存储介质及电子设备
CN114371898B (zh) * 2021-12-10 2022-11-22 北京城市网邻信息技术有限公司 信息展示方法、设备、装置及存储介质
CN114217708B (zh) * 2021-12-15 2023-05-26 腾讯科技(深圳)有限公司 虚拟场景中开局操作的控制方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140274239A1 (en) * 2013-03-12 2014-09-18 Fourthirtythree Inc. Computer readable medium recording shooting game
CN106293070A (zh) * 2016-07-27 2017-01-04 网易(杭州)网络有限公司 虚拟角色视角方向控制方法及装置
CN109091869A (zh) * 2018-08-10 2018-12-28 腾讯科技(深圳)有限公司 虚拟对象的动作控制方法、装置、计算机设备及存储介质
CN109589601A (zh) * 2018-12-10 2019-04-09 网易(杭州)网络有限公司 虚拟瞄准镜控制方法及装置、电子设备和存储介质
CN110141869A (zh) * 2019-04-11 2019-08-20 腾讯科技(深圳)有限公司 操作控制方法、装置、电子设备及存储介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9561436B2 (en) 2013-02-26 2017-02-07 Gree, Inc. Shooting game control method and game system
JP5727655B1 (ja) 2014-09-17 2015-06-03 株式会社Pgユニバース 情報処理装置、情報処理方法及びプログラム
CN105688409A (zh) 2016-01-27 2016-06-22 网易(杭州)网络有限公司 游戏控制方法及装置
US9919213B2 (en) 2016-05-03 2018-03-20 Hothead Games Inc. Zoom controls for virtual environment user interfaces
KR20180116870A (ko) * 2017-04-18 2018-10-26 주식회사 드래곤플라이 게임 장치 및 컴퓨터 프로그램
CN108525294B (zh) * 2018-04-04 2021-07-27 网易(杭州)网络有限公司 射击游戏的控制方法和装置
CN108553891A (zh) * 2018-04-27 2018-09-21 腾讯科技(深圳)有限公司 对象瞄准方法和装置、存储介质及电子装置
CN108771863B (zh) * 2018-06-11 2022-04-15 网易(杭州)网络有限公司 射击游戏的控制方法和装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140274239A1 (en) * 2013-03-12 2014-09-18 Fourthirtythree Inc. Computer readable medium recording shooting game
CN106293070A (zh) * 2016-07-27 2017-01-04 网易(杭州)网络有限公司 虚拟角色视角方向控制方法及装置
CN109091869A (zh) * 2018-08-10 2018-12-28 腾讯科技(深圳)有限公司 虚拟对象的动作控制方法、装置、计算机设备及存储介质
CN109589601A (zh) * 2018-12-10 2019-04-09 网易(杭州)网络有限公司 虚拟瞄准镜控制方法及装置、电子设备和存储介质
CN110141869A (zh) * 2019-04-11 2019-08-20 腾讯科技(深圳)有限公司 操作控制方法、装置、电子设备及存储介质

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BAIDU JINGYAN: "How to Enable Peek and Fire Mode in PUBG Mobile", 19 March 2019 (2019-03-19), Retrieved from the Internet <URL:https://jingyan.baidu.com/article/6dad50754008d7a123e36ef7.html> *
CHIJIYOUXI XIAOXIAZI: "PUBG Mobile: New Fire Mode Launched in the Latest Version, Even Beginners Can be Winners!", 18 March 2019 (2019-03-18), Retrieved from the Internet <URL:https://baijiahao.baidu.com/s?id=1628346935618351582&wfr=spider&for=pc> *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4119210A4 (en) * 2020-11-19 2023-11-01 Tencent Technology (Shenzhen) Company Limited STATE CHANGE METHOD AND APPARATUS IN A VIRTUAL SCENE, APPARATUS, MEDIUM AND PROGRAM PRODUCT
CN114100134A (zh) * 2021-10-21 2022-03-01 腾讯科技(深圳)有限公司 虚拟场景的画面展示方法、装置、设备、介质及程序产品

Also Published As

Publication number Publication date
JP7231737B2 (ja) 2023-03-01
AU2020256524A1 (en) 2021-10-07
KR102578242B1 (ko) 2023-09-12
BR112021019455A2 (pt) 2021-11-30
US20210260479A1 (en) 2021-08-26
KR20210086705A (ko) 2021-07-08
SG11202104911TA (en) 2021-06-29
CA3132897A1 (en) 2020-10-15
CN110141869A (zh) 2019-08-20
JP2022509295A (ja) 2022-01-20

Similar Documents

Publication Publication Date Title
WO2020207206A1 (zh) 操作控制方法、装置、电子设备及存储介质
CN109350964B (zh) 控制虚拟角色的方法、装置、设备及存储介质
JP7476109B2 (ja) 仮想オブジェクトと仮想シーンとのインタラクションの制御方法、装置、端末及びコンピュータプログラム
WO2019214402A1 (zh) 虚拟环境中的配件切换方法、装置、设备及存储介质
TWI796777B (zh) 虛擬物品的控制方法、裝置、終端及儲存媒體
WO2021017783A1 (zh) 视角转动的方法、装置、设备及存储介质
CN111589132A (zh) 虚拟道具展示方法、计算机设备及存储介质
WO2020151594A1 (zh) 视角转动的方法、装置、设备及存储介质
WO2020125340A1 (zh) 控制信息处理方法、装置、电子设备及存储介质
WO2021031765A1 (zh) 虚拟环境中瞄准镜的应用方法和相关装置
WO2022237076A1 (zh) 虚拟对象的控制方法、装置、设备及计算机可读存储介质
WO2022089152A1 (zh) 确定选中目标的方法、装置、设备及存储介质
CN112121438B (zh) 操作提示方法、装置、终端及存储介质
CN110152309B (zh) 语音通信方法、装置、电子设备及存储介质
RU2787649C1 (ru) Способ и устройство для управления операциями, электронное устройство и носитель данных
JP7413563B2 (ja) 仮想オブジェクトの制御方法、装置、機器及びコンピュータプログラム
CN113633976A (zh) 操作控制方法、装置、设备及计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20788013

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021531040

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217017252

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 3132897

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2020256524

Country of ref document: AU

Date of ref document: 20200317

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112021019455

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112021019455

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20210928

122 Ep: pct application non-entry in european phase

Ref document number: 20788013

Country of ref document: EP

Kind code of ref document: A1