WO2024012010A1 - Virtual object control method, apparatus, device, storage medium, and program product - Google Patents


Info

Publication number
WO2024012010A1
Authority
WO
WIPO (PCT)
Prior art keywords
trigger
trigger area
virtual object
area
sliding operation
Prior art date
Application number
PCT/CN2023/091178
Other languages
English (en)
French (fr)
Inventor
刘智洪
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2024012010A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Mapping of input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/428 Mapping of input signals into game commands involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A63F2300/80 Features specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Definitions

  • Embodiments of the present application relate to the fields of computer and Internet technologies, and in particular to a virtual object control method, device, equipment, storage medium, and program product.
  • the moving direction of the virtual object can be controlled by the direction of the sliding operation of the finger.
  • Embodiments of the present application provide a virtual object control method, device, equipment, storage medium, and program product.
  • the technical solutions are as follows:
  • a method for controlling a virtual object is provided.
  • the method is executed by a terminal device, and the method includes:
  • the user interface displays a joystick control for controlling the movement of virtual objects, the joystick control has multiple trigger areas, and different trigger areas correspond to different sensitivities;
  • in response to a first sliding operation whose starting position is located in a first trigger area, the first moving speed of the virtual object is determined according to the sensitivity of the first trigger area; the first trigger area is one of the plurality of trigger areas;
  • the virtual object is controlled to move at the first moving speed.
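The first claimed method can be sketched end to end: look up the speed from the trigger area in which the sliding operation started, then move the object at that speed. The sketch below is illustrative only; the `TriggerArea` type, the function names, and the linear sensitivity-to-speed rule are assumptions, since the claims leave the exact relation between sensitivity and speed open.

```python
from dataclasses import dataclass

@dataclass
class TriggerArea:
    name: str
    sensitivity: float  # movement sensitivity of this trigger area

def speed_for(area: TriggerArea) -> float:
    # Assumed rule for illustration: sensitivity n -> n m/s.
    return float(area.sensitivity)

def first_moving_speed(areas, start_area_name):
    # Determine the moving speed from the trigger area in which the
    # first sliding operation started.
    area = next(a for a in areas if a.name == start_area_name)
    return speed_for(area)
```

With two areas of sensitivity 10 and 20, starting the slide in the second area would yield twice the speed of the first.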
  • a method for controlling a virtual object is provided.
  • the method is executed by a terminal device, and the method includes:
  • a joystick control for controlling the movement of virtual objects is displayed in the user interface, and the joystick control has multiple trigger areas;
  • the first trigger area and the second trigger area are each one of the plurality of trigger areas; the first trigger area and the second trigger area are different, and the first moving speed is different from the second moving speed.
  • a device for controlling a virtual object includes:
  • An interface display module is used to display a user interface.
  • the user interface displays a joystick control for controlling the movement of virtual objects.
  • the joystick control has multiple trigger areas, and different trigger areas correspond to different sensitivities;
  • a speed determination module configured to determine the first moving speed of the virtual object according to the sensitivity of the first trigger area in response to the first sliding operation with the starting position located in the first trigger area; the first trigger area is one of the plurality of trigger areas;
  • a movement control module used to control the virtual object to move at the first movement speed.
  • a device for controlling a virtual object includes:
  • An interface display module is used to display a user interface.
  • the user interface displays a joystick control for controlling the movement of virtual objects, and the joystick control has multiple trigger areas;
  • a movement control module configured to control the virtual object to move at a first movement speed in response to the first sliding operation with the starting position located in the first trigger area
  • the movement control module is also configured to control the virtual object to move at a second movement speed in response to the second sliding operation with the starting position located in the second trigger area;
  • the first trigger area and the second trigger area are each one of the plurality of trigger areas; the first trigger area and the second trigger area are different, and the first moving speed and the second moving speed are different.
  • a terminal device includes a processor and a memory.
  • a computer program is stored in the memory. The computer program is loaded and executed by the processor to implement the above method.
  • a computer-readable storage medium in which a computer program is stored, and the computer program is loaded and executed by a processor to implement the above method.
  • a computer program product includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the terminal device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the terminal device performs the above method.
  • the moving speed of the virtual object is determined based on the sensitivity of the trigger area in which the starting position of the user's sliding operation falls, and the virtual object is controlled to move at that speed. Users can begin sliding operations in trigger areas of different sensitivities according to their needs, making it possible to control virtual objects to move at different speeds, enriching the control methods of virtual objects, and improving the efficiency of human-computer interaction.
  • Figure 1 is a schematic diagram of a solution implementation environment provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a virtual object control method provided by an embodiment of the present application.
  • Figure 3 is a flow chart of a virtual object control method provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 6 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 7 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 8 is a flow chart of a virtual object control method provided by another embodiment of the present application.
  • Figure 9 is a flow chart of a virtual object control method provided by another embodiment of the present application.
  • Figure 10 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 11 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 12 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 13 is a flow chart of a virtual object control method provided by another embodiment of the present application.
  • Figure 14 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 15 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 16 is a flow chart of a virtual object control method provided by another embodiment of the present application.
  • Figure 17 is a schematic diagram of a user interface provided by another embodiment of the present application.
  • Figure 18 is a flow chart of a virtual object control method provided by another embodiment of the present application.
  • Figure 19 is a block diagram of a virtual object control device provided by an embodiment of the present application.
  • Figure 20 is a block diagram of a virtual object control device provided by another embodiment of the present application.
  • Figure 21 is a block diagram of a virtual object control device provided by another embodiment of the present application.
  • Figure 22 is a structural block diagram of a terminal device provided by an embodiment of the present application.
  • Figure 1 shows a schematic diagram of a solution implementation environment provided by an embodiment of the present application.
  • the implementation environment of this solution can be implemented as a virtual object control system.
  • the solution implementation environment may include: terminal device 10 and server 20 .
  • the terminal device 10 may be an electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, a wearable device, a PC (Personal Computer), a vehicle-mounted terminal, etc.
  • a client of a target application (such as a game application) can be installed in the terminal device 10 .
  • the target application may be an application that needs to be downloaded and installed, or may be a click-and-use application, which is not limited in the embodiments of the present application.
  • the target application may be a shooting application, a racing application, a multiplayer online tactical competitive game, etc.
  • This application does not limit this.
  • the above-mentioned target application may be a shooting application, which provides a virtual environment in which virtual objects operated by the user can move.
  • shooting applications can be TPS (third-person shooting) games, FPS (first-person shooting) games, MOBA (Multiplayer Online Battle Arena) games, multiplayer gunfight survival games, virtual reality (VR) shooting applications, augmented reality (AR) applications, three-dimensional map programs, social applications, interactive entertainment applications, etc.
  • the terminal device 10 runs a client of the above application program.
  • the above-mentioned application is an application developed based on a three-dimensional virtual environment engine.
  • the virtual environment engine is the Unity engine.
  • the virtual environment engine can build a three-dimensional virtual environment, virtual objects, virtual props, etc., providing users with a more immersive gaming experience.
  • the above-mentioned virtual environment is a scene displayed (or provided) when the client of the target application (such as a game application) is running on the terminal device.
  • the virtual environment refers to a scene created for virtual objects to carry out activities (such as game matches), for example a virtual house, a virtual island, or a virtual map.
  • the virtual environment can be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
  • the virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in the embodiments of the present application.
  • the above-mentioned virtual objects refer to virtual characters, virtual vehicles, virtual items, etc. controlled by the user account in the target application, and this application does not limit this.
  • the virtual object refers to the game character controlled by the user account in the game application.
  • the virtual object may be in the form of a character, an animal, a cartoon, or other forms, which are not limited in the embodiments of the present application.
  • the virtual object can be displayed in a three-dimensional form or in a two-dimensional form, which is not limited in the embodiments of the present application.
  • the virtual object is a three-dimensional model created based on animation skeleton technology.
  • Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
  • the virtual object is a virtual vehicle in the virtual environment, such as a virtual car, a virtual hot air balloon, a virtual motorcycle, and other virtual props that can be controlled by the user.
  • the server 20 is used to provide background services for clients that install and run target applications in the terminal device 10 .
  • the server 20 may be a backend server of the above-mentioned game application.
  • the server 20 may be one server, a server cluster composed of multiple servers, or a cloud computing service center.
  • the server 20 provides background services for target applications in multiple terminal devices 10 at the same time.
  • the terminal device 10 and the server 20 can communicate with each other through the network.
  • FIG. 2 shows a schematic diagram of a virtual object control method provided by an embodiment of the present application.
  • a user interface is displayed on the terminal device 10 as shown in Figure 1, and multiple trigger areas are provided on the user interface, namely trigger area z1, trigger area z2, and trigger area z3.
  • the three trigger areas correspond to different sensitivities respectively.
  • the sensitivity of the trigger area is related to the moving speed of the virtual object.
  • the moving speed of the virtual object is determined according to the sensitivity of the trigger area z1, and the movement of the virtual object is controlled at this speed.
  • the technical solution provided by the embodiment of this application sets multiple trigger areas on the user interface. Different trigger areas correspond to different sensitivities, and different sensitivities correspond to different moving speeds of virtual objects.
  • the moving speed of the virtual object is determined according to the sensitivity of the target trigger area, and the virtual object is controlled to move at this moving speed.
  • the settings for the trigger area include but are not limited to the size of the area, the sensitivity of the area, etc.
  • users can adjust the starting position of the sliding operation according to real-time conditions so that the moving speed of virtual objects changes, further improving the user's control over the virtual objects and improving the user's gaming experience.
  • FIG. 3 shows a flow chart of a virtual object control method provided by an embodiment of the present application.
  • the execution subject of each step of the method may be the terminal device 10 in the solution implementation environment shown in Figure 1.
  • the execution subject of each step may be the client of the target application.
  • the method may include at least one of the following steps (320-360):
  • Step 320 Display the user interface.
  • the user interface displays a joystick control for controlling the movement of the virtual object.
  • the joystick control has multiple trigger areas, and different trigger areas correspond to different sensitivities.
  • a joystick control can also be called a virtual joystick. It consists of a wheel portion and a stick portion.
  • the wheel portion is the operable range of the virtual joystick. When the user performs no operation, the position of the stick portion does not change.
  • the stick portion slides within the range of the wheel portion as the finger slides, and the user can slide the stick portion anywhere within the range of the wheel portion.
  • a joystick control can control the movement direction of a virtual object.
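The joystick behavior described above can be sketched as a vector clamp that keeps the movable part of the joystick inside its operable range. The radius parameter, coordinate convention, and function name below are illustrative assumptions, not part of the patent.

```python
import math

def clamp_stick(dx, dy, wheel_radius):
    """Clamp the joystick's movable part to its operable range: if the
    finger drags past the edge, the movable part stays on the rim while
    still pointing in the drag direction. (dx, dy) is the offset of the
    finger from the joystick's center."""
    dist = math.hypot(dx, dy)
    if dist <= wheel_radius:
        return (dx, dy)
    scale = wheel_radius / dist
    return (dx * scale, dy * scale)
```

Inside the operable range the offset is returned unchanged; outside it, the offset is scaled back onto the rim.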
  • the trigger area refers to a specific area in the interface, and different trigger areas correspond to different sensitivities. A trigger area can be set by the server, or set and adjusted by the user. The embodiments of the present application do not limit the size and shape of the trigger area: for example, the shape of a trigger area can be rectangular, circular, or a rounded rectangle, and its size can be set reasonably based on the interface layout. In addition, the sizes and shapes of different trigger areas may be the same or different, which is not limited in this application.
  • the sensitivity in the embodiments of this application refers to movement sensitivity, which can be understood as follows: when the user performs the same sliding operation in different trigger areas, the virtual object is controlled to move at different speeds.
  • for example, the user performs the same sliding operation on the user interface, but the starting position of the sliding operation is located in different trigger areas; the moving speeds of the virtual object corresponding to the different trigger areas are different. For one trigger area the corresponding moving speed of the virtual object is 10 meters/second, while for another the corresponding moving speed of the virtual object is 20 meters/second.
  • FIG 4 shows a schematic diagram of a user interface provided by an embodiment of the present application.
  • the user interface displays a joystick control Y1 for controlling the movement of virtual objects.
  • the joystick control has two trigger areas, namely trigger area Q1 and trigger area Q2. Different trigger areas correspond to different sensitivities.
  • the sensitivity of the trigger area Q1 is x
  • the sensitivity of the trigger area Q2 is y
  • x and y are positive numbers.
  • the joystick control Y1 shown in Figure 4 is in the trigger area Q1, and the sensitivity of the joystick control Y1 is x.
  • Step 340 in response to the first sliding operation whose starting position is located in the first trigger area, determine the first moving speed of the virtual object according to the sensitivity of the first trigger area; the first trigger area is one of multiple trigger areas.
  • the first sliding operation is an action performed by the user.
  • if the terminal device is a handheld device, the user's first sliding operation is an operation directly performed on the terminal device, such as a sliding, pressing, or dragging operation on the mobile phone screen.
  • if the terminal device is not a handheld device, the user's first sliding operation may be an operation performed on a peripheral of the terminal device, such as a double-click operation on a mouse, a key press on a keyboard, or a press or shake of a gamepad; this application does not limit the type of the first sliding operation.
  • a sliding operation has a starting position and a real-time position. For example, if the sliding operation is a sliding operation on a mobile phone screen, the starting position is the position where the finger first touches the screen, after which the finger slides on the screen to carry out the sliding operation.
  • the current position at which the finger touches the screen during the sliding operation is the real-time position of the sliding operation. When the finger leaves the screen, the sliding operation ends.
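The lifecycle just described (the touch-down fixes the starting position, moves update the real-time position, and lifting the finger ends the operation) can be tracked with a small state object. All names and the event granularity here are illustrative assumptions.

```python
class SlidingOperation:
    """Tracks one sliding operation on a touch screen."""

    def __init__(self):
        self.start = None    # where the finger first touched the screen
        self.current = None  # real-time position while sliding
        self.active = False

    def touch_down(self, x, y):
        # First contact: this point is the starting position.
        self.start = (x, y)
        self.current = (x, y)
        self.active = True

    def touch_move(self, x, y):
        # Update the real-time position while the finger is down.
        if self.active:
            self.current = (x, y)

    def touch_up(self):
        # The sliding operation ends when the finger leaves the screen.
        self.active = False
```

The starting position never changes after `touch_down`, so it can be used later to pick the trigger area.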
  • FIG. 5 shows a schematic diagram of a user interface provided by another embodiment of the present application.
  • the user interface displays a joystick control for controlling the movement of virtual objects.
  • the joystick control has three trigger areas, namely trigger area Q3, trigger area Q4 and trigger area Q5. Different trigger areas correspond to different sensitivities.
  • the sensitivity of the trigger area Q3 is 10
  • the sensitivity of the trigger area Q4 is 8,
  • the sensitivity of Q5 is 5.
  • the joystick control shown in Figure 5 is in the trigger area Q5, and the sensitivity of the joystick control Y1 is 5.
  • the moving speed corresponding to different sensitivities is preset by the server.
  • sensitivity 1 indicates that the moving speed of the virtual object is 1 m/s
  • sensitivity 2 indicates that the moving speed of the virtual object is 2 m/s, and so on. Then, in Figure 5, since the sensitivity of the trigger area Q5 is 5, the first moving speed of the virtual object is determined to be 5 m/s.
  • the sensitivity of different trigger areas can be customized by the user, either before entering the game or, after entering the game, according to the real-time situation of the game.
  • for example, the initially set sensitivities of trigger area Q3, trigger area Q4, and trigger area Q5 in Figure 5 are 10, 8, and 5 respectively, but if the user considers that the sensitivity of trigger area Q5 does not need to reach 5, the user can set the sensitivity of trigger area Q5 independently.
  • when the sensitivity changes, the moving speed of the virtual object changes accordingly. For example, if the sensitivity of trigger area Q5 is adjusted from 5 to 3, the moving speed of the virtual object is adjusted to 3 m/s.
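The adjustable per-area sensitivity in this example can be sketched as a small settings object. The linear "sensitivity n corresponds to n m/s" rule from the surrounding example, and all class and method names, are illustrative assumptions.

```python
class SensitivitySettings:
    """Per-trigger-area sensitivities, preset (e.g. server-side) and
    later adjustable by the user."""

    def __init__(self, defaults):
        self._values = dict(defaults)

    def set_sensitivity(self, area, sensitivity):
        # User customization, before or during the game.
        self._values[area] = sensitivity

    def speed_mps(self, area):
        # Assumed rule from the example: sensitivity n -> n m/s.
        return float(self._values[area])
```

Adjusting Q5 from 5 to 3 would then lower the mapped speed from 5 m/s to 3 m/s, matching the example above.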
  • the trigger area where the starting position of the sliding operation performed by the user falls is determined as the selected trigger area, for example, the trigger area where the starting position of the first sliding operation falls is determined as the first trigger area.
  • when the starting position falls on the boundary between trigger areas, the trigger area with the greater sensitivity near the boundary can be set as the first trigger area, or the trigger area with the smaller sensitivity near the boundary can be set as the first trigger area.
  • for example, when the starting position falls on the boundary between trigger area Q1 and trigger area Q2, the trigger area Q1 can be determined as the first trigger area, or the trigger area Q2 can be determined as the first trigger area, or the one with the greater sensitivity among trigger area Q1 and trigger area Q2 can be determined as the first trigger area.
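The selection rule above (pick the containing area, and on a shared boundary break the tie by sensitivity) can be sketched as a hit test. Rectangular areas and the max-sensitivity tie-break are assumptions for illustration; the text equally allows the min-sensitivity policy.

```python
def contains(rect, x, y):
    # rect is (x0, y0, x1, y1) with inclusive edges, so a point on a
    # shared edge is inside both adjacent areas.
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

def pick_trigger_area(areas, x, y):
    """areas maps name -> (rect, sensitivity). Returns the name of the
    containing area, breaking boundary ties toward higher sensitivity,
    or None if the point is outside every area."""
    hits = [(name, sens) for name, (rect, sens) in areas.items()
            if contains(rect, x, y)]
    if not hits:
        return None
    return max(hits, key=lambda h: h[1])[0]
```

Swapping `max` for `min` gives the alternative policy of preferring the less sensitive area at a boundary.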
  • At least two trigger areas overlap.
  • the starting position of the first sliding operation is obtained; the distance between the starting position of the first sliding operation and the reference point of each trigger area is determined, wherein the positions of the reference points of the trigger areas differ from one another; and from the multiple trigger areas, the trigger area with the smallest distance is determined as the first trigger area.
  • the "overlap between two trigger areas" described in the embodiment of this application means that there is an overlapping area between the two trigger areas, but there are also corresponding non-overlapping areas, that is, the two trigger areas do not completely overlap, only Partially overlapping.
  • the trigger area Q6 has a non-overlapping portion with respect to the trigger area Q7
  • the trigger area Q7 also has a non-overlapping portion with respect to the trigger area Q6.
  • the trigger area where the starting position of the sliding operation falls is determined as the first trigger area.
  • when the starting position falls on a boundary, the trigger area with the greater sensitivity near the boundary can be set as the first trigger area, or the trigger area with the smaller sensitivity near the boundary can be set as the first trigger area.
  • the closest one is selected as the selected trigger area.
  • the reference point is the center point position of the trigger area or other position that can characterize the trigger area.
  • the starting position of the sliding operation is D0
  • D1 is the center position of the trigger area Q6
  • D2 is the center position of the trigger area Q7
  • D3 is the center position of the trigger area Q8.
  • the starting position D0 of the sliding operation is in the overlapping area of trigger area Q6 and trigger area Q7. Determine the distance from D0 to D1 and the distance from D0 to D2; if the distance from D0 to D2 is smaller, determine the trigger area Q7 where D2 is located as the selected first trigger area.
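The D0/D1/D2 example can be sketched as a nearest-reference-point lookup. Using the center point as the reference point follows the text; the names and data layout are illustrative assumptions.

```python
import math

def pick_by_reference_point(reference_points, start):
    """When the start position lies in the overlap of several trigger
    areas, choose the area whose reference point is nearest to it.
    reference_points maps area name -> (x, y) of its reference point
    (e.g. the area's center)."""
    sx, sy = start
    return min(reference_points,
               key=lambda name: math.hypot(sx - reference_points[name][0],
                                           sy - reference_points[name][1]))
```

With D0 in the Q6/Q7 overlap and D2 (Q7's center) closer than D1 (Q6's center), the lookup selects Q7, matching the example.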
  • on the one hand, arranging multiple trigger areas to overlap can reduce the interface area occupied by the trigger areas and avoid affecting the arrangement of other controls in the interface; on the other hand, among multiple overlapping trigger areas, the trigger area closest to the operating position is selected as the trigger area chosen by the user, ensuring that the trigger area can be correctly selected.
  • the three trigger areas in Figure 6 are distributed vertically. In some embodiments, they can also be distributed horizontally, as shown in Figure 7 with the horizontally distributed trigger area Q9, trigger area Q10, and trigger area Q11.
  • the positions of the multiple trigger areas are arranged in the order of increasing or decreasing sensitivity corresponding to the multiple trigger areas.
  • the sensitivities of the trigger areas Q9, Q10, and Q11 distributed horizontally in FIG. 7 increase gradually.
  • the sensitivities of trigger area Q6, trigger area Q7, and trigger area Q8 in Figure 6 decrease gradually. Arranging the areas in a conventional increasing or decreasing order makes them easier for the user to remember and operate, and helps the user avoid accidentally touching the wrong trigger area.
  • the technical solution provided by the embodiments of this application can meet the different needs of users by setting the trigger area.
  • when the trigger areas are not expected to occupy a large area of the user interface, the trigger areas can be arranged horizontally to reduce the area they occupy.
  • when the user needs larger trigger areas, a vertically distributed layout can also be chosen.
  • the larger the trigger area, the lower the demand on the user's operation; the smaller the trigger area, the higher the demand on the user's operation. The solution can therefore meet the needs of different users: it is friendly to novice players while also meeting the needs of experienced players, and the user experience is good.
  • Step 360 Control the virtual object to move at a first moving speed.
  • after determining the first moving speed of the virtual object, the client can control the virtual object to move at the first moving speed.
  • the first moving speed of the virtual object is not only related to the sensitivity of the trigger area, but also related to various other factors, such as the vehicle the virtual object is riding, the environment the virtual object is in, and what the virtual object is wearing; see the following embodiments for details, which are not repeated here.
  • the value of the first moving speed is displayed in the user interface so that the user can grasp the moving speed of the currently controlled virtual object.
  • the starting position of the sliding operation can be adjusted at any time according to the real-time conditions of the virtual environment, so that different trigger areas yield different movement speeds. Users can adjust their operations in time based on the displayed numerical information, making the game more strategic. In some embodiments, different movement speeds correspond to different animation effects, giving the user a stronger sense of immersion in the virtual object and a better experience.
  • the technical solution provided by the embodiment of the present application determines the moving speed of the virtual object by setting multiple trigger areas corresponding to different sensitivities, based on the sensitivity of the trigger area where the initial position of the user's sliding operation is located, and controls the virtual object to move according to the moving speed.
  • users can start sliding operations in trigger areas with different sensitivities according to different needs, enabling the virtual object to be controlled to move at different speeds, enriching the control methods of virtual objects, and improving the efficiency of human-computer interaction.
  • the technical solution provided by the embodiments of the present application does not require the user to manually set the sensitivity during the game, thereby simplifying the user's operation and helping to improve the flexibility and efficiency of the user's movement control of virtual objects.
  • FIG. 8 shows a flow chart of a virtual object control method provided by another embodiment of the present application.
  • the execution subject of each step of the method may be the terminal device 10 in the solution implementation environment shown in Figure 1.
  • the execution subject of each step may be the client of the target application.
  • the method may include at least one of the following steps (320-360):
  • Step 320 Display the user interface.
  • the user interface displays a joystick control for controlling the movement of the virtual object.
  • the joystick control has multiple trigger areas, and different trigger areas correspond to different sensitivities.
  • Step 330 In response to the first sliding operation whose starting position is located in the first trigger area, display the rocker control at the starting position of the first sliding operation.
  • the first trigger area is one of multiple trigger areas.
  • In some embodiments, the joystick control is moved to the starting position of the first sliding operation, that is, the process of the joystick control moving from its original display position to the starting position is displayed; or, the joystick control is no longer displayed at its original display position and is instead displayed directly at the starting position of the first sliding operation.
  • Step 340-1 Determine the first moving speed of the virtual object based on the sensitivity of the first trigger area and the attribute information of the first sliding operation.
  • the attribute information is information related to the first sliding operation.
  • the attribute information includes: the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
  • the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation is positively correlated with the movement speed.
  • For example, if the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation is 1 cm, the moving speed is 1 m/s; if the distance is 2 cm, the moving speed is 2 m/s; and so on.
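The positive correlation above can be sketched as a simple linear mapping. This is only an illustrative sketch: the function name and the 1 cm per 1 m/s conversion factor come from the example and are not a fixed part of the method.

```python
def speed_from_distance(distance_cm: float, cm_per_mps: float = 1.0) -> float:
    """Map the slide distance (real-time position minus starting position)
    to a moving speed, positively correlated as in the example:
    1 cm -> 1 m/s, 2 cm -> 2 m/s. The conversion factor is hypothetical."""
    if distance_cm < 0:
        raise ValueError("distance must be non-negative")
    return distance_cm / cm_per_mps
```

For instance, a 2 cm slide would yield a speed of 2 m/s under these assumptions.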
  • the first moving speed of the virtual object is determined by the sensitivity of the first trigger area and the attribute information of the first sliding operation.
  • the sensitivity can correspond to different moving speeds
  • the attribute information of the sliding operation can also correspond to different moving speeds.
  • The moving speed corresponding to the sensitivity of the trigger area where the starting position of the sliding operation is located can be recorded as the first moving speed.
  • The moving speed corresponding to the attribute information of the first sliding operation is recorded as the second moving speed; the relationship between the first moving speed and the second moving speed is determined, and the larger of the two is determined as the first moving speed of the virtual object.
  • the first moving speed of the virtual object is determined jointly by the sensitivity of the first trigger area and the attribute information of the first sliding operation.
  • In some embodiments, the sensitivity of the first trigger area and the attribute information of the first sliding operation correspond to different weights: the moving speed corresponding to the sensitivity of the trigger area where the starting position of the sliding operation is located is recorded as the first moving speed, the moving speed corresponding to the attribute information of the first sliding operation is recorded as the second moving speed, and the final moving speed is determined based on the weighted proportion of the two.
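The two combination strategies just described — taking the larger of the two candidate speeds, or weighting them — might be sketched as follows. The weight values 0.6/0.4 are illustrative assumptions, not values from the application.

```python
def combine_max(sensitivity_speed: float, attribute_speed: float) -> float:
    # Strategy 1: the larger of the two candidate speeds becomes
    # the first moving speed of the virtual object.
    return max(sensitivity_speed, attribute_speed)

def combine_weighted(sensitivity_speed: float, attribute_speed: float,
                     w_sensitivity: float = 0.6, w_attribute: float = 0.4) -> float:
    # Strategy 2: the final speed is a weighted proportion of the two;
    # the 0.6 / 0.4 weights are hypothetical.
    return w_sensitivity * sensitivity_speed + w_attribute * attribute_speed
```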
  • a sensitivity correction parameter is determined based on the distance, and the sensitivity correction parameter is used to adjust the sensitivity of the trigger area; the first moving speed of the virtual object is determined based on the sensitivity correction parameter and the sensitivity of the first trigger area.
  • The sensitivity correction parameter is related to the distance: as the distance changes, the sensitivity correction parameter also changes. That is to say, the distance and the sensitivity correction parameter are positively correlated.
  • In some embodiments, different distance intervals correspond to different sensitivity correction parameters.
  • For example, when the distance falls in the interval (a1, b1), the sensitivity correction parameter is d1; when the distance falls in the interval (a2, b2), the sensitivity correction parameter is d2.
  • That is, the sensitivity correction parameter is an interval (piecewise) function, and different intervals correspond to different values, where a1, b1, d1, a2, b2, and d2 are all positive numbers.
  • the corrected sensitivity is determined based on the sensitivity correction parameter and the sensitivity of the first trigger area, and the first moving speed of the virtual object is determined based on the corrected sensitivity.
  • The corrected sensitivity may be determined based on the sensitivity correction parameter and the sensitivity of the first trigger area in an additive or multiplicative manner; the specific algorithm is not limited in this application.
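A minimal sketch of the interval function and the corrected speed follows. The interval boundaries, the correction values, and the choice of a multiplicative combination are all assumptions for illustration; the application itself leaves the specific algorithm open.

```python
def correction_parameter(distance: float) -> float:
    # Interval (piecewise) function: different distance intervals map to
    # different correction parameters. The values a1=0, b1=2, d1=1.0 and
    # a2=2, b2=5, d2=1.5 are hypothetical.
    if 0 <= distance < 2:
        return 1.0
    if 2 <= distance < 5:
        return 1.5
    return 2.0  # beyond the last interval

def corrected_speed(distance: float, trigger_sensitivity: float,
                    base_speed: float = 1.0) -> float:
    # Multiplicative combination of the correction parameter and the
    # sensitivity of the first trigger area (an additive combination
    # would be equally valid per the text).
    return base_speed * trigger_sensitivity * correction_parameter(distance)
```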
  • the attribute information is information related to the first sliding operation.
  • the attribute information includes: other attribute information of the first sliding operation in addition to the distance, such as the pressure value of the first sliding operation.
  • For example, when the pressure value of the sliding operation is a first pressure value, the moving speed of the virtual object is a first speed; when the pressure value of the sliding operation is a second pressure value, the moving speed of the virtual object is a second speed.
  • If the first pressure value is greater than the second pressure value, then the first speed is greater than the second speed.
  • the attribute information is information related to the first sliding operation.
  • In some embodiments, the attribute information includes other attribute information besides the distance, such as the size of the trigger area covered by the first sliding operation.
  • For example, when the size of the trigger area covered by the sliding operation is a first area, the moving speed of the virtual object is a third speed; when it is a second area, the moving speed of the virtual object is a fourth speed. If the first area is greater than the second area, then the third speed is greater than the fourth speed.
  • In some embodiments, the moving speed of the virtual object is also related to the location/area of the virtual environment where the virtual object is currently located (such as flat land, grassland, snow, river, etc.).
  • That is, the moving speed of the virtual object is related to the complexity of the virtual environment in which it is currently located. Optionally, when the virtual environment where the virtual object is located is snow, the moving speed of the virtual object is reduced; when the virtual environment is flat ground, the moving speed of the virtual object is greatly increased compared with its moving speed in the snow.
  • the technical solution provided by the embodiments of this application determines the moving speed of the virtual object through the attribute information of the sliding operation and the sensitivity of the trigger area, which is more in line with the actual situation and makes the user's control of the virtual object more precise and accurate.
  • Step 360 Control the virtual object to move at a first moving speed.
  • the method further includes at least one of the following steps (361-365, not shown in Figure 8).
  • Step 361 In response to the setting operation for the trigger area, display range boxes corresponding to the multiple trigger areas.
  • the user can set the trigger area before the game starts, or can set the trigger area after the game starts, which is not limited in this application.
  • This application does not limit the type of setting operation. It may be a click operation on the trigger area, or it may be setting the trigger area through other controls.
  • the range boxes corresponding to the multiple trigger areas are displayed.
  • The range boxes here can be displayed in the form of highlights or in the form of ordinary lines; the specific display method is not limited in this application.
  • Step 362 In response to the deletion operation of the target trigger area among the multiple trigger areas, cancel the display of the range box corresponding to the target trigger area.
  • the user can perform a deletion operation on some unnecessary trigger areas.
  • For example, if the user thinks that the trigger area Q6 is not very practical and is not needed, the user can perform a deletion operation on the trigger area Q6 to delete it.
  • Step 363 In response to the adjustment operation for the target trigger area among the multiple trigger areas, adjust at least one of the size and position of the range frame corresponding to the target trigger area.
  • the user can adjust the size and position of the range box of the trigger area.
  • Optionally, the user can adjust the size of the range box of the trigger area through a first adjustment operation, for example, enlarging the range box; optionally, the user can adjust the position of the range box of the trigger area through a second adjustment operation.
  • Step 364 In response to the operation of adding a trigger area, display a range box corresponding to the newly added trigger area.
  • the user can add a trigger area and simultaneously display a range box corresponding to the newly added trigger area.
  • Step 365 In response to completing the setting operation for the trigger area, multiple trigger areas corresponding to the joystick control are set according to the size and position of the range frame corresponding to each currently displayed trigger area.
  • the technical solution provided by the embodiment of the present application can adjust the size and position of the range frame of the trigger area, and the trigger area can be added or deleted, thereby meeting the needs of different users and being suitable for different groups of users.
  • the method further includes at least one of the following steps (366-368, not shown in Figure 8):
  • Step 366 Obtain real-time competitive data related to the virtual object.
  • The real-time competitive data includes at least one of the following: real-time attribute data of the virtual object, real-time environment data of the virtual object, real-time equipment data of the virtual object, and the real-time distance between the current position and the expected position of the virtual object.
  • The real-time attribute data of the virtual object may be the current status of the virtual object, such as its health (blood volume), whether it is injured, whether it is continuously losing health, and so on.
  • the real-time environment data of the virtual object may be the data of the real-time environment where the virtual object is currently located, such as whether the virtual object is in a poison circle, an unsafe swamp area, a mine control area, etc.
  • the real-time equipment data of the virtual object may be the data of the equipment currently held by the virtual object, such as the number of virtual props, the number of virtual gun ammunition, and so on.
  • the real-time distance between the current position and the expected position of the virtual object may be the distance between the current position of the virtual object and the expected position.
  • the desired location can be marked by the user himself or predicted by the server.
  • the desired location can be the center of the safe zone or a target location marked by the user himself.
  • Step 367 Determine a recommended trigger area from multiple trigger areas based on real-time competition data.
  • a recommended trigger area is determined from a plurality of trigger areas based on real-time competitive data. For example, when the virtual object's health value is low, its equipment is poor, and the area it is in is unsafe, the trigger area with the highest sensitivity is determined from multiple trigger areas as the recommended trigger area. For example, when the virtual object's blood volume is very healthy and its equipment is good, the trigger area with lower sensitivity is determined from multiple trigger areas as the recommended trigger area.
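The rule just described could be realized by a simple heuristic such as the following sketch. The function name, the numeric thresholds, and the health/equipment fields are hypothetical stand-ins for the real-time competitive data; a deployed client might instead use the speed prediction model described below.

```python
def recommend_trigger_area(health: float, equipment_score: float,
                           in_unsafe_area: bool,
                           areas_by_sensitivity: list):
    """areas_by_sensitivity is ordered from lowest to highest sensitivity.
    Low health, poor equipment, or an unsafe area -> the highest-sensitivity
    area (fast escape); a healthy, well-equipped object -> a lower-sensitivity
    area. The 30-health and 0.3-equipment thresholds are assumptions."""
    if health < 30 or equipment_score < 0.3 or in_unsafe_area:
        return areas_by_sensitivity[-1]   # highest sensitivity
    return areas_by_sensitivity[0]        # lowest sensitivity
```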
  • the recommended trigger area is one trigger area among multiple trigger areas.
  • real-time competitive data is processed through a speed prediction model to predict the expected movement speed of the virtual object; wherein the speed prediction model is a machine learning model built based on a neural network. Based on the expected movement speed, a recommended trigger area is determined from multiple trigger areas.
  • Step 368 Display prompt information corresponding to the recommended trigger area.
  • In some embodiments, the prompt information is the joystick control itself: the joystick control is displayed in the recommended trigger area among the multiple trigger areas.
  • the rocker control is displayed at a reference point position of the recommended trigger area, and optionally, the reference point position is the center point position of the recommended trigger area.
  • the rocker control T0 is directly displayed on the recommended trigger area of the user interface.
  • In some embodiments, the prompt information is the recommended trigger area itself: the recommended trigger area among the multiple trigger areas is displayed differently from the other trigger areas.
  • the recommended trigger area is displayed in a highlighted form in the user interface.
  • the trigger area Q3 is highlighted on the recommended trigger area of the user interface.
  • the technical solution provided by the embodiment of this application determines the recommended trigger area based on the real-time competition data of the virtual object, and can provide the recommended trigger area in real time according to the situation of the virtual object.
  • In this way, the user does not need to carefully consider which trigger area to use and can directly use the recommended trigger area, which reduces the user's reaction time and improves the user's competitive experience.
  • it can also improve the accuracy and efficiency of trigger area selection, allowing users to quickly and accurately select the trigger area suitable for the current competitive scene to control the virtual object to move at a speed suitable for the current competitive scene.
  • FIG. 9 shows a flow chart of a virtual object control method provided by another embodiment of the present application.
  • the execution subject of each step of the method may be the terminal device 10 in the solution implementation environment shown in Figure 1.
  • the execution subject of each step may be the client of the target application.
  • the method may include at least one of the following steps (320-380):
  • Step 320 Display the user interface.
  • the user interface displays a joystick control for controlling the movement of the virtual object.
  • the joystick control has multiple direction intervals, and different direction intervals correspond to different movement directions.
  • Step 370 In response to the first sliding operation on the joystick control, determine the first direction interval to which the real-time direction belongs from the plurality of direction intervals based on the real-time position of the first sliding operation relative to the real-time direction of the joystick control.
  • Not only can the moving speed of the virtual object be determined based on the sensitivity of the target trigger area corresponding to the starting position of the sliding operation, but the first direction interval to which the real-time direction belongs can also be determined, from the plurality of direction intervals, based on the real-time direction of the real-time position of the sliding operation relative to the joystick control.
  • the number of direction intervals is 8.
  • Based on the real-time direction of the real-time position of the user's first sliding operation relative to the joystick control L1, the first direction interval to which the real-time direction belongs is determined from the plurality of direction intervals.
  • As shown in Figure 10, the joystick can be divided into 10 direction intervals; the upper half, for example, can be divided into up, upper left, upper right, left, and right, a total of 5 direction intervals.
  • Each direction interval corresponds to a certain interval range.
  • The real-time direction of the real-time position of the first sliding operation relative to the joystick control L2 is the direction pointed to by arrow m3, and the direction of m3 falls in the direction interval bounded by m1 and m2. Therefore, the "upper left" direction interval bounded by m1 and m2 is determined as the first direction interval, and the direction corresponding to the first direction interval, 45 degrees west of north, is determined as the moving direction of the virtual object; that is, the moving direction of the virtual object pointed to by arrow m4 is 45 degrees west of north.
  • For example, the real-time position of the first sliding operation is E1, and the center position of the joystick control L3 is E0.
  • The real-time direction of the real-time position of the first sliding operation relative to the joystick control is the direction in which E0 points to E1. It can be determined that the direction in which E0 points to E1 belongs to the direction interval P1 (P1 corresponds to one of the eight direction intervals, the upper-right direction interval).
  • a method of determining the first direction interval to which the real-time position of the first sliding operation belongs relative to the real-time direction of the joystick control is provided.
  • the number of pixels that the user slides on the screen can be known, and then the arc length of the slide can be obtained.
  • According to the arc length formula L = n·π·r/180, where n is the degree of the central angle, r is the radius, and L is the arc length corresponding to the central angle; the arc length of the sector is, in effect, the length slid along the circle.
  • In this way, the first direction interval to which the real-time direction of the first sliding operation belongs relative to the joystick control can be determined: the specific value of n° can be calculated inversely from the arc length L corresponding to the n° central angle in Figure 12, and the direction interval can then be determined from the value of n°. For example, if the value of n is 75, then the first direction interval to which the real-time position of the first sliding operation belongs relative to the real-time direction of the joystick control is P2.
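Inverting the arc length formula L = n·π·r/180 gives n = 180·L/(π·r), with the slid pixel count standing in for the arc length. A minimal sketch, with the joystick radius as an assumed parameter:

```python
import math

def central_angle_degrees(arc_length_px: float, radius_px: float) -> float:
    # n = 180 * L / (pi * r), the inverse of L = n * pi * r / 180.
    return 180.0 * arc_length_px / (math.pi * radius_px)
```

With a hypothetical radius of 60 px, a slide whose arc length corresponds to n = 75 recovers 75° exactly, which would place the direction in interval P2 per the example above.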
  • Step 380 Control the virtual object to move in the moving direction corresponding to the first direction interval.
  • the movement direction corresponding to each direction interval is set. In some embodiments, the central direction of each direction interval is used as the movement direction corresponding to the direction interval.
  • the moving direction corresponding to the direction interval P1 is the direction in which E0 points to F, and F is a point in the center direction of the direction interval P1.
  • In some embodiments, when the real-time direction changes from the first direction interval to the second direction interval, the moving direction of the virtual object is controlled to change gradually, within a first time period, from the moving direction corresponding to the first direction interval to the moving direction corresponding to the second direction interval.
  • The second direction interval is a direction interval adjacent to the first direction interval. Gradually adjusting the direction so that the direction of the virtual object does not change instantaneously gives the user a better experience.
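The gradual transition between the moving directions of two adjacent intervals can be sketched as an interpolation over the first time period. The 0.2-second duration is an assumed value; the shortest-path wrapping avoids spinning the long way around the circle.

```python
def interpolated_direction(start_deg: float, end_deg: float,
                           elapsed: float, duration: float = 0.2) -> float:
    """Linearly interpolate the moving direction (in degrees) from the
    first interval's direction to the adjacent second interval's direction
    over `duration` seconds, taking the shortest angular path."""
    t = min(max(elapsed / duration, 0.0), 1.0)
    delta = (end_deg - start_deg + 180) % 360 - 180  # shortest signed path
    return (start_deg + t * delta) % 360
```

Halfway through the period, a 90° → 45° transition sits at 67.5°, so the virtual object's heading never jumps instantaneously.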
  • The technical solution provided by the embodiments of the present application divides the directions into multiple direction intervals, each corresponding to one moving direction, which prevents the user-controlled virtual object from changing direction too sensitively and greatly reduces the processing overhead of the terminal device.
  • Setting the moving direction of the virtual object according to these direction intervals reduces the running cost of the terminal device, reduces screen lag, makes the screen smoother, and improves the user experience.
  • FIG. 13 shows a flow chart of a virtual object control method provided by another embodiment of the present application.
  • the execution subject of each step of the method may be the terminal device 10 in the solution implementation environment shown in Figure 1.
  • the execution subject of each step may be the client of the target application.
  • the method may include at least one of the following steps (320-1 ⁇ 394):
  • Step 320-1 Display the user interface.
  • the user interface displays a joystick control for controlling the movement of the virtual object.
  • Step 390 In response to the first sliding operation on the rocker control, obtain the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
  • For example, the real-time position of the first sliding operation is G1, and the distance to be obtained is the distance between G1 and the starting position G0 of the first sliding operation.
  • Step 392 If the distance is greater than or equal to the first threshold, display the automatic movement control.
  • the automatic movement control is used to make virtual objects move automatically.
  • When the real-time position of the first sliding operation is G1, the distance is the distance between G1 and the starting position G0 of the first sliding operation; if this distance does not reach the first threshold, the automatic movement control is not displayed.
  • the real-time position of the first sliding operation changes from G1 to G2
  • The distance is then the distance between G2 and the starting position G0 of the first sliding operation; if this distance reaches the first threshold, the automatic movement control H1 is displayed.
  • Step 394 When the automatic movement control is in the display state, if the distance is less than or equal to the second threshold, the display of the automatic movement control is cancelled.
  • Otherwise, if the distance is greater than the second threshold, the automatic movement control continues to be displayed.
  • Canceling the display of the automatic movement control in this way can avoid accidental automatic movement caused by the user's repeated operations, and avoiding the constant disappearance and reappearance of the control improves the user experience and reduces the processing pressure on the terminal device.
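Steps 392-394 amount to a hysteresis: the control appears when the distance reaches a first threshold and disappears only when it falls to a second, smaller threshold, so a slide hovering near one boundary does not make the control flicker. The pixel threshold values below are assumptions for illustration.

```python
class AutoMoveControl:
    # Show above SHOW_AT and hide below HIDE_AT (HIDE_AT < SHOW_AT),
    # so the control does not toggle repeatedly near a single boundary.
    SHOW_AT = 120.0  # first threshold, in pixels (hypothetical)
    HIDE_AT = 80.0   # second threshold, in pixels (hypothetical)

    def __init__(self):
        self.visible = False

    def update(self, distance: float) -> bool:
        """Update visibility from the current slide distance (G0 to
        the real-time position) and return the new visibility."""
        if not self.visible and distance >= self.SHOW_AT:
            self.visible = True
        elif self.visible and distance <= self.HIDE_AT:
            self.visible = False
        return self.visible
```

A slide oscillating between 90 and 110 px changes nothing once the control is shown, which is exactly the flicker-avoidance the text describes.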
  • FIG. 16 shows a flow chart of a virtual object control method provided by another embodiment of the present application.
  • the execution subject of each step of the method may be the terminal device 10 in the solution implementation environment shown in Figure 1.
  • the execution subject of each step may be the client of the target application.
  • the method may include at least one of the following steps (410-430):
  • Step 410 Display the user interface.
  • the user interface displays a joystick control for controlling the movement of the virtual object.
  • the joystick control has multiple trigger areas, and different trigger areas correspond to different sensitivities.
  • Step 420 In response to the first sliding operation whose starting position is located in the first trigger area, control the virtual object to move at the first moving speed.
  • Step 430 In response to the second sliding operation whose starting position is located in the second trigger area, control the virtual object to move at a second moving speed.
  • The first trigger area and the second trigger area are each one of the multiple trigger areas; the first trigger area and the second trigger area are different, and the first moving speed and the second moving speed are different.
  • For example, in response to a sliding operation whose starting position is located in the trigger area Q22, the virtual object is controlled to move at a low speed; in response to a sliding operation whose starting position is located in the trigger area Q21, the virtual object is controlled to move at a medium speed; and in response to a sliding operation whose starting position is located in the trigger area Q20, the virtual object is controlled to move at a high speed.
  • When a virtual object of the enemy camp appears in the virtual environment, the virtual object (of our camp) needs to be controlled to move at a relatively low speed, so the starting position of the sliding operation can be placed in the trigger area Q21, which makes it easier for the virtual object to aim at the virtual object of the enemy camp.
  • the starting position of the sliding operation can be placed in the trigger area Q20 to control the virtual object to move at a higher speed.
  • the starting position of the sliding operation can be placed in the trigger area Q22 to control the virtual object to move slowly.
  • In some embodiments, in response to the first sliding operation, the joystick control is displayed at the starting position of the first sliding operation.
  • In some embodiments, in response to a setting operation for the trigger areas, the range boxes corresponding to the multiple trigger areas are displayed; in response to a deletion operation for a target trigger area among the multiple trigger areas, the display of the range box corresponding to the target trigger area is canceled; or, in response to an adjustment operation for a target trigger area among the multiple trigger areas, at least one of the size and position of the range box corresponding to the target trigger area is adjusted; or, in response to an operation of adding a trigger area, the range box corresponding to the newly added trigger area is displayed; and in response to an operation of completing the trigger area setting, the multiple trigger areas corresponding to the joystick control are set according to the size and position of the range box corresponding to each currently displayed trigger area.
  • the multiple trigger areas there is no overlapping area between any two trigger areas; or, among the multiple trigger areas, there is an overlapping area between at least two trigger areas.
  • the rocker control is displayed in the recommended trigger area of multiple trigger areas; or, the recommended trigger area in the multiple trigger areas is displayed differently from other trigger areas except the recommended trigger area.
  • the technical solution provided by the embodiment of the present application can control the virtual object to move at different moving speeds according to the sliding operation of different trigger areas, refine the user's control of the virtual object, and meet the different needs of users in different situations.
  • FIG. 18 shows a flow chart of a virtual object control method provided by another embodiment of the present application.
  • the execution subject of each step of the method may be the terminal device 10 in the solution implementation environment shown in Figure 1.
  • the execution subject of each step may be the client of the target application.
  • the method may include at least one of the following steps (S1-S4):
  • step S1 is started.
  • Step S1: Determine whether the moving area is touched. If so, enter the moving state; if not, re-enter the game.
  • step S2 is executed.
  • Step S2: Determine whether the sliding operation slides into another direction interval. If so, change the moving direction; if not, remain in the moving state.
  • step S3 is executed.
  • Step S3: Determine whether the sliding operation slides up a certain distance. If so, display the lock-running button.
  • Step S4: Determine whether the lock-running button is touched. If so, lock the running state; if not, continue to display the lock-running button.
  • The process ends when the game session is over.
  • FIG. 19 shows a block diagram of a virtual object control device provided by an embodiment of the present application.
  • the device has the function of implementing the above method example, and the function can be implemented by hardware, or can be implemented by hardware executing corresponding software.
  • the device can be the terminal equipment introduced above, or can be set in the terminal equipment.
  • the device 1800 may include: an interface display module 1810, a speed determination module 1820 and a movement control module 1830.
  • the interface display module 1810 is used to display a user interface.
  • the user interface displays a joystick control for controlling the movement of virtual objects.
  • the joystick control has multiple trigger areas, and different trigger areas correspond to different sensitivities.
  • the speed determination module 1820 is configured to determine the first moving speed of the virtual object according to the sensitivity of the first trigger area in response to the first sliding operation with the starting position located in the first trigger area.
  • the movement control module 1830 is used to control the virtual object to move at the first movement speed.
  • In some embodiments, the speed determination module 1820 is configured to determine the first moving speed of the virtual object according to the sensitivity of the first trigger area and the attribute information of the first sliding operation; the first trigger area is one of the plurality of trigger areas.
  • the attribute information includes: a distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
  • the speed determination module 1820 is also used to determine a sensitivity correction parameter according to the distance, and the sensitivity correction parameter is used to adjust the sensitivity of the trigger area.
  • the speed determination module 1820 is also configured to determine the first moving speed of the virtual object according to the sensitivity correction parameter and the sensitivity of the first trigger area.
  • In some embodiments, among the plurality of trigger areas, there is an overlapping area between at least two trigger areas.
  • the device further includes a starting position acquisition module 1840, a distance determination module 1850 and a trigger area determination module 1860.
  • the starting position obtaining module 1840 is configured to obtain the starting position of the first sliding operation when the first sliding operation is detected.
  • the distance determination module 1850 is used to determine the distance between the starting position of the first sliding operation and the reference point of each trigger area; wherein the positions of the reference points of each trigger area are different from each other.
  • the trigger area determination module 1860 is configured to determine the trigger area with the smallest distance from the multiple trigger areas as the first trigger area.
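The selection rule carried out by modules 1840–1860 above — obtain the starting position, measure its distance to each trigger area's reference point, and pick the nearest area — can be sketched as follows. This is a minimal illustration, not the application's implementation; the function name, area ids, and coordinates are invented for the example.

```python
import math

def pick_trigger_area(start, reference_points):
    # Choose the trigger area whose reference point (e.g. its center)
    # is closest to the sliding operation's starting position.
    # `reference_points` maps an area id to that area's reference point.
    return min(reference_points,
               key=lambda area: math.dist(start, reference_points[area]))

areas = {"Q6": (0.0, 0.0), "Q7": (4.0, 4.0), "Q8": (10.0, 10.0)}
chosen = pick_trigger_area((5.0, 5.0), areas)  # nearest center is Q7's
```

With a starting position inside the overlap of two areas, the tie is resolved purely by reference-point distance, which is why the reference points must be at mutually different positions.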
  • the device further includes a data acquisition module 1870 and a prompt information display module 1880.
  • the data acquisition module 1870 is used to obtain real-time competitive data related to the virtual object.
  • the real-time competitive data includes at least one of the following: real-time attribute data of the virtual object, real-time environment data of the virtual object, real-time equipment data of the virtual object, and the real-time distance between the current position and a desired position of the virtual object.
  • the trigger area determination module 1860 is also configured to determine a recommended trigger area from the multiple trigger areas based on the real-time competition data.
  • the prompt information display module 1880 is used to display prompt information corresponding to the recommended trigger area.
  • the trigger area determination module 1860 is also used to process the real-time competitive data through a speed prediction model to predict the expected movement speed of the virtual object; wherein the speed prediction model is a machine learning model built based on a neural network.
  • the trigger area determination module 1860 is also configured to determine the recommended trigger area from the multiple trigger areas according to the expected movement speed.
  • the positions of the multiple trigger areas are arranged in an order of increasing or decreasing sensitivity corresponding to the multiple trigger areas.
  • the joystick control has multiple direction intervals, and different direction intervals correspond to different movement directions.
  • the apparatus further includes an interval determination module 1890.
  • the interval determination module 1890 is configured to determine, from the plurality of direction intervals, the first direction interval to which the real-time direction belongs, according to the real-time direction of the real-time position of the first sliding operation relative to the joystick control.
  • the movement control module 1830 is also used to control the virtual object to move in the movement direction corresponding to the first direction interval.
  • the movement control module 1830 is also used to, when the direction interval to which the real-time direction belongs changes from the first direction interval to a second direction interval, gradually change the movement direction of the virtual object, within a first time period, from the movement direction corresponding to the first direction interval to the movement direction corresponding to the second direction interval; wherein the second direction interval is a direction interval adjacent to the first direction interval.
  • the device further includes a distance acquisition module 1892 and a control display module 1894.
  • the distance acquisition module 1892 is used to acquire the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
  • the control display module 1894 is used to display an automatic movement control when the distance is greater than or equal to a first threshold, and the automatic movement control is used to trigger the virtual object to run automatically.
  • the control display module 1894 is also configured to cancel the display of the automatic moving control if the distance is less than or equal to a second threshold when the automatic moving control is in the display state; wherein, the second threshold is less than the first threshold.
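The two-threshold behavior of the control display module 1894 above is a hysteresis: the auto-run control appears once the drag distance reaches the first threshold and disappears only after the distance falls to the smaller second threshold. A minimal sketch, assuming illustrative threshold values and a function name not taken from the application:

```python
def auto_run_button_visible(currently_shown, distance,
                            first_threshold, second_threshold):
    # Show the auto-run control once the drag distance reaches the
    # first threshold; hide it only after the distance falls to the
    # (smaller) second threshold. Between the two thresholds the
    # previous state is kept, which prevents the control from
    # flickering in and out as the finger hovers near one threshold.
    if not currently_shown and distance >= first_threshold:
        return True
    if currently_shown and distance <= second_threshold:
        return False
    return currently_shown
```

Because the second threshold is strictly less than the first, a distance between the two leaves the display state unchanged in either direction.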
  • the interface display module 1810 is used to display a user interface.
  • the user interface displays a joystick control for controlling the movement of virtual objects, and the joystick control has multiple trigger areas.
  • the movement control module 1830 is configured to control the virtual object to move at a first movement speed in response to the first sliding operation with the starting position located in the first trigger area.
  • the movement control module 1830 is used to control the virtual object to move at a second movement speed in response to the second sliding operation with the starting position located in the second trigger area; wherein the first trigger area and the second trigger area are each one of the plurality of trigger areas; the first trigger area and the second trigger area are different, and the first moving speed and the second moving speed are different.
  • the device further includes a joystick control display module 2040.
  • the joystick control display module 2040 is configured to display the joystick control at the starting position of the first sliding operation in response to the first sliding operation.
  • the device further includes a range frame display module 2050, a range frame adjustment module 2060 and a trigger area setting module 2070.
  • the range frame display module 2050 is configured to display range frames corresponding to the multiple trigger areas in response to the setting operation for the trigger area.
  • the range frame display module 2050 is also configured to cancel the display of the range frame corresponding to the target trigger area in response to the deletion operation of the target trigger area among the multiple trigger areas.
  • the range frame adjustment module 2060 is configured to adjust at least one of the size and position of the range frame corresponding to the target trigger area in response to an adjustment operation for a target trigger area in the plurality of trigger areas.
  • the range frame display module 2050 is also configured to display the range frame corresponding to the newly added trigger area in response to the operation of adding the trigger area.
  • the trigger area setting module 2070 is configured to, in response to a setting-completion operation for the trigger areas, set the multiple trigger areas corresponding to the joystick control according to the size and position of the range frame corresponding to each of the currently displayed trigger areas.
  • among the plurality of trigger areas, there is no overlapping area between any two trigger areas; or, among the plurality of trigger areas, there is an overlapping area between at least two trigger areas.
  • the device further includes a prompt information display module 2080.
  • the prompt information display module 2080 is used to display the joystick control in the recommended trigger area among the multiple trigger areas; or, the prompt information display module 2080 is used to display the recommended trigger area among the multiple trigger areas in a manner that distinguishes it from the other trigger areas.
  • FIG 22 shows a structural block diagram of a terminal device 2100 provided by an embodiment of the present application.
  • the terminal device 2100 may be the terminal device 10 in the implementation environment shown in Figure 1, and is used to implement the virtual object control method provided in the above embodiment. Specifically:
  • the terminal device 2100 includes: a processor 2101 and a memory 2102.
  • the processor 2101 may include one or more processing cores, such as a 4-core processor, an 8-core processor, etc.
  • the processor 2101 can be implemented using at least one hardware form among DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), and PLA (Programmable Logic Array).
  • Memory 2102 may include one or more computer-readable storage media, which may be non-transitory. Memory 2102 may also include high-speed random access memory, and non-volatile memory, such as one or more disk storage devices, flash memory storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 2102 is used to store a computer program configured to be executed by one or more processors to implement the above-mentioned control method of virtual objects.
  • the terminal device 2100 optionally further includes: a peripheral device interface 2103 and at least one peripheral device.
  • the processor 2101, the memory 2102 and the peripheral device interface 2103 may be connected through a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 2103 through a bus, a signal line or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 2104, a display screen 2105, an audio circuit 2107, and a power supply 2108.
  • FIG. 22 does not constitute a limitation on the terminal device 2100, and may include more or fewer components than shown, or combine certain components, or adopt different component arrangements.
  • a computer-readable storage medium is also provided, and a computer program is stored in the storage medium.
  • when executed by a processor, the computer program implements the virtual object control method described above.
  • the computer-readable storage medium may include: ROM (Read-Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), an optical disc, or the like.
  • the random access memory can include ReRAM (Resistance Random Access Memory) and DRAM (Dynamic Random Access Memory).
  • a computer program product including computer instructions stored in a computer-readable storage medium.
  • the processor of the terminal device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the terminal device executes the above control method of the virtual object.


Abstract

A virtual object control method, apparatus, device, storage medium, and program product, relating to the fields of computer and Internet technology. The method includes: displaying a user interface (320), the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities; in response to a first sliding operation whose starting position is located in a first trigger area, determining a first movement speed of the virtual object according to the sensitivity of the first trigger area (340); and controlling the virtual object to move at the first movement speed (360). A user can begin sliding operations in trigger areas of different sensitivities according to different needs, so that the virtual object can be controlled to move at different speeds, enriching the ways of controlling virtual objects and improving human-computer interaction efficiency.

Description

Virtual object control method, apparatus, device, storage medium, and program product
This application claims priority to Chinese patent application No. 202210822326.7, filed on July 12, 2022 and entitled "Virtual object control method, apparatus, device, storage medium, and program product", the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the fields of computer and Internet technology, and in particular to a virtual object control method, apparatus, device, storage medium, and program product.
Background
In a shooting-game match, a user can use a joystick control to control the movement of a virtual object.
In the related art, when the user's finger touches the joystick control, the direction of the finger's sliding operation controls the movement direction of the virtual object.
However, in the related art described above, the ways of controlling the virtual object are rather limited.
Summary
The embodiments of this application provide a virtual object control method, apparatus, device, storage medium, and program product. The technical solutions are as follows:
According to one aspect of the embodiments of this application, a virtual object control method is provided. The method is performed by a terminal device and includes:
displaying a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities;
in response to a first sliding operation whose starting position is located in a first trigger area, determining a first movement speed of the virtual object according to the sensitivity of the first trigger area, the first trigger area being one of the multiple trigger areas; and
controlling the virtual object to move at the first movement speed.
According to another aspect of the embodiments of this application, a virtual object control method is provided. The method is performed by a terminal device and includes:
displaying a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas;
in response to a first sliding operation whose starting position is located in a first trigger area, controlling the virtual object to move at a first movement speed; and
in response to a second sliding operation whose starting position is located in a second trigger area, controlling the virtual object to move at a second movement speed;
wherein the first trigger area and the second trigger area are each one of the multiple trigger areas, the first trigger area differs from the second trigger area, and the first movement speed differs from the second movement speed.
According to one aspect of the embodiments of this application, a virtual object control apparatus is provided, the apparatus including:
an interface display module, configured to display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities;
a speed determination module, configured to, in response to a first sliding operation whose starting position is located in a first trigger area, determine a first movement speed of the virtual object according to the sensitivity of the first trigger area, the first trigger area being one of the multiple trigger areas; and
a movement control module, configured to control the virtual object to move at the first movement speed.
According to one aspect of the embodiments of this application, a virtual object control apparatus is provided, the apparatus including:
an interface display module, configured to display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas;
a movement control module, configured to, in response to a first sliding operation whose starting position is located in a first trigger area, control the virtual object to move at a first movement speed;
the movement control module being further configured to, in response to a second sliding operation whose starting position is located in a second trigger area, control the virtual object to move at a second movement speed;
wherein the first trigger area and the second trigger area are each one of the multiple trigger areas, the first trigger area differs from the second trigger area, and the first movement speed differs from the second movement speed.
According to one aspect of the embodiments of this application, a terminal device is provided, the terminal device including a processor and a memory, the memory storing a computer program that is loaded and executed by the processor to implement the above method.
According to one aspect of the embodiments of this application, a computer-readable storage medium is provided, the storage medium storing a computer program that is loaded and executed by a processor to implement the above method.
According to one aspect of the embodiments of this application, a computer program product is provided, the computer program product including computer instructions stored in a computer-readable storage medium. A processor of a terminal device reads the computer instructions from the computer-readable storage medium and executes them, causing the terminal device to perform the above method.
The technical solutions provided by the embodiments of this application may bring the following beneficial effects:
By setting multiple trigger areas corresponding to different sensitivities, determining the movement speed of the virtual object according to the sensitivity of the trigger area in which the starting position of the user's sliding operation lies, and controlling the virtual object to move at that speed, the user can begin sliding operations in trigger areas of different sensitivities according to different needs, so that the virtual object can be controlled to move at different speeds, enriching the ways of controlling virtual objects and improving human-computer interaction efficiency.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an implementation environment for the solution provided by one embodiment of this application;
FIG. 2 is a schematic diagram of a virtual object control method provided by one embodiment of this application;
FIG. 3 is a flowchart of a virtual object control method provided by one embodiment of this application;
FIG. 4 is a schematic diagram of a user interface provided by one embodiment of this application;
FIG. 5 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 6 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 7 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 8 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 9 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 10 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 11 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 12 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 13 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 14 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 15 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 16 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 17 is a schematic diagram of a user interface provided by another embodiment of this application;
FIG. 18 is a flowchart of a virtual object control method provided by another embodiment of this application;
FIG. 19 is a block diagram of a virtual object control apparatus provided by one embodiment of this application;
FIG. 20 is a block diagram of a virtual object control apparatus provided by another embodiment of this application;
FIG. 21 is a block diagram of a virtual object control apparatus provided by another embodiment of this application;
FIG. 22 is a structural block diagram of a terminal device provided by one embodiment of this application.
Detailed Description
Referring to FIG. 1, which shows a schematic diagram of an implementation environment for the solution provided by one embodiment of this application. The implementation environment may be realized as a virtual object control system and may include: a terminal device 10 and a server 20.
The terminal device 10 may be an electronic device such as a mobile phone, tablet computer, game console, e-book reader, multimedia playback device, wearable device, PC (Personal Computer), or in-vehicle terminal. A client of a target application (such as a game application) may be installed on the terminal device 10. Optionally, the target application may be an application that needs to be downloaded and installed, or a click-to-run application, which is not limited in the embodiments of this application.
In the embodiments of this application, the target application may be a shooting application, a racing application, a multiplayer online battle arena game, or the like, which is not limited in this application. In some embodiments, the target application is a shooting application that can provide a virtual environment in which a virtual object operated by the user moves. Typically, the shooting application may be a TPS (Third-Person Shooting) game, an FPS (First-Person Shooting) game, a MOBA (Multiplayer Online Battle Arena) game, a multiplayer gunfight survival game, a VR (Virtual Reality) shooting application, an AR (Augmented Reality) application, a three-dimensional map program, a social application, an interactive entertainment application, or any other application with shooting functionality. In addition, different applications may provide virtual objects of different forms or shapes with correspondingly different functions, which can be designed according to actual needs and are not limited in the embodiments of this application. Optionally, a client of the above application runs on the terminal device 10. In some embodiments, the application is developed based on a three-dimensional virtual environment engine, for example the Unity engine, which can construct three-dimensional virtual environments, virtual objects, virtual props, and the like, giving the user a more immersive gaming experience.
The virtual environment is a scene displayed (or provided) when the client of the target application (such as a game application) runs on the terminal device; it refers to a scene created for virtual objects to act in (for example, game competition), such as a virtual house, a virtual island, or a virtual map. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be two-dimensional, 2.5-dimensional, or three-dimensional, which is not limited in the embodiments of this application.
The virtual object refers to a virtual character, virtual vehicle, virtual item, or the like controlled by a user account in the target application, which is not limited in this application. Taking a game application as the target application as an example, a virtual object is a game character controlled by a user account in the game application. The virtual object may be in human, animal, cartoon, or other form, and may be displayed in three-dimensional or two-dimensional form, none of which is limited in the embodiments of this application. Optionally, when the virtual environment is three-dimensional, the virtual object is a three-dimensional model created based on skeletal animation technology; each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of its space. Optionally, the virtual object is a virtual vehicle in the virtual environment, such as a virtual car, virtual hot-air balloon, or virtual motorcycle, that can be controlled by the user.
The server 20 is configured to provide background services for the client of the target application installed and running on the terminal device 10. For example, the server 20 may be the background server of the above game application. The server 20 may be one server, a server cluster composed of multiple servers, or a cloud computing service center. Optionally, the server 20 provides background services for the target applications on multiple terminal devices 10 simultaneously.
The terminal device 10 and the server 20 can communicate with each other over a network.
Referring to FIG. 2, which shows a schematic diagram of a virtual object control method provided by one embodiment of this application. A user interface is displayed on the terminal device 10 shown in FIG. 1, and multiple trigger areas are set on the user interface: trigger area z1, trigger area z2, and trigger area z3. The three trigger areas correspond to different sensitivities, and the sensitivity of a trigger area is related to the movement speed of the virtual object. When the starting position of the user's sliding operation is located in trigger area z1 among the multiple trigger areas, the movement speed of the virtual object is determined according to the sensitivity of trigger area z1, and the virtual object is controlled to move at that speed.
In the technical solution provided by the embodiments of this application, multiple trigger areas are set on the user interface; different trigger areas correspond to different sensitivities, and different sensitivities correspond to different movement speeds of the virtual object. When the starting position of the user's sliding operation is in a target trigger area, the movement speed of the virtual object is determined according to the sensitivity of the target trigger area, and the virtual object is controlled to move at that speed. In other words, before a match begins, the user can configure the trigger areas according to personal operating habits, including but not limited to their size and sensitivity. During the match, the user can adjust the starting position of the sliding operation according to the real-time situation so that the movement speed of the virtual object changes, further improving the user's control over the virtual object and enhancing the user's match experience.
Referring to FIG. 3, which shows a flowchart of a virtual object control method provided by one embodiment of this application. Each step of the method may be performed by the terminal device 10 in the implementation environment shown in FIG. 1; for example, each step may be performed by the client of the target application. In the following method embodiments, for ease of description, the execution subject of each step is referred to simply as the "client". The method may include at least one of the following steps (320-360):
Step 320: display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities.
Joystick control: also called a virtual joystick, consisting of a wheel portion and a stick portion. The wheel portion is the operable range of the virtual joystick; when the user performs no operation, the position of the stick portion does not change. Optionally, the stick portion slides within the range of the wheel portion as the finger slides, and the user can slide the stick portion anywhere within the wheel portion. In some embodiments, the joystick control can control the movement direction of the virtual object.
A trigger area is a specific region of the interface; different trigger areas correspond to different sensitivities. The area may be set by the server, or set or adjusted by the user. The embodiments of this application do not limit the size or shape of a trigger area. For example, a trigger area may be rectangular, circular, or a rounded rectangle, and its size can be reasonably set in combination with the interface layout. In addition, different trigger areas may have the same or different sizes and shapes, which is not limited in this application.
Sensitivity in the embodiments of this application refers to movement sensitivity: the same sliding operation performed by the user in different trigger areas controls the virtual object to move at different speeds. In some embodiments, for sliding operations performed by the user on the user interface, the trigger area in which the starting position of the sliding operation lies differs, and different trigger areas correspond to different movement speeds of the virtual object. Optionally, in one trigger area the corresponding movement speed of the virtual object is 10 m/s, and in another trigger area it is 20 m/s.
Referring to FIG. 4, which shows a schematic diagram of a user interface provided by one embodiment of this application. The user interface displays a joystick control Y1 for controlling movement of the virtual object. The joystick control has two trigger areas, trigger area Q1 and trigger area Q2, corresponding to different sensitivities. Optionally, the sensitivity of trigger area Q1 is x and the sensitivity of trigger area Q2 is y, where x and y are positive numbers. As shown in FIG. 4, the joystick control Y1 is in trigger area Q1, so the sensitivity of the joystick control Y1 is x.
Step 340: in response to a first sliding operation whose starting position is located in a first trigger area, determine a first movement speed of the virtual object according to the sensitivity of the first trigger area; the first trigger area is one of the multiple trigger areas.
The first sliding operation is an action performed by the user. Optionally, if the terminal device is a handheld device, the user's first sliding operation is performed directly on the terminal device, for example a slide, press, or drag operation on a phone screen. Optionally, if the terminal device is not handheld, the user's first interaction may be performed on a peripheral of the terminal device, for example a double-click of a mouse, a key press on a keyboard, or a click or shake of a gamepad; this application does not limit the type of the first sliding operation. In some embodiments, a sliding operation has a starting position and a real-time position. For example, for a sliding operation on a phone screen, the starting position is where the finger first touches the screen; the finger then slides across the screen, performing the sliding operation, and the position the finger currently touches is the real-time position of the sliding operation; when the finger leaves the screen, the sliding operation ends.
Referring to FIG. 5, which shows a schematic diagram of a user interface provided by another embodiment of this application. The user interface displays a joystick control for controlling movement of the virtual object. The joystick control has three trigger areas: trigger area Q3, trigger area Q4, and trigger area Q5, corresponding to different sensitivities. Optionally, the sensitivity of trigger area Q3 is 10, that of trigger area Q4 is 8, and that of Q5 is 5. As shown in FIG. 5, the joystick control is in trigger area Q5, so the sensitivity of the joystick control is 5.
In some embodiments, the server presets the movement speeds corresponding to different sensitivities. Optionally, sensitivity 1 means the virtual object's movement speed is 1 m/s, sensitivity 2 means 2 m/s, and so on. In FIG. 5, since the sensitivity of trigger area Q5 is 5, the first movement speed of the virtual object is determined to be 5 m/s.
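The preset mapping described above (sensitivity 1 → 1 m/s, sensitivity 2 → 2 m/s, and so on) is a direct proportional rule; a minimal sketch, where the scale factor name is an assumption for illustration only:

```python
def speed_from_sensitivity(sensitivity, metres_per_unit=1.0):
    # Direct mapping from a trigger area's sensitivity to a movement
    # speed, following the text's example (sensitivity 5 -> 5 m/s).
    # `metres_per_unit` is an assumed server-side scale factor.
    return sensitivity * metres_per_unit
```

A server could tune `metres_per_unit` to rescale all trigger areas at once without changing their relative ordering.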
In some embodiments, the sensitivities of the trigger areas can be customized by the user, either before entering a game match or after entering it, according to the real-time situation of the match. Optionally, the initially set sensitivities of trigger areas Q3, Q4, and Q5 in FIG. 5 are 10, 8, and 5 respectively, but if the user decides that the sensitivity of trigger area Q5 need not reach 5, the user can set the sensitivity of Q5 personally. After the sensitivity is adjusted, when the sensitivity of the joystick control in trigger area Q5 changes, the movement speed of the virtual object changes accordingly in response to the user's sliding operation; for example, if the sensitivity of Q5 is adjusted from 5 to 3, the movement speed of the virtual object becomes 3 m/s.
In some embodiments, no overlapping area exists between any two of the multiple trigger areas; for example, in the user interfaces shown in FIG. 4 and FIG. 5, the trigger areas do not overlap. The trigger area in which the starting position of the user's sliding operation falls is determined as the selected trigger area; for example, the trigger area in which the starting position of the first sliding operation falls is determined as the first trigger area. Optionally, when the starting position of the sliding operation falls on the boundary between trigger areas, either the trigger area near the boundary with the highest sensitivity or the one with the lowest sensitivity may be set as the first trigger area. Optionally, if the starting position of the sliding operation falls on the boundary B1 (the dotted line) of the user interface shown in FIG. 4, trigger area Q1 or trigger area Q2 is determined as the first trigger area, or whichever of Q1 and Q2 has the higher sensitivity is determined as the first trigger area.
In some embodiments, at least two of the multiple trigger areas overlap. When the first sliding operation is detected, the starting position of the first sliding operation is obtained; the distance between the starting position of the first sliding operation and the reference point of each trigger area is determined, where the reference points of the trigger areas are at mutually different positions; and from the multiple trigger areas, the trigger area with the smallest distance is determined as the first trigger area. "Two trigger areas overlap" as used in the embodiments of this application means the two trigger areas share an overlapping region while each also has a non-overlapping region; that is, the two areas overlap only partially, not completely. For example, in the user interface shown in FIG. 6 there are trigger areas Q6, Q7, and Q8, which overlap one another: Q6 and Q7 share an overlapping portion, but Q6 has a portion not overlapping Q7, and Q7 likewise has a portion not overlapping Q6. Optionally, the trigger area in which the starting position of the sliding operation falls is determined as the first trigger area; when the starting position falls in an overlapping region, either the trigger area with the highest sensitivity or the one with the lowest sensitivity near the boundary may be set as the first trigger area. Optionally, based on the distance between the starting position of the sliding operation and the reference point of each trigger area, the nearest one is selected as the chosen trigger area. Optionally, the reference point is the center of the trigger area or another position that characterizes it. As shown in FIG. 6, the starting position of the sliding operation is D0, D1 is the center of Q6, D2 is the center of Q7, and D3 is the center of Q8. Here D0 lies in the overlapping region of Q6 and Q7; comparing the distance from D0 to D1 with the distance from D0 to D2, the distance from D0 to D2 is smaller, so trigger area Q7, where D2 lies, is determined as the selected first trigger area.
In the embodiments of this application, on the one hand, arranging multiple trigger areas so that they overlap reduces the interface area they occupy and avoids affecting the layout of other controls in the interface; on the other hand, when multiple trigger areas overlap, selecting the trigger area closest to the operation position as the user's chosen trigger area ensures the trigger area is selected correctly.
The embodiments of this application do not limit the region, shape, or arrangement of the trigger areas for sliding operations. For example, the three trigger areas in FIG. 6 are distributed vertically; in some embodiments they may also be distributed horizontally, as with the horizontally distributed trigger areas Q9, Q10, and Q11 shown in FIG. 7. In the embodiments of this application, the positions of the multiple trigger areas are arranged in order of increasing or decreasing sensitivity. Optionally, the sensitivities of the horizontally distributed trigger areas Q9, Q10, and Q11 in FIG. 7 increase; optionally, the sensitivities of trigger areas Q6, Q7, and Q8 in FIG. 6 decrease. This avoids accidental touches of the wrong trigger area; a conventional increasing or decreasing order is easier for the user to remember and more convenient to operate.
With the technical solution provided by the embodiments of this application, configuring the trigger areas can satisfy different user needs. When the user does not want the trigger areas to occupy too much of the user interface, they can be arranged horizontally to reduce the area they occupy. When the user needs larger trigger areas, vertically distributed trigger areas can be chosen. The larger a trigger area, the lower the demand on the user's operation; the smaller it is, the higher the demand. The solution therefore satisfies the needs of different users: it is friendly to novice players while also meeting the needs of experienced players, providing a better user experience.
Step 360: control the virtual object to move at the first movement speed.
After determining the first movement speed of the virtual object, the client can control the virtual object to move at that first movement speed.
Optionally, besides the sensitivity of the trigger area, the first movement speed of the virtual object is also related to various other factors, such as the vehicle the virtual object rides, the environment the virtual object is in, and what the virtual object wears. See the following embodiments for details, which are not repeated here.
In some embodiments, the user interface displays the numeric value of the first movement speed so that the user knows the current movement speed of the controlled virtual object and can adjust the starting position of the sliding operation at any time according to the real-time situation of the virtual environment to obtain the different movement speeds corresponding to different trigger areas; with this numeric information the user can adjust operations in time, making matches more strategic. In some embodiments, different movement speeds correspond to different animation effects, giving the user a stronger sense of immersion in the virtual object and a better experience.
In the technical solution provided by the embodiments of this application, multiple trigger areas corresponding to different sensitivities are set; the movement speed of the virtual object is determined according to the sensitivity of the trigger area in which the starting position of the user's sliding operation lies, and the virtual object is controlled to move at that speed. The user can begin sliding operations in trigger areas of different sensitivities according to different needs, so the virtual object can be controlled to move at different speeds, enriching the ways of controlling virtual objects and improving human-computer interaction efficiency.
Meanwhile, the technical solution provided by the embodiments of this application spares the user from manually setting the sensitivity during a match, simplifying the user's operations and helping improve the flexibility and efficiency of the user's movement control of the virtual object.
Referring to FIG. 8, which shows a flowchart of a virtual object control method provided by another embodiment of this application. Each step of the method may be performed by the terminal device 10 in the implementation environment shown in FIG. 1; for example, each step may be performed by the client of the target application. In the following method embodiments, for ease of description, the execution subject of each step is referred to simply as the "client". The method may include at least one of the following steps (320-360):
Step 320: display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities.
Step 330: in response to a first sliding operation whose starting position is located in a first trigger area, display the joystick control at the starting position of the first sliding operation.
The first trigger area is one of the multiple trigger areas.
In some embodiments, the joystick control is moved to the starting position of the first sliding operation for display, that is, the process of moving from the original display position to that starting position is displayed; or, the joystick control is removed from its original display position and displayed at the starting position of the first sliding operation.
Step 340-1: determine the first movement speed of the virtual object according to the sensitivity of the first trigger area and attribute information of the first sliding operation.
The attribute information is information related to the first sliding operation. Optionally, the attribute information includes: the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
In some embodiments, the distance between the real-time position and the starting position of the first sliding operation is positively correlated with the movement speed. For example, if the distance between the real-time position and the starting position of the first sliding operation is 1 cm, the movement speed is 1 m/s; if the distance is 2 cm, the movement speed is 2 m/s; and so on.
In some embodiments, the first movement speed of the virtual object is determined from the sensitivity of the first trigger area and the attribute information of the first sliding operation. In the above embodiments, the sensitivity can correspond to different movement speeds and the attribute information of the sliding operation can also correspond to different movement speeds. The movement speed corresponding to the sensitivity of the trigger area in which the starting position of the sliding operation lies may be recorded as one candidate speed, and the movement speed corresponding to the attribute information of the first sliding operation as another candidate speed; the two are compared, and the larger one is determined as the first movement speed of the virtual object.
In some embodiments, the sensitivity of the first trigger area and the attribute information of the first sliding operation jointly determine the first movement speed of the virtual object. Optionally, the sensitivity of the first trigger area and the attribute information of the first sliding operation correspond to different weights; the movement speed corresponding to the sensitivity of the trigger area in which the starting position of the sliding operation lies and the movement speed corresponding to the attribute information of the first sliding operation are combined, and the final movement speed is determined according to their respective weights.
In some embodiments, a sensitivity correction parameter is determined according to the distance, the sensitivity correction parameter being used to adjust the sensitivity of the trigger area; and the first movement speed of the virtual object is determined according to the sensitivity correction parameter and the sensitivity of the first trigger area.
In some embodiments, the sensitivity correction parameter is related to the distance: when the distance changes, the sensitivity correction parameter changes too, that is, the distance and the sensitivity correction parameter are positively correlated. Optionally, different distance intervals correspond to different sensitivity correction parameters; for example, when the distance is within the interval a1 to b1 the sensitivity correction parameter is d1, and when the distance is within the interval a2 to b2 it is d2. In other words, the sensitivity correction parameter can be understood as an interval function whose value differs per interval, where a1, b1, d1, a2, b2, and d2 are all positive numbers.
The corrected sensitivity is determined from the sensitivity correction parameter and the sensitivity of the first trigger area, and the first movement speed of the virtual object is determined from the corrected sensitivity. The corrected sensitivity may be obtained from the correction parameter and the sensitivity by addition or by multiplication; the specific algorithm is not limited in this application.
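The interval-function correction described above can be sketched as a lookup table over distance ranges. All boundary and correction values below are illustrative placeholders (the text only requires that a1, b1, d1, a2, b2, d2 be positive), and the multiplicative combination is one assumed choice among the options the text leaves open:

```python
# Interval table mapping drag-distance ranges to correction parameters;
# boundaries and values are invented for illustration.
CORRECTION_INTERVALS = [
    (0.0, 1.0, 1.0),   # distance in [0, 1) cm -> no correction
    (1.0, 2.0, 1.2),   # distance in [1, 2) cm -> boost by 20%
    (2.0, 4.0, 1.5),   # distance in [2, 4) cm -> boost by 50%
]

def correction_parameter(distance):
    for lower, upper, d in CORRECTION_INTERVALS:
        if lower <= distance < upper:
            return d
    return CORRECTION_INTERVALS[-1][2]  # clamp beyond the last interval

def corrected_speed(sensitivity, distance):
    # Combine the trigger area's sensitivity with the distance-based
    # correction parameter multiplicatively (addition would also fit
    # the text) to obtain the first movement speed.
    return sensitivity * correction_parameter(distance)
```

Because each interval's correction grows with distance, dragging farther from the starting position raises the corrected sensitivity and therefore the movement speed, matching the positive correlation stated above.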
In some embodiments, the attribute information is information related to the first sliding operation. Optionally, the attribute information includes attribute information of the first sliding operation other than the distance, for example the pressure value of the first sliding operation. Optionally, the pressure value of the sliding operation is positively correlated with the movement speed of the virtual object: when the pressure value of the sliding operation is a first pressure value, the movement speed of the virtual object is a first speed; when it is a second pressure value, the movement speed is a second speed; the first pressure value is greater than the second pressure value, and the first speed is greater than the second speed.
In some embodiments, the attribute information includes other attribute information of the first sliding operation, for example the size of the trigger area covered by the first sliding operation. Optionally, the size of the trigger area covered by the sliding operation is positively correlated with the movement speed of the virtual object: when the covered size is a first area the movement speed of the virtual object is a third speed, and when it is a second area the movement speed is a fourth speed; since the first area is greater than the second area, the third speed is greater than the fourth speed.
In some embodiments, the movement speed of the virtual object is also related to the position/region of the virtual environment the virtual object is currently in (such as flat ground, grass, snow, or a river). In some embodiments, it is related to the complexity of the virtual environment the virtual object is currently in. Optionally, if the virtual environment the virtual object is in is snow, the movement speed of the virtual object is reduced; optionally, when the virtual object is on flat ground, its movement speed is considerably higher than the movement speed of a virtual object in snow.
In the technical solution provided by the embodiments of this application, the attribute information of the sliding operation and the sensitivity of the trigger area jointly determine the movement speed of the virtual object, which better matches reality and gives the user finer and more accurate control over the virtual object.
Step 360: control the virtual object to move at the first movement speed.
In some embodiments, the method further includes at least one of the following steps (361-365, not shown in FIG. 8).
Step 361: in response to a setting operation for the trigger areas, display the range frames corresponding to the multiple trigger areas.
In some embodiments, the user may configure the trigger areas before a match begins or after it begins, which is not limited in this application. This application does not limit the type of the setting operation either; it may be a tap on a trigger area or a configuration of the trigger areas via other controls. In response to the setting operation for the trigger areas, the range frames corresponding to the multiple trigger areas are displayed; the range frames may be displayed highlighted or as plain lines, and the specific display manner is not limited in this application.
Step 362: in response to a deletion operation for a target trigger area among the multiple trigger areas, cancel display of the range frame corresponding to the target trigger area.
In some embodiments, the user can perform a deletion operation on unneeded trigger areas. In the embodiment shown in FIG. 6, the user can perform a deletion operation on trigger area Q6; optionally, if the user considers trigger area Q6 of little practical use and does not need it, the user can delete trigger area Q6.
Step 363: in response to an adjustment operation for a target trigger area among the multiple trigger areas, adjust at least one of the size and position of the range frame corresponding to the target trigger area.
In some embodiments, the user can adjust the size and position of a trigger area's range frame. Optionally, the user can adjust the size of the range frame through a first adjustment operation, for example enlarging it; optionally, the user can adjust the position of the range frame through a second adjustment operation, for example dragging the range frame to a desired position.
Step 364: in response to an addition operation for a trigger area, display the range frame corresponding to the newly added trigger area.
In some embodiments, the user can add a trigger area, and the range frame corresponding to the newly added trigger area is displayed at the same time.
Step 365: in response to a setting-completion operation for the trigger areas, set the multiple trigger areas corresponding to the joystick control according to the size and position of the range frame corresponding to each currently displayed trigger area.
With the technical solution provided by the embodiments of this application, the size and position of the range frames of the trigger areas can be adjusted, and trigger areas can be added or deleted, satisfying different user needs and suiting different groups of users. Novice users can set a small number of large trigger areas to avoid accidental touches; experienced users can set many small trigger areas to refine their match skills and increase the possibilities of human-computer interaction, for a better user experience.
In some embodiments, the method further includes at least one of the following steps (366-368, not shown in FIG. 8):
Step 366: obtain real-time competitive data related to the virtual object, the real-time competitive data including at least one of the following: real-time attribute data of the virtual object, real-time environment data of the virtual object, real-time equipment data of the virtual object, and the real-time distance between the current position and a desired position of the virtual object.
In some embodiments, the real-time attribute data of the virtual object may be the current state of the virtual object, for example its health value, whether it is injured, whether it is continuously losing health, and so on.
In some embodiments, the real-time environment data of the virtual object may be data about the real-time environment the virtual object is currently in, for example whether the virtual object is inside the poison circle, in an unsafe swamp area, or in a lightning zone.
In some embodiments, the real-time equipment data of the virtual object may be data about the equipment currently held by the virtual object, for example the number of virtual props, the amount of virtual ammunition, and so on.
In some embodiments, the real-time distance between the current position and the desired position of the virtual object may be the distance between where the virtual object currently is and where it is expected to be. The desired position may be marked by the user or predicted by the server; for example, the desired position may be the center of the safe zone or a target position marked by the user.
Step 367: determine a recommended trigger area from the multiple trigger areas according to the real-time competitive data.
In some embodiments, the recommended trigger area is determined from the multiple trigger areas according to the real-time competitive data. For example, when the virtual object's health value is low, its equipment is poor, and the region it is in is unsafe, the trigger area with the highest sensitivity is determined from the multiple trigger areas as the recommended trigger area. For example, when the virtual object's health is very good and its equipment is good, a trigger area with lower sensitivity is determined from the multiple trigger areas as the recommended trigger area. Optionally, the recommended trigger area is one of the multiple trigger areas.
In some embodiments, the real-time competitive data is processed by a speed prediction model to predict the desired movement speed of the virtual object, where the speed prediction model is a machine learning model built based on a neural network; the recommended trigger area is determined from the multiple trigger areas according to the desired movement speed.
Step 368: display prompt information corresponding to the recommended trigger area.
In some embodiments, the prompt information is the joystick control, which is displayed in the recommended trigger area among the multiple trigger areas. Optionally, the joystick control is displayed at the reference point of the recommended trigger area; optionally, the reference point is the center of the recommended trigger area. Optionally, as shown in FIG. 5, the joystick control T0 is displayed directly on the recommended trigger area of the user interface.
In some embodiments, the prompt information is the recommended trigger area itself: the recommended trigger area among the multiple trigger areas is displayed in a manner that distinguishes it from the other trigger areas, for example displayed highlighted in the user interface. Optionally, as shown in FIG. 5, trigger area Q3 is highlighted on the user interface as the recommended trigger area.
In the technical solution provided by the embodiments of this application, a recommended trigger area is determined according to the real-time competitive data of the virtual object, so a recommendation can be given in real time based on the virtual object's situation. When the virtual object is in danger or in an unsafe state, the user can use the recommended trigger area directly without being distracted by deciding which trigger area to use, reducing the user's reaction time and enhancing the competitive experience. It also improves the accuracy and efficiency of trigger area selection, so that the user can quickly and accurately select the trigger area suited to the current competitive scene and control the virtual object to move at a speed suited to that scene.
Referring to FIG. 9, which shows a flowchart of a virtual object control method provided by another embodiment of this application. Each step of the method may be performed by the terminal device 10 in the implementation environment shown in FIG. 1; for example, each step may be performed by the client of the target application. In the following method embodiments, for ease of description, the execution subject of each step is referred to simply as the "client". The method may include at least one of the following steps (320-380):
Step 320: display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple direction intervals, different direction intervals corresponding to different movement directions.
Step 370: in response to a first sliding operation on the joystick control, determine, from the multiple direction intervals, the first direction interval to which the real-time direction belongs, according to the real-time direction of the real-time position of the first sliding operation relative to the joystick control.
In the technical solution provided by the embodiments of this application, not only can the movement speed of the virtual object be determined from the sensitivity of the target trigger area corresponding to the starting position of the sliding operation, but the first direction interval to which the real-time direction belongs can also be determined from the multiple direction intervals according to the real-time direction of the sliding operation's real-time position relative to the joystick control.
This application does not limit the number of direction intervals. Optionally, the number of direction intervals is 8.
In some embodiments, in the schematic user interface shown in FIG. 10, the first direction interval to which the real-time direction belongs is determined from the multiple direction intervals according to the real-time direction of the real-time position of the user's first sliding operation relative to joystick control L1. In some embodiments, FIG. 10 can be divided into 10 direction intervals; taking the upper half as an example, it can be divided into up, upper-left, upper-right, left, and right, five direction intervals in total. In some embodiments, each direction interval corresponds to a certain angular range. Optionally, as shown in FIG. 11, the real-time direction of the real-time position of the first sliding operation relative to joystick control L2 is the direction pointed to by arrow m3; the direction of m3 falls within the direction interval enclosed by m1 and m2, so the "upper-left" direction interval enclosed by m1 and m2 is determined as the first direction interval, and the direction corresponding to the first direction interval, 45 degrees west of north, is determined as the movement direction of the virtual object, just as the movement direction of the virtual object pointed to by arrow m4 is 45 degrees west of north.
Specifically, taking the schematic diagram of the joystick control shown in FIG. 12 as an example, there are 8 direction intervals in total. The real-time position of the first interaction is E1 and the center of joystick control L3 is E0; the real-time direction of the first sliding operation's real-time position relative to the joystick control is the direction from E0 toward E1, and it can be determined that the direction from E0 toward E1 belongs to direction interval P1 (P1 corresponds to one of the eight direction intervals, the upper-right one).
In some embodiments, a method is provided for determining the first direction interval to which the real-time direction of the first sliding operation's real-time position relative to the joystick control belongs.
From the user's interaction, the number of pixels the user slides across the screen can be known, and hence the slid arc length can be obtained. Suppose the slid arc length is known to be 5 mm. The arc-length formula is L = n×π×r/180, that is, L = α×r, where n is the central angle in degrees, r is the radius, and L is the arc length subtended by the central angle. In a circle of radius R, since the arc subtended by a 360° central angle equals the circumference C = 2πr, the arc subtended by an n° central angle is l = n°πr÷180° (l = n°×2πr/360°). A sector's arc is in fact a segment of the circle's circumference; the sector's angle is a fraction of 360 degrees, so the sector's arc length is the same fraction of the circle's circumference, giving: sector arc length = 2πr×angle/360, where 2πr is the circumference and the angle is the angle value of the sector. For example, with radius 1 cm, an arc of 0.785 cm corresponds to a central angle given by l = nπr/180 = n×π×1/180 = n×3.14×1/180 = 0.785, so n = 45 degrees, that is, the angle corresponding to this arc is 45 degrees.
Based on determining the central angle from the arc length as above, the first direction interval to which the real-time direction of the first sliding operation's real-time position relative to the joystick control belongs is determined. Thus, from the arc length L corresponding to the n° central angle in FIG. 12, the specific value of n° can be computed back, and the direction interval to which the direction belongs is determined from the value of n°. Optionally, if the value of n is 75, the first direction interval to which the real-time direction of the first sliding operation's real-time position relative to the joystick control belongs is P2.
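The two computations above — inverting the arc-length formula L = n×π×r/180 to recover the central angle n, then bucketing the angle into one of the equal direction intervals — can be sketched as follows. The zero-based interval indexing and the assumption that the intervals partition the circle evenly are illustrative choices, not taken from the application:

```python
import math

def central_angle_degrees(arc_length, radius):
    # Invert the arc-length formula L = n * pi * r / 180 for n.
    return 180.0 * arc_length / (math.pi * radius)

def direction_interval_index(angle_degrees, num_intervals=8):
    # Map an angle (measured from the joystick's reference direction)
    # to one of `num_intervals` equal direction intervals, indexed 0..n-1.
    return int((angle_degrees % 360.0) // (360.0 / num_intervals))
```

With radius 1 and arc 0.785, the recovered central angle is about 45 degrees, matching the worked example above; with eight 45-degree intervals, an angle of 75 degrees falls in the second interval, consistent with the text's n = 75 landing in P2.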
Step 380: control the virtual object to move in the movement direction corresponding to the first direction interval.
In some embodiments, a movement direction is set for each direction interval. In some embodiments, the center direction of each direction interval serves as the movement direction corresponding to that direction interval. In FIG. 12, the movement direction corresponding to direction interval P1 is the direction from E0 toward F, where F is a point on the center direction of direction interval P1.
When the direction interval to which the real-time direction belongs changes from the first direction interval to a second direction interval, the movement direction of the virtual object is controlled to change gradually, within a first time period, from the movement direction corresponding to the first direction interval to the movement direction corresponding to the second direction interval, where the second direction interval is a direction interval adjacent to the first direction interval. Adjusting the direction gradually means the direction of the virtual object does not change instantaneously, giving the user a better experience.
In FIG. 12, when the real-time position of the sliding operation moves from point E1 to point E2, the direction interval changes from P1 to P2, and the movement direction of the virtual object is controlled to change gradually from the direction from E0 toward F1 to the direction from E0 toward F2, where the direction from E0 toward F2 is the movement direction corresponding to direction interval P2.
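The gradual transition described above can be sketched as a linear interpolation of the heading over the first time period. Linear easing and the shortest-angular-path rule are assumed details for illustration (the text only requires a gradual change between adjacent intervals):

```python
def eased_direction(old_deg, new_deg, elapsed, duration):
    # Linearly interpolate the movement direction from the old interval's
    # direction to the new interval's direction over `duration` seconds
    # (the "first time period"), taking the shortest angular path so a
    # 350-degree to 10-degree transition sweeps through 0, not through 180.
    if elapsed >= duration:
        return new_deg % 360.0
    delta = (new_deg - old_deg + 180.0) % 360.0 - 180.0
    return (old_deg + delta * (elapsed / duration)) % 360.0
```

Halfway through the period the heading sits halfway between the two intervals' directions, so the virtual object turns smoothly instead of snapping to the new direction.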
This application does not limit the order of the steps mentioned in the embodiments; all steps can be arranged and combined to form new embodiments.
In the technical solution provided by the embodiments of this application, dividing multiple direction intervals prevents the user-controlled virtual object from changing direction over-sensitively. Meanwhile, setting multiple direction intervals, each corresponding to one movement direction, considerably reduces the processing overhead of the terminal device. In some embodiments, when the competitive display stutters, setting the movement direction of the virtual object to the movement directions corresponding to the multiple direction intervals configured as in the embodiments of this application costs less of the terminal device's running resources, reduces stutter, and keeps the display smoother for a better user experience.
Referring to FIG. 13, which shows a flowchart of a virtual object control method provided by another embodiment of this application. Each step of the method may be performed by the terminal device 10 in the implementation environment shown in FIG. 1; for example, each step may be performed by the client of the target application. In the following method embodiments, for ease of description, the execution subject of each step is referred to simply as the "client". The method may include at least one of the following steps (320-1 to 394):
Step 320-1: display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object.
Step 390: in response to a first sliding operation on the joystick control, obtain the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
In some embodiments, as shown in FIG. 14, when the real-time position of the first sliding operation is G1, the distance between the real-time position and the starting position of the first sliding operation is the distance from G1 to the starting position G0 of the first sliding operation.
Step 392: display an automatic movement control when the distance is greater than or equal to a first threshold.
The automatic movement control is used to make the virtual object move automatically. In FIG. 14, when the real-time position of the first sliding operation is G1, the distance from G1 to the starting position G0 does not satisfy the first threshold, and the automatic movement control is not displayed. When the real-time position of the first sliding operation changes from G1 to G2, the distance from G2 to the starting position G0 satisfies the first threshold, and the automatic movement control H1 is displayed.
Step 394: when the automatic movement control is being displayed, if the distance is less than or equal to a second threshold, cancel display of the automatic movement control.
In FIG. 15, when the real-time position of the first sliding operation is G5, the distance from G5 to the starting position G6 satisfies the first threshold, and the automatic movement control H2 is displayed. When the real-time position of the first sliding operation changes from G5 to G4, the distance from G4 to the starting position G6 no longer satisfies the first threshold, and display of the automatic movement control H2 is cancelled.
The embodiments of this application do not limit the values of the first threshold and the second threshold. The automatic movement control is displayed when the distance between the real-time position and the starting position of the first sliding operation is greater than or equal to the first threshold, and its display is cancelled when the distance is less than or equal to the second threshold. This avoids the automatic movement control repeatedly disappearing and reappearing due to the user's repeated operations, improving the user experience while also reducing the processing load on the terminal device.
Referring to FIG. 16, which shows a flowchart of a virtual object control method provided by another embodiment of this application. Each step of the method may be performed by the terminal device 10 in the implementation environment shown in FIG. 1; for example, each step may be performed by the client of the target application. In the following method embodiments, for ease of description, the execution subject of each step is referred to simply as the "client". The method may include at least one of the following steps (410-430):
Step 410: display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities.
Step 420: in response to a first sliding operation whose starting position is located in a first trigger area, control the virtual object to move at a first movement speed.
Step 430: in response to a second sliding operation whose starting position is located in a second trigger area, control the virtual object to move at a second movement speed.
The first trigger area and the second trigger area are each one of the multiple trigger areas; the first trigger area differs from the second trigger area, and the first movement speed differs from the second movement speed.
In the user interface shown in FIG. 17, in response to a sliding operation whose starting position is in trigger area Q22, the virtual object is controlled to move at low speed; in response to a sliding operation whose starting position is in trigger area Q21, the virtual object is controlled to move at medium speed; and in response to a sliding operation whose starting position is in trigger area Q20, the virtual object is controlled to move at high speed. As shown in FIG. 17, when a virtual object of the enemy camp appears in the virtual environment, the virtual object (of our camp) needs to be controlled to move at a relatively low speed, so the starting position of the sliding operation can be placed in trigger area Q21, making it easier for the virtual object to aim at the enemy camp's virtual object. When the virtual object needs to move quickly, for example when running from the poison, the starting position of the sliding operation can be placed in trigger area Q20 to control the virtual object to move at a higher speed. When the virtual object is crawling, the starting position of the sliding operation can be placed in trigger area Q22 to control the virtual object to move slowly.
In some embodiments, in response to the first sliding operation, the joystick control is displayed at the starting position of the first sliding operation. See the above embodiments for details, which are not repeated here.
In some embodiments, in response to a setting operation for the trigger areas, the range frames corresponding to the multiple trigger areas are displayed; in response to a deletion operation for a target trigger area among the multiple trigger areas, display of the range frame corresponding to the target trigger area is cancelled; or, in response to an adjustment operation for a target trigger area among the multiple trigger areas, at least one of the size and position of the range frame corresponding to the target trigger area is adjusted; or, in response to an addition operation for a trigger area, the range frame corresponding to the newly added trigger area is displayed; and in response to a setting-completion operation for the trigger areas, the multiple trigger areas corresponding to the joystick control are set according to the size and position of the range frame corresponding to each currently displayed trigger area. See the above embodiments for details, which are not repeated here.
In some embodiments, no overlapping area exists between any two of the multiple trigger areas; or, at least two of the multiple trigger areas share an overlapping area. See the above embodiments for details, which are not repeated here.
In some embodiments, the joystick control is displayed in the recommended trigger area among the multiple trigger areas; or, the recommended trigger area among the multiple trigger areas is displayed in a manner that distinguishes it from the other trigger areas. See the above embodiments for details, which are not repeated here.
The steps mentioned in the embodiments of this application are not limited to the several embodiments listed in this application; the steps can be combined with one another to form new embodiments, which is not limited in this application.
With the technical solution provided by the embodiments of this application, sliding operations in different trigger areas can control the virtual object to move at different movement speeds, refining the user's control of the virtual object and satisfying the different needs of users in different situations.
Referring to FIG. 18, which shows a flowchart of a virtual object control method provided by another embodiment of this application. Each step of the method may be performed by the terminal device 10 in the implementation environment shown in FIG. 1; for example, each step may be performed by the client of the target application. In the following method embodiments, for ease of description, the execution subject of each step is referred to simply as the "client". The method may include at least one of the following steps (S1-S4):
After the game match starts, the match is entered and step S1 begins.
Step S1: determine whether the movement area is touched; if so, enter the moving state; if not, re-enter the game match.
After the moving state is entered, perform step S2.
Step S2: determine whether the slide moves into another interval; if so, change the movement direction; if not, re-enter the moving state.
After the change is complete, perform step S3.
Step S3: determine whether the slide has moved upward a certain distance; if so, display the lock-run button.
Step S4: determine whether the run button is touched; if so, lock the running state; if not, continue displaying the lock-run button.
The game match ends.
The following are apparatus embodiments of this application, which can be used to perform the method embodiments of this application. For details not disclosed in the apparatus embodiments, refer to the method embodiments of this application.
Referring to FIG. 19, which shows a block diagram of a virtual object control apparatus provided by one embodiment of this application. The apparatus has the functions of implementing the above method examples; the functions may be implemented by hardware, or by hardware executing corresponding software. The apparatus may be the terminal device described above, or may be provided in the terminal device. As shown in FIG. 19, the apparatus 1800 may include: an interface display module 1810, a speed determination module 1820, and a movement control module 1830.
The interface display module 1810 is configured to display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas, different trigger areas corresponding to different sensitivities.
The speed determination module 1820 is configured to, in response to a first sliding operation whose starting position is located in a first trigger area, determine a first movement speed of the virtual object according to the sensitivity of the first trigger area.
The movement control module 1830 is configured to control the virtual object to move at the first movement speed.
In some embodiments, the speed determination module 1820 is configured to determine the first movement speed of the virtual object according to the sensitivity of the first trigger area and attribute information of the first sliding operation; the first trigger area is one of the multiple trigger areas.
In some embodiments, the attribute information includes: the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
In some embodiments, the speed determination module 1820 is further configured to determine a sensitivity correction parameter according to the distance, the sensitivity correction parameter being used to adjust the sensitivity of the trigger area.
The speed determination module 1820 is further configured to determine the first movement speed of the virtual object according to the sensitivity correction parameter and the sensitivity of the first trigger area.
In some embodiments, at least two of the multiple trigger areas share an overlapping area.
In some embodiments, as shown in FIG. 20, the apparatus further includes a starting position obtaining module 1840, a distance determination module 1850, and a trigger area determination module 1860.
The starting position obtaining module 1840 is configured to obtain the starting position of the first sliding operation when the first sliding operation is detected.
The distance determination module 1850 is configured to determine the distance between the starting position of the first sliding operation and the reference point of each trigger area, where the reference points of the trigger areas are at mutually different positions.
The trigger area determination module 1860 is configured to determine, from the multiple trigger areas, the trigger area with the smallest distance as the first trigger area.
In some embodiments, as shown in FIG. 20, the apparatus further includes a data obtaining module 1870 and a prompt information display module 1880.
The data obtaining module 1870 is configured to obtain real-time competitive data related to the virtual object, the real-time competitive data including at least one of the following: real-time attribute data of the virtual object, real-time environment data of the virtual object, real-time equipment data of the virtual object, and the real-time distance between the current position and a desired position of the virtual object.
The trigger area determination module 1860 is further configured to determine a recommended trigger area from the multiple trigger areas according to the real-time competitive data.
The prompt information display module 1880 is configured to display prompt information corresponding to the recommended trigger area.
In some embodiments, the trigger area determination module 1860 is further configured to process the real-time competitive data through a speed prediction model to predict the desired movement speed of the virtual object, where the speed prediction model is a machine learning model built based on a neural network.
The trigger area determination module 1860 is further configured to determine the recommended trigger area from the multiple trigger areas according to the desired movement speed.
In some embodiments, the positions of the multiple trigger areas are arranged in order of increasing or decreasing sensitivity of the multiple trigger areas.
In some embodiments, the joystick control has multiple direction intervals, different direction intervals corresponding to different movement directions.
In some embodiments, as shown in FIG. 20, the apparatus further includes an interval determination module 1890.
The interval determination module 1890 is configured to determine, from the multiple direction intervals, the first direction interval to which the real-time direction belongs, according to the real-time direction of the real-time position of the first sliding operation relative to the joystick control.
The movement control module 1830 is further configured to control the virtual object to move in the movement direction corresponding to the first direction interval.
In some embodiments, the movement control module 1830 is further configured to, when the direction interval to which the real-time direction belongs changes from the first direction interval to a second direction interval, control the movement direction of the virtual object to change gradually, within a first time period, from the movement direction corresponding to the first direction interval to the movement direction corresponding to the second direction interval, where the second direction interval is a direction interval adjacent to the first direction interval.
In some embodiments, as shown in FIG. 20, the apparatus further includes a distance obtaining module 1892 and a control display module 1894.
The distance obtaining module 1892 is configured to obtain the distance between the real-time position of the first sliding operation and the starting position of the first sliding operation.
The control display module 1894 is configured to display an automatic movement control when the distance is greater than or equal to a first threshold, the automatic movement control being used to trigger the virtual object to run automatically.
The control display module 1894 is further configured to, when the automatic movement control is being displayed, cancel display of the automatic movement control if the distance is less than or equal to a second threshold, where the second threshold is less than the first threshold.
In some embodiments, the interface display module 1810 is configured to display a user interface, the user interface displaying a joystick control for controlling movement of a virtual object, the joystick control having multiple trigger areas.
The movement control module 1830 is configured to, in response to a first sliding operation whose starting position is located in a first trigger area, control the virtual object to move at a first movement speed.
The movement control module 1830 is configured to, in response to a second sliding operation whose starting position is located in a second trigger area, control the virtual object to move at a second movement speed, where the first trigger area and the second trigger area are each one of the multiple trigger areas, the first trigger area differs from the second trigger area, and the first movement speed differs from the second movement speed.
In some embodiments, as shown in FIG. 21, the apparatus further includes a joystick control display module 2040.
The joystick control display module 2040 is configured to display the joystick control at the starting position of the first sliding operation in response to the first sliding operation.
In some embodiments, as shown in FIG. 21, the apparatus further includes a range frame display module 2050, a range frame adjustment module 2060, and a trigger area setting module 2070.
The range frame display module 2050 is configured to display the range frames corresponding to the multiple trigger areas in response to a setting operation for the trigger areas.
The range frame display module 2050 is further configured to cancel display of the range frame corresponding to a target trigger area among the multiple trigger areas in response to a deletion operation for the target trigger area.
The range frame adjustment module 2060 is configured to adjust at least one of the size and position of the range frame corresponding to a target trigger area among the multiple trigger areas in response to an adjustment operation for the target trigger area.
The range frame display module 2050 is further configured to display the range frame corresponding to a newly added trigger area in response to an addition operation for a trigger area.
The trigger area setting module 2070 is configured to, in response to a setting-completion operation for the trigger areas, set the multiple trigger areas corresponding to the joystick control according to the size and position of the range frame corresponding to each currently displayed trigger area.
In some embodiments, no overlapping area exists between any two of the multiple trigger areas; or, at least two of the multiple trigger areas share an overlapping area.
In some embodiments, as shown in FIG. 21, the apparatus further includes a prompt information display module 2080.
The prompt information display module 2080 is configured to display the joystick control in the recommended trigger area among the multiple trigger areas; or, the prompt information display module 2080 is configured to display the recommended trigger area among the multiple trigger areas in a manner that distinguishes it from the other trigger areas.
It should be noted that, when the apparatus provided by the above embodiments implements its functions, the division into the above functional modules is only used as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus provided by the above embodiments belongs to the same concept as the method embodiments; see the method embodiments for the specific implementation process, which is not repeated here.
Referring to FIG. 22, which shows a structural block diagram of a terminal device 2100 provided by one embodiment of this application. The terminal device 2100 may be the terminal device 10 in the implementation environment shown in FIG. 1, and is used to implement the virtual object control method provided in the above embodiments. Specifically:
Generally, the terminal device 2100 includes: a processor 2101 and a memory 2102.
The processor 2101 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 2101 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), and PLA (Programmable Logic Array).
The memory 2102 may include one or more computer-readable storage media, which may be non-transitory. The memory 2102 may also include high-speed random access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 2102 is used to store a computer program configured to be executed by one or more processors to implement the above virtual object control method.
In some embodiments, the terminal device 2100 optionally further includes: a peripheral device interface 2103 and at least one peripheral device. The processor 2101, the memory 2102, and the peripheral device interface 2103 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 2103 by a bus, signal line, or circuit board. Specifically, the peripheral devices include at least one of: a radio frequency circuit 2104, a display screen 2105, an audio circuit 2107, and a power supply 2108.
Those skilled in the art will understand that the structure shown in FIG. 22 does not constitute a limitation on the terminal device 2100, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
In an exemplary embodiment, a computer-readable storage medium is also provided, the storage medium storing a computer program that, when executed by a processor, implements the above virtual object control method.
Optionally, the computer-readable storage medium may include: ROM (Read-Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), an optical disc, or the like. The random access memory may include ReRAM (Resistance Random Access Memory) and DRAM (Dynamic Random Access Memory).
In an exemplary embodiment, a computer program product is also provided, the computer program product including computer instructions stored in a computer-readable storage medium. A processor of a terminal device reads the computer instructions from the computer-readable storage medium and executes them, causing the terminal device to perform the above virtual object control method.

Claims (20)

  1. A method for controlling a virtual object, the method being performed by a terminal device, and the method comprising:
    displaying a user interface, wherein a joystick control for controlling movement of a virtual object is displayed in the user interface, the joystick control has a plurality of trigger areas, and different trigger areas correspond to different sensitivities;
    in response to a first sliding operation whose start position is located in a first trigger area, determining a first movement speed of the virtual object according to a sensitivity of the first trigger area, the first trigger area being one of the plurality of trigger areas; and
    controlling the virtual object to move at the first movement speed.
  2. The method according to claim 1, wherein the determining a first movement speed of the virtual object according to a sensitivity of the first trigger area comprises:
    determining the first movement speed of the virtual object according to the sensitivity of the first trigger area and attribute information of the first sliding operation.
  3. The method according to claim 2, wherein the attribute information comprises a distance between a real-time position of the first sliding operation and the start position of the first sliding operation; and
    the determining the first movement speed of the virtual object according to the sensitivity of the first trigger area and attribute information of the first sliding operation comprises:
    determining a sensitivity correction parameter according to the distance, the sensitivity correction parameter being used to adjust the sensitivity of the trigger area; and
    determining the first movement speed of the virtual object according to the sensitivity correction parameter and the sensitivity of the first trigger area.
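Purely as an illustration of claim 3 (not a definitive implementation), the correction parameter can be derived from the slide distance and then combined with the trigger area's sensitivity. The linear, clamped correction and every name and default value here are assumptions; the claim does not fix a formula:

```python
def first_movement_speed(base_speed, sensitivity, distance, max_distance=100.0):
    """Combine the trigger area's sensitivity with a distance-based
    correction parameter to obtain the first movement speed.
    `max_distance` is a hypothetical clamp on the slide distance."""
    correction = min(distance, max_distance) / max_distance  # parameter in [0, 1]
    return base_speed * sensitivity * correction
```

Under this sketch, a longer slide inside a high-sensitivity trigger area yields a faster speed than the same slide inside a low-sensitivity area.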
  4. The method according to claim 1, wherein an overlapping area exists between at least two of the plurality of trigger areas, and the method further comprises:
    acquiring the start position of the first sliding operation when the first sliding operation is detected;
    determining gaps between the start position of the first sliding operation and reference points of the respective trigger areas, wherein the positions of the reference points of the respective trigger areas are different from one another; and
    determining, from the plurality of trigger areas, the trigger area with the smallest gap as the first trigger area.
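The smallest-gap selection of claim 4 can be sketched as follows; the mapping from area identifiers to reference points, and the Euclidean gap, are assumptions made for illustration:

```python
import math

def pick_first_trigger_area(start_pos, trigger_areas):
    """Claim 4 sketch: when trigger areas may overlap, resolve the slide's
    start position to the trigger area whose reference point is nearest.
    `trigger_areas` maps a hypothetical area id to its reference point (x, y)."""
    def gap(area_id):
        rx, ry = trigger_areas[area_id]
        return math.hypot(start_pos[0] - rx, start_pos[1] - ry)
    return min(trigger_areas, key=gap)
```

Because every reference point is distinct, a start position inside an overlap region still resolves to exactly one trigger area.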
  5. The method according to claim 1, further comprising:
    acquiring real-time competition data related to the virtual object, the real-time competition data comprising at least one of the following: real-time attribute data of the virtual object, real-time environment data of the virtual object, real-time equipment data of the virtual object, and a real-time distance between a current position and an expected position of the virtual object;
    determining a recommended trigger area from the plurality of trigger areas according to the real-time competition data; and
    displaying prompt information corresponding to the recommended trigger area.
  6. The method according to claim 5, wherein the determining a recommended trigger area from the plurality of trigger areas according to the real-time competition data comprises:
    processing the real-time competition data through a speed prediction model to predict an expected movement speed of the virtual object, wherein the speed prediction model is a machine learning model constructed based on a neural network; and
    determining the recommended trigger area from the plurality of trigger areas according to the expected movement speed.
  7. The method according to claim 1, wherein the positions of the plurality of trigger areas are arranged in an order in which the sensitivities respectively corresponding to the plurality of trigger areas increase or decrease.
  8. The method according to claim 1, wherein the joystick control has a plurality of direction intervals, and different direction intervals correspond to different movement directions; and
    the method further comprises:
    determining, from the plurality of direction intervals, a first direction interval to which a real-time direction belongs according to the real-time direction of the real-time position of the first sliding operation relative to the joystick control; and
    controlling the virtual object to move in a movement direction corresponding to the first direction interval.
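The mapping from a real-time direction to a direction interval in claim 8 can be sketched as quantising the slide angle into equal sectors around the joystick; the count of eight sectors, the degree representation, and the function name are all illustrative assumptions, since the claim leaves the partition unspecified:

```python
def direction_interval(angle_deg, num_intervals=8):
    """Claim 8 sketch: quantise the real-time direction of the slide relative
    to the joystick control into one of `num_intervals` equal sectors,
    with interval 0 centred on 0 degrees."""
    width = 360.0 / num_intervals
    return int(((angle_deg % 360.0) + width / 2.0) // width) % num_intervals
```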
  9. The method according to claim 8, further comprising, after the controlling the virtual object to move in a movement direction corresponding to the first direction interval:
    in a case where the direction interval to which the real-time direction belongs changes from the first direction interval to a second direction interval, controlling the movement direction of the virtual object to change gradually, within a first duration, from the movement direction corresponding to the first direction interval to a movement direction corresponding to the second direction interval,
    wherein the second direction interval is a direction interval adjacent to the first direction interval.
  10. The method according to claim 1, further comprising:
    acquiring a distance between a real-time position of the first sliding operation and the start position of the first sliding operation;
    displaying an automatic movement control in a case where the distance is greater than or equal to a first threshold, the automatic movement control being used to trigger the virtual object to run automatically; and
    canceling display of the automatic movement control in a case where the automatic movement control is in a displayed state and the distance is less than or equal to a second threshold, wherein the second threshold is less than the first threshold.
  11. A method for controlling a virtual object, the method being performed by a terminal device, and the method comprising:
    displaying a user interface, wherein a joystick control for controlling movement of a virtual object is displayed in the user interface, and the joystick control has a plurality of trigger areas;
    in response to a first sliding operation whose start position is located in a first trigger area, controlling the virtual object to move at a first movement speed; and
    in response to a second sliding operation whose start position is located in a second trigger area, controlling the virtual object to move at a second movement speed,
    wherein the first trigger area and the second trigger area are each one of the plurality of trigger areas, the first trigger area and the second trigger area are different, and the first movement speed and the second movement speed are different.
  12. The method according to claim 11, further comprising:
    displaying the joystick control at the start position of the first sliding operation in response to the first sliding operation.
  13. The method according to claim 11, further comprising:
    displaying, in response to a setting operation for the trigger areas, range boxes respectively corresponding to the plurality of trigger areas;
    canceling, in response to a deletion operation for a target trigger area among the plurality of trigger areas, display of the range box corresponding to the target trigger area; or adjusting, in response to an adjustment operation for a target trigger area among the plurality of trigger areas, at least one of the size and the position of the range box corresponding to the target trigger area; or displaying, in response to an addition operation for the trigger areas, a range box corresponding to a newly added trigger area; and
    setting, in response to a setting completion operation for the trigger areas, the plurality of trigger areas corresponding to the joystick control according to the sizes and positions of the currently displayed range boxes corresponding to the respective trigger areas.
  14. The method according to claim 11, wherein
    no overlapping area exists between any two of the plurality of trigger areas;
    or
    an overlapping area exists between at least two of the plurality of trigger areas.
  15. The method according to any one of claims 11 to 14, further comprising:
    displaying the joystick control in a recommended trigger area among the plurality of trigger areas;
    or
    displaying the recommended trigger area among the plurality of trigger areas in a manner distinguished from the other trigger areas.
  16. An apparatus for controlling a virtual object, the apparatus comprising:
    an interface display module, configured to display a user interface, wherein a joystick control for controlling movement of a virtual object is displayed in the user interface, the joystick control has a plurality of trigger areas, and different trigger areas correspond to different sensitivities;
    a speed determination module, configured to determine, in response to a first sliding operation whose start position is located in a first trigger area, a first movement speed of the virtual object according to a sensitivity of the first trigger area, the first trigger area being one of the plurality of trigger areas; and
    a movement control module, configured to control the virtual object to move at the first movement speed.
  17. An apparatus for controlling a virtual object, the apparatus comprising:
    an interface display module, configured to display a user interface, wherein a joystick control for controlling movement of a virtual object is displayed in the user interface, and the joystick control has a plurality of trigger areas; and
    a movement control module, configured to control, in response to a first sliding operation whose start position is located in a first trigger area, the virtual object to move at a first movement speed,
    the movement control module being further configured to control, in response to a second sliding operation whose start position is located in a second trigger area, the virtual object to move at a second movement speed,
    wherein the first trigger area and the second trigger area are each one of the plurality of trigger areas, the first trigger area and the second trigger area are different, and the first movement speed and the second movement speed are different.
  18. A terminal device, comprising a processor and a memory, wherein the memory stores a computer program, and the computer program is loaded and executed by the processor to implement the method according to any one of claims 1 to 10, or to implement the method according to any one of claims 11 to 15.
  19. A computer-readable storage medium, storing a computer program, wherein the computer program is loaded and executed by a processor to implement the method according to any one of claims 1 to 10, or to implement the method according to any one of claims 11 to 15.
  20. A computer program product, comprising computer instructions stored in a computer-readable storage medium, wherein a processor reads the computer instructions from the computer-readable storage medium and executes them to implement the method according to any one of claims 1 to 10, or to implement the method according to any one of claims 11 to 15.
PCT/CN2023/091178 2022-07-12 2023-04-27 Method and apparatus for controlling virtual object, device, storage medium, and program product WO2024012010A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210822326.7A CN117427332A (zh) 2022-07-12 2022-07-12 Method and apparatus for controlling virtual object, device, storage medium, and program product
CN202210822326.7 2022-07-12

Publications (1)

Publication Number Publication Date
WO2024012010A1 true WO2024012010A1 (zh) 2024-01-18

Family

ID=89535396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/091178 WO2024012010A1 (zh) 2022-07-12 2023-04-27 虚拟对象的控制方法、装置、设备、存储介质及程序产品

Country Status (2)

Country Link
CN (1) CN117427332A (zh)
WO (1) WO2024012010A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108379844A (zh) * 2018-03-30 2018-08-10 Tencent Technology (Shenzhen) Co., Ltd. Method and apparatus for controlling movement of virtual object, electronic apparatus, and storage medium
CN108509139A (zh) * 2018-03-30 2018-09-07 Tencent Technology (Shenzhen) Co., Ltd. Movement control method and apparatus for virtual object, electronic apparatus, and storage medium
CN111111190A (zh) * 2019-12-17 2020-05-08 NetEase (Hangzhou) Network Co., Ltd. Interaction method and apparatus for virtual character in game, and touch terminal
CN111330272A (zh) * 2020-02-14 2020-06-26 Tencent Technology (Shenzhen) Co., Ltd. Virtual object control method and apparatus, terminal, and storage medium
JP2020178822A (ja) * 2019-04-24 2020-11-05 Capcom Co., Ltd. Game program and game system
WO2020244421A1 (zh) * 2019-06-05 2020-12-10 Tencent Technology (Shenzhen) Co., Ltd. Movement control method and apparatus for virtual object, terminal, and storage medium
CN113181651A (zh) * 2021-04-06 2021-07-30 NetEase (Hangzhou) Network Co., Ltd. Method and apparatus for controlling movement of virtual object in game, electronic device, and storage medium
CN113398573A (zh) * 2021-06-29 2021-09-17 NetEase (Hangzhou) Network Co., Ltd. Displacement control method and apparatus for virtual character

Also Published As

Publication number Publication date
CN117427332A (zh) 2024-01-23

Similar Documents

Publication Publication Date Title
JP5887458B1 (ja) Game system and the like that performs path finding for a non-player character based on a player's movement history
JP7350088B2 (ja) Virtual object control method, apparatus, device, and computer program
JP7331124B2 (ja) Virtual object control method, apparatus, terminal, and storage medium
EP3970819B1 (en) Interface display method and apparatus, and terminal and storage medium
KR20140006642A (ko) Method for processing user gesture input in online game
JP2023552772A (ja) Virtual item switching method, apparatus, terminal, and computer program
JP7137719B2 (ja) Virtual object selection method, apparatus, terminal, and program
CN113633975B (zh) Method, apparatus, terminal, and storage medium for displaying virtual environment picture
JP2022533920A (ja) Virtual object selection method, apparatus, device, and computer program
WO2023029828A1 (zh) Virtual object control method and apparatus, terminal, and storage medium
US20230330543A1 (en) Card casting method and apparatus, device, storage medium, and program product
JP7343355B2 (ja) Game control method and apparatus
WO2024012010A1 (zh) Method and apparatus for controlling virtual object, device, storage medium, and program product
WO2023071808A1 (zh) Virtual scene-based graphic display method and apparatus, device, and medium
WO2023061133A1 (zh) Virtual scene display method and apparatus, terminal, and storage medium
JP6953650B1 (ja) Game system, program, and information processing method
CN111643895A (zh) Operation response method and apparatus, terminal, and storage medium
Torok et al. Smart controller: Introducing a dynamic interface adapted to the gameplay
US20220054944A1 (en) Virtual object control method and apparatus, terminal, and storage medium
JP7146052B1 (ja) Game system, game program, and information processing method
US12011662B2 (en) Interface display method, apparatus, terminal, and storage medium
JP6922111B1 (ja) Game system, program, and information processing method
CN117753007A (zh) Interaction processing method and apparatus for virtual scene, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23838509

Country of ref document: EP

Kind code of ref document: A1