CN113509729B - Virtual prop control method and device, computer equipment and storage medium


Info

Publication number
CN113509729B
CN113509729B
Authority
CN
China
Prior art keywords
target
virtual
virtual prop
prop
target position
Prior art date
Legal status
Active
Application number
CN202110553093.0A
Other languages
Chinese (zh)
Other versions
CN113509729A (en)
Inventor
潘科宇
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110553093.0A
Publication of CN113509729A
Application granted
Publication of CN113509729B


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A63F 13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F 13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game, using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a virtual prop control method and device, a computer device, and a storage medium, and belongs to the field of computer technologies. In the method, a touch operation performed by the user on the virtual scene is detected while the virtual prop is in motion, and the motion track of the virtual prop is adjusted based on the operation position of the touch operation, so that the virtual prop moves toward the touch point. The motion track of the virtual prop is thus adjusted in real time in response to user operations, allowing the user to control it flexibly, which raises the rate at which the virtual prop hits its target and improves human-computer interaction efficiency.

Description

Virtual prop control method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and apparatus for controlling a virtual prop, a computer device, and a storage medium.
Background
With the development of computer technology, more and more online games have emerged. In some of these games, a user can use a virtual prop to interact with a virtual object in a virtual scene; for example, the user can release the virtual prop at a virtual object to attack it. In existing virtual prop control methods, the user determines the motion track of the virtual prop before releasing it, and after release the prop moves through the virtual scene along that track: if a virtual object lies on the track, the prop hits it; otherwise, it misses. Under this scheme the prop is difficult to control, the probability of the prop hitting its target is low, and human-computer interaction efficiency is poor.
Disclosure of Invention
The embodiments of the application provide a virtual prop control method and device, a computer device, and a storage medium, which can reduce the difficulty of controlling a virtual prop, raise the probability that the prop hits its target, and improve human-computer interaction efficiency. The technical scheme is as follows:
in one aspect, a method for controlling a virtual prop is provided, the method comprising:
displaying a virtual scene;
in response to a triggering operation on the virtual prop, controlling the virtual prop to move in the virtual scene along a target track, the target track being the original motion track of the virtual prop;
and during the movement of the virtual prop along the target track, in response to a touch operation on a target position in the virtual scene, controlling the virtual prop to move toward the target position.
In one aspect, a control device for a virtual prop is provided, the device comprising:
the display module is used for displaying the virtual scene;
the control module is used for controlling the virtual prop, in response to a triggering operation on the virtual prop, to move in the virtual scene along a target track, the target track being the original motion track of the virtual prop; and, during the movement of the virtual prop along the target track, controlling the virtual prop to move toward a target position in response to a touch operation on that target position in the virtual scene.
In one possible implementation, the control module is configured to:
during the movement of the virtual prop according to the target track, in response to detection of touch operation on any position in the virtual scene, detecting the touch operation on the virtual scene at a first reference frequency, and determining the operation position of the detected touch operation as the target position;
and controlling the virtual prop to move towards the target position.
In one possible implementation, the control module is configured to:
in response to a touch operation on a target position in the virtual scene, the target position being within a target area, controlling the virtual prop to move toward the target position.
In one possible implementation, the control module includes:
the determining submodule is used for responding to touch operation on a target position in the virtual scene during the movement of the virtual prop according to the target track, and determining the movement speed of the virtual prop;
and the control sub-module is used for controlling the virtual prop to move towards the target position according to the movement speed.
In one possible implementation, the determining submodule includes:
the device comprises a force determination unit, a force detection unit and a force detection unit, wherein the force determination unit is used for determining a target force in response to touch operation on a target position in the virtual scene, the target force is used for representing attraction effect of the target position on the virtual prop, and the movement speed of the virtual prop is positively related to the force of the target force;
A speed determination unit for determining the movement speed based on the target acting force.
In one possible implementation, the force determination unit is configured to:
in response to a touch operation on a target position in the virtual scene, obtaining limitation information on the acting direction of the force and strength information of the force;
determining at least two first reference positions in the virtual scene, centered on the target position, based on the target position and the limitation information on the acting direction, the at least two first reference positions indicating the acting direction of the target force;
and determining the strength of the target force based on the strength information of the force.
In one possible implementation, the control module is configured to:
applying the target force to the virtual prop;
determining a second reference position among the at least two first reference positions according to a second reference frequency during the action of the target force;
and adjusting the acting direction of the target force based on the second reference position, and controlling the virtual prop to move under the action of the target force.
In one possible implementation, the strength of the target force is positively correlated with the distance between the virtual prop and the target location;
The control submodule is used for:
adjusting the strength of the target acting force based on the distance between the virtual prop and the target position during the movement of the virtual prop to the target position;
and controlling the virtual prop to move towards the target position according to the movement speed indicated by the target acting force.
In one possible implementation, the determining submodule is configured to:
detecting a pressing force of a touch operation in response to the touch operation on a target position in the virtual scene;
based on the pressing force, determining a movement speed of the virtual prop, wherein the movement speed is positively correlated with the pressing force.
In one possible implementation, the apparatus further includes:
the determining module is used for determining the current movement direction and movement speed of the virtual prop in response to the end of the touch operation on the virtual scene;
the control module is used for controlling the virtual prop to move according to the current movement direction and movement speed.
In one aspect, a computer device is provided that includes one or more processors and one or more memories in which at least one computer program is stored, the at least one computer program being loaded and executed by the one or more processors to implement the operations performed in the virtual prop control method.
In one aspect, a computer-readable storage medium is provided in which at least one computer program is stored, the at least one computer program being loaded and executed by a processor to implement the operations performed in the virtual prop control method.
In one aspect, a computer program product is provided that includes at least one computer program stored in a computer-readable storage medium. The processor of a computer device reads the at least one computer program from the computer-readable storage medium and executes it, so that the computer device performs the virtual prop control method.
According to the technical scheme provided by the embodiments of the application, a touch operation performed by the user on the virtual scene is detected while the virtual prop is in motion, and the motion track of the virtual prop is adjusted based on the operation position of the touch operation, so that the virtual prop moves toward the touch point. The motion track is thus adjusted in real time in response to user operations, allowing the user to control it flexibly, which raises the rate at which the virtual prop hits its target and improves human-computer interaction efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an implementation environment of a method for controlling a virtual prop according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for controlling a virtual prop provided by an embodiment of the present application;
FIG. 3 is a flowchart of a method for controlling a virtual prop provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of applying a target force to a virtual prop according to an embodiment of the present application;
FIG. 5 is a schematic diagram of configuration information of a virtual prop according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an effect of adjusting a motion trail of a virtual prop according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a method for controlling a virtual prop according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a control device for virtual props according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of promoting an understanding of the principles and advantages of the application, reference will now be made to embodiments of the application, some but not all of which are illustrated in the accompanying drawings. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
The terms "first", "second", and the like in this application are used to distinguish between identical or similar items whose functions and purposes are essentially the same. It should be understood that there is no logical or chronological dependency between "first", "second", and "nth", and no limitation on their number or order of execution.
In order to facilitate understanding of the technical process of the embodiments of the present application, some terms related to the embodiments of the present application are explained below:
Virtual scene: a scene that an application displays (or provides) while running on a terminal. Illustratively, the virtual scene is a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. Optionally, the virtual scene is any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene, which is not limited in the present application. For example, a virtual scene includes sky, land, sea, and so on; the land includes environmental elements such as deserts and cities, and the user can control a virtual object to move in the virtual scene.
Virtual object: refers to a character that can move in a virtual scene, e.g., a virtual character, a virtual animal, or a cartoon character. Illustratively, the virtual object is an avatar in the virtual scene that represents the user. In one possible implementation, a virtual scene includes a plurality of virtual objects, each with its own shape and volume, occupying a portion of the space in the virtual scene. Optionally, the virtual object is a character controlled through operations on a client, an artificial intelligence (AI) trained and deployed in the virtual environment battle, or a non-player character (NPC) placed in the virtual scene battle. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in the battle is preset, or determined dynamically from the number of clients joining the battle, which is not limited by the embodiment of the present application. In one possible implementation, the user controls a virtual object to move in the virtual scene, e.g., to run, jump, or crawl, and optionally controls it to interact with other virtual objects using skills, virtual props, and the like provided by the application.
Virtual object: the virtual object includes a virtual carrier, a virtual building, a virtual plant, a virtual ornament, and the like, and the type of the virtual object is not limited in the embodiment of the present application.
Virtual prop: refers to a prop that can interact with virtual objects and virtual items in a virtual scene. In some embodiments, when the virtual prop acts on a virtual object or virtual item, it can affect that object's or item's attribute values. For example, the virtual prop is a virtual bullet, a virtual bomb, or the like; when it hits a virtual object or virtual item, it deals damage, reducing the attribute values of what it hits. It should be noted that the embodiment of the present application does not limit the type of the virtual prop.
Fig. 1 is a schematic diagram of an implementation environment of a method for controlling a virtual prop according to an embodiment of the present application. Referring to fig. 1, the implementation environment includes a first terminal 110 and a server 140.
The first terminal 110 installs and runs an application that supports displaying virtual scenes. The application is illustratively any one of a virtual reality application, a three-dimensional map application, a military simulation application, a Role-Playing Game (RPG), a Multiplayer Online Battle Arena (MOBA) game, or a multiplayer gunfight survival game. The first terminal 110 is a terminal used by a first user; a user account of the first user is logged in to the application run by the first terminal 110, and the first user uses the first terminal 110 to control the virtual prop to interact with virtual objects and virtual items in the virtual scene. In some embodiments, the first user can also use the first terminal 110 to control a first virtual object to act in the virtual scene; for example, the first user controls the first virtual object to use the virtual prop through the first terminal 110. Illustratively, the first virtual object is a first virtual character, such as a simulated human character or a cartoon character.
The first terminal 110 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a single server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 140 provides background services for applications that support virtual scene display. Optionally, the server 140 undertakes the primary computing work and the first terminal 110 the secondary computing work; or the server 140 undertakes the secondary computing work and the first terminal 110 the primary computing work; or the server 140 and the first terminal 110 compute cooperatively using a distributed computing architecture.
In some embodiments, the implementation environment further includes a second terminal 160, which installs and runs an application supporting virtual scene display, illustratively any one of a virtual reality application, a three-dimensional map program, a military simulation program, a Role-Playing Game (RPG), a Multiplayer Online Battle Arena (MOBA) game, or a multiplayer gunfight survival game. The second terminal 160 is a terminal used by a second user; a user account of the second user is logged in to the application run by the second terminal 160, and the second user uses the second terminal 160 to control the virtual prop to interact with virtual objects and virtual items in the virtual scene. In some embodiments, the second user can also use the second terminal 160 to control a second virtual object to act in the virtual scene; for example, the second user controls the second virtual object to use the virtual prop through the second terminal 160. Illustratively, the second virtual object is a second virtual character, such as a simulated human character or a cartoon character. The second terminal 160 is connected to the server 140 through a wireless network or a wired network.
Optionally, the first virtual object controlled by the first terminal 110 and the second virtual object controlled by the second terminal 160 are in the same virtual scene, where the first virtual object interacts with the second virtual object in the virtual scene. Alternatively, the applications installed on the first terminal 110 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application of different operating system platforms. In the embodiment of the present application, the first terminal 110 refers generally to one of a plurality of terminals, and the second terminal 160 refers generally to one of a plurality of terminals, and this embodiment is exemplified only by the first terminal 110 and the second terminal 160. The device types of the first terminal 110 and the second terminal 160 are the same or different, and include: at least one of a smart phone, a tablet computer, an electronic book reader, an MP3 (Moving Picture Experts Group Audio Layer III, moving picture experts compression standard audio layer 3) player, an MP4 (Moving Picture Experts Group Audio Layer IV, moving picture experts compression standard audio layer 4) player, a laptop portable computer, and a desktop computer. For example, the first terminal 110 and the second terminal 160 are smart phones, or other handheld portable gaming devices. The following embodiments are illustrated with the terminal comprising a smart phone.
Those skilled in the art will recognize that the number of terminals may be greater or smaller. For example, there may be only one terminal, or tens or hundreds of terminals, or more. The embodiment of the application does not limit the number or device types of the terminals.
The virtual prop control method provided by the embodiments of the application can be applied to a variety of applications. In the embodiments of the application, the user can flexibly control the motion track of the virtual prop so that the prop follows the user's touch position in the scene, which reduces the difficulty of controlling the prop, raises the probability that it hits its target, and improves human-computer interaction efficiency.
Fig. 2 is a flowchart of a method for controlling a virtual prop according to an embodiment of the present application. The method is applied to any terminal in the implementation environment above; in the embodiment of the present application, the method is described with the terminal as the execution body. Referring to fig. 2, the embodiment includes the following steps:
201. the terminal displays the virtual scene.
In one possible implementation, the terminal displays a virtual scene of the current game match, in which at least one of a virtual object and a virtual item is displayed, and the user can use the terminal to control the virtual prop to interact with virtual objects and virtual items. Optionally, at least one virtual prop selection control is also displayed in the virtual scene. It should be noted that the embodiment of the present application does not limit the content displayed in the virtual scene.
202. The terminal responds to the triggering operation of the virtual prop and controls the virtual prop to move in the virtual scene according to a target track, wherein the target track is the original movement track of the virtual prop.
The virtual prop is illustratively a virtual bullet, a virtual bomb, or the like; once triggered, it moves through the virtual scene until it hits a virtual object or virtual item, at which point it stops moving and affects the attribute values of what it hit. The target track indicates the original motion track along which the virtual prop moves in the virtual scene, i.e., the track the prop follows after being triggered if the user does not adjust it. Illustratively, the shape of the original track is a parabola, a straight line, or the like; the embodiment of the application does not limit the motion track. In the embodiment of the application, the original motion track of the virtual prop is determined before the virtual prop is triggered or at the moment it is triggered; for example, the original motion track is preconfigured by a developer, or set by the user when triggering the virtual prop.
203. And the terminal responds to the touch operation on the target position in the virtual scene during the movement of the virtual prop according to the target track, and controls the movement of the virtual prop to the target position.
In one possible implementation, the terminal detects a touch operation performed by the user on the virtual scene, determines the operation position of the touch operation as the target position, adjusts the motion track of the virtual prop, and controls the prop to move toward the target position; that is, the motion track of the virtual prop is adjusted in real time based on user operations while the prop is in motion.
According to the technical scheme provided by the embodiments of the application, a touch operation performed by the user on the virtual scene is detected while the virtual prop is in motion, and the motion track of the virtual prop is adjusted based on the operation position of the touch operation, so that the virtual prop moves toward the touch point. The motion track is thus adjusted in real time in response to user operations, allowing the user to control it flexibly, which raises the rate at which the virtual prop hits its target and improves human-computer interaction efficiency.
The above embodiment is a brief description of a method for controlling a virtual prop provided by the present application, and the method is described below with reference to fig. 3. Fig. 3 is a flowchart of a method for controlling a virtual prop according to an embodiment of the present application, where the method is applied in the implementation environment shown in fig. 1, and referring to fig. 3, in one possible implementation manner, the method includes the following steps:
301. The terminal displays the virtual scene.
In one possible implementation, in response to the user starting a game match, the terminal displays the operation interface corresponding to the match, in which a virtual scene is displayed. Optionally, the virtual scene displays a first virtual object controlled by the user, and the user controls the first virtual object to move in the virtual scene, e.g., to interact with other virtual objects using a virtual prop; of course, in some embodiments the first virtual object controlled by the user is not displayed in the virtual scene, which is not limited by the embodiment of the present application. Optionally, virtual items, e.g., virtual plants and virtual buildings, are displayed in the virtual scene. Optionally, a non-player character (NPC) in the match is displayed in the virtual scene; optionally, a second virtual object controlled by another user is displayed in the virtual scene; optionally, at least one selection control corresponding to a virtual prop is displayed in the virtual scene, the selection control providing a trigger function for the corresponding virtual prop. In some embodiments, virtual objects and virtual items have attribute values, and the terminal displays the attribute values corresponding to each virtual object and virtual item in the virtual scene. It should be noted that the above description of the virtual scene is merely an exemplary illustration of one possible implementation, and the embodiment of the present application does not limit what is displayed in the virtual scene.
302. And the terminal responds to the triggering operation of the virtual prop and controls the virtual prop to move in the virtual scene according to the target track.
In one possible implementation manner, a selection control corresponding to the virtual prop is displayed in the virtual scene, and the triggering operation is a clicking operation or a long-press operation on the selection control. In one possible implementation, different virtual props correspond to different shortcuts, and the triggering operation is a clicking operation, a long-press operation, or the like, on the shortcut corresponding to the virtual prop. In one possible implementation, different virtual props correspond to different gesture operations, the triggering operation is a target gesture operation on the operation interface, and the target gesture operation is, for example, a sliding operation according to a reference movement track on the operation interface, and the terminal detects that the sliding operation of the user on the operation interface accords with the reference movement track and triggers the virtual props to move in the virtual scene according to the target track. The reference movement track is set by a developer, which is not limited in the embodiment of the present application. It should be noted that the foregoing description of the implementation manner of the triggering operation is merely illustrative of several possible implementation manners, and the implementation manner of the triggering operation is not limited by the embodiment of the present application.
In one possible implementation, the terminal determines the target track corresponding to the virtual prop in response to the trigger operation on the virtual prop. The target track is the original motion track of the virtual prop. In one possible implementation, the target track is preset by a developer and stored in first configuration information corresponding to the virtual prop; in response to the trigger operation, the terminal obtains the first configuration information and reads the target track from it. In another possible implementation, the target track is determined based on user operations. Illustratively, when triggering the virtual prop the user adjusts its launch angle, and the terminal determines the target track based on that launch angle; optionally, when the user triggers the virtual prop, a reference line indicating the motion track of the prop is displayed in the virtual scene, the user adjusts the motion track by adjusting the reference line, and after being triggered the prop moves along the target track indicated by the reference line. It should be noted that the above description is merely an exemplary illustration of possible implementations, and the embodiment of the present application does not limit the method used to determine the target track. A minimal sketch of the launch-angle case follows.
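As a minimal sketch of the launch-angle case above, an original parabolic track can be sampled from a launch angle and speed. All names, the sampling scheme, and the constants are illustrative assumptions, not taken from the patent:

```python
import math

def parabolic_track(origin, launch_angle_deg, launch_speed,
                    gravity=9.8, steps=60, dt=0.05):
    """Sample the original motion track of a prop launched at a given angle.

    Hypothetical helper: the patent only says the target track may be read
    from configuration or derived from the launch angle set by the user.
    """
    angle = math.radians(launch_angle_deg)
    vx = launch_speed * math.cos(angle)   # horizontal velocity component
    vy = launch_speed * math.sin(angle)   # vertical velocity component
    return [(origin[0] + vx * t,
             origin[1] + vy * t - 0.5 * gravity * t * t)  # gravity bends the path
            for t in (i * dt for i in range(steps))]
```

A straight-line original track is simply the gravity=0 special case of the same sketch.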
303. And the terminal responds to the detection of touch operation on any position in the virtual scene during the movement of the virtual prop according to the target track, detects the touch operation on the virtual scene at a first reference frequency, and determines the operation position of the detected touch operation as the target position.
Optionally, the touch operation is a sustained touch on one position; for example, after the terminal detects that the user has touched a position in the virtual scene, the user keeps touching that position. Optionally, the touch operation is a sliding touch; for example, after touching a position in the virtual scene, the user slides across the scene starting from that position. The embodiment of the present application does not limit the touch operation. In the embodiment of the application, in response to detecting that the user has started touching the virtual scene, the terminal samples the touch operation at the first reference frequency and determines the target position from the sampled operation position. The first reference frequency is set by a developer, which is not limited by the embodiment of the present application.
In some embodiments, after detecting the target position of a touch operation, the terminal determines whether the target position lies within a target area of the virtual scene; the target area provides the function of triggering adjustment of the virtual prop's motion track. If the target position is within the target area, the terminal continues with step 304 below, i.e., controlling the virtual prop to move toward the target position; if not, the terminal does not execute the following steps, i.e., it does not trigger adjustment of the prop's motion path. The target area is set by a developer, which is not limited by the embodiment of the present application. In some embodiments, the terminal highlights the target area in the virtual scene, for example by displaying it in a reference color or framing it with a reference shape, so that the user can identify the target area accurately and perform touch operations in the correct region, making it easier to adjust the prop's motion track. The reference color and the reference shape are set by the developer, which is not limited by the embodiment of the present application. A sketch of the sampling and area check appears below.
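The sampling and area check might look like the following sketch. `get_touch_position` and `target_area_contains` are hypothetical engine callbacks, and the polling-generator structure is an assumption rather than the patent's implementation:

```python
import time

def sample_target_positions(get_touch_position, target_area_contains,
                            first_reference_frequency_hz=30.0):
    """Poll the touch point at the first reference frequency; yield only
    positions inside the target area, which are allowed to steer the prop."""
    interval = 1.0 / first_reference_frequency_hz
    while True:
        pos = get_touch_position()      # current touch point, or None
        if pos is None:                 # finger lifted: stop adjusting
            return
        if target_area_contains(pos):   # positions outside the area are ignored
            yield pos
        time.sleep(interval)
```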
304. And the terminal controls the virtual prop to move towards the target position.
In one possible implementation, during the movement of the virtual prop according to the target track, the terminal determines the movement speed of the virtual prop in response to a touch operation on a target position in the virtual scene, and controls the virtual prop to move towards the target position according to the movement speed.
In some embodiments, the virtual scene is treated as a virtual physical space and the virtual prop as a physical entity within it; the terminal controls the prop's movement by simulating forces applied to it in the virtual scene, i.e., applying forces of different directions and strengths makes the prop move in different directions at different speeds. In one possible implementation, the terminal determines a target force in response to the touch operation on the target position in the virtual scene, and determines the movement speed based on the target force. The target force represents the attraction of the target position on the virtual prop: its direction points toward the target position, so the prop moves toward that position under its action, i.e., the target position attracts the virtual prop. The movement speed of the prop is positively correlated with the strength of the target force: the larger the force applied to the prop, the faster it moves; the smaller the force, the slower it moves. Fig. 4 is a schematic diagram of applying a target force to a virtual prop according to an embodiment of the present application; as shown in fig. 4, the direction of the target force 401 points to the target position 402.
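A minimal sketch of this attraction, assuming semi-implicit Euler integration and a unit mass (neither is specified by the patent):

```python
import math

def attraction_step(prop_pos, prop_vel, target_pos, force_strength,
                    mass=1.0, dt=1.0 / 60.0):
    """Advance the prop one tick under a force aimed at the target position;
    a stronger force produces a higher movement speed."""
    dx = target_pos[0] - prop_pos[0]
    dy = target_pos[1] - prop_pos[1]
    dist = math.hypot(dx, dy) or 1e-6          # guard against zero distance
    ax = dx / dist * force_strength / mass     # acceleration = F / m,
    ay = dy / dist * force_strength / mass     # directed at the target
    vx = prop_vel[0] + ax * dt
    vy = prop_vel[1] + ay * dt
    new_pos = (prop_pos[0] + vx * dt, prop_pos[1] + vy * dt)
    return new_pos, (vx, vy)
```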
In one possible implementation, the target force is determined as follows: the terminal obtains limitation information on the acting direction of the force and strength information of the force; based on the target position and the limitation information on the acting direction, the terminal determines at least two first reference positions in the virtual scene centered on the target position, the at least two first reference positions indicating the acting direction of the target force. That is, the acting direction of the target force varies around the target position, which makes the effect produced by the force richer. The terminal determines the strength of the target force based on the strength information, and can further determine the movement speed corresponding to the target force based on the correspondence between force and speed. In one possible implementation, the limitation information on the acting direction and the strength information of the force are stored as configuration information; illustratively, in response to detecting the touch operation on the target position, the terminal obtains the configuration information corresponding to the virtual prop and reads the configuration parameters from it, i.e., the acting-direction limitation information and the force strength information. Fig. 5 is a schematic diagram of configuration information of a virtual prop according to an embodiment of the present application; as shown in fig. 5, the configuration information includes limitation information 501 on the acting direction and strength information 502 of the force. In one possible implementation, the first reference positions are determined based on four parameters, Circle Vec Low Limit, Circle Vec Upper Limit, Circle Angle Step Low Limit, and Circle Angle Step Upper Limit, which indicate how the first reference positions are distributed around the user's touch position. In one possible implementation, as shown in fig. 5, the strength information of the target force includes Max Force and Retian Force Ratio (force retention ratio; spelling as in the figure). It should be noted that in some embodiments the configuration information corresponding to the virtual prop further includes other information, illustratively the prop's maximum movement speed (Max Speed), speed retention ratio (Retian Speed Ratio), maximum shooting speed (Shoot Max Speed), and the like, which is not limited by the embodiments of the present application. A sketch of one possible reading of these parameters follows.
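The following sketch shows one possible reading of the FIG. 5 parameters. The exact semantics of the circle-vec and angle-step limits are not spelled out in the patent, so the radius-range and angular-step interpretation, and all default values, are assumptions:

```python
import math
import random
from dataclasses import dataclass

@dataclass
class PropForceConfig:
    # Field names mirror FIG. 5; values are illustrative defaults.
    circle_vec_low_limit: float = 0.5            # smallest offset radius (assumed)
    circle_vec_upper_limit: float = 2.0          # largest offset radius (assumed)
    circle_angle_step_low_limit: float = 15.0    # degrees (assumed)
    circle_angle_step_upper_limit: float = 45.0  # degrees (assumed)
    max_force: float = 100.0
    retian_force_ratio: float = 0.9              # spelling kept from the figure

def first_reference_positions(target_pos, cfg):
    """Scatter at least two reference positions around the target position;
    the force direction is later aimed at one of them, so it wobbles around
    the touch point instead of pointing straight at it."""
    positions, angle = [], 0.0
    while angle < 360.0:
        radius = random.uniform(cfg.circle_vec_low_limit,
                                cfg.circle_vec_upper_limit)
        rad = math.radians(angle)
        positions.append((target_pos[0] + radius * math.cos(rad),
                          target_pos[1] + radius * math.sin(rad)))
        angle += random.uniform(cfg.circle_angle_step_low_limit,
                                cfg.circle_angle_step_upper_limit)
    return positions
```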
In one possible implementation, the target force acts on the virtual prop as follows: the terminal applies the target force to the prop, and while the force is acting, selects a second reference position from the at least two first reference positions at the second reference frequency, adjusts the acting direction of the target force based on the second reference position, and controls the prop to move under the action of the force, i.e., the terminal moves the prop toward the target position at the speed corresponding to the target force. In some embodiments, the strength of the target force varies during the prop's movement. In one possible implementation, the strength of the target force is positively correlated with the distance between the virtual prop and the target position: the farther the prop is from the target position, the stronger the force; the closer it is, the weaker the force. It should be noted that in some embodiments the strength of the target force and the distance between the prop and the target position satisfy some other condition, for example a given function curve, which is not limited by the embodiments of the present application. While the prop moves toward the target position, the terminal adjusts the strength of the target force based on the distance between the prop and the target position, and controls the prop to move toward the target position at the movement speed indicated by the target force. A sketch of this per-tick update follows.
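A sketch of the per-tick update, under two stated assumptions: the second reference position is drawn at random from the first reference positions (the patent only says one is selected at the second reference frequency), and the force grows linearly with distance, capped at Max Force:

```python
import math
import random

def force_update(prop_pos, target_pos, first_refs, max_force,
                 distance_gain=1.0):
    """Re-aim and re-scale the target force for one tick."""
    second_ref = random.choice(first_refs)        # assumed selection rule
    dx = second_ref[0] - prop_pos[0]
    dy = second_ref[1] - prop_pos[1]
    norm = math.hypot(dx, dy) or 1e-6
    direction = (dx / norm, dy / norm)            # acting direction of the force
    distance = math.hypot(target_pos[0] - prop_pos[0],
                          target_pos[1] - prop_pos[1])
    strength = min(max_force, distance_gain * distance)  # farther => stronger
    return direction, strength
```

The direction and strength returned here can feed directly into an integration step like `attraction_step` above.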
In some embodiments, the movement speed of the virtual prop is determined by the pressure of the user's touch operation. In an exemplary embodiment, in response to the touch operation on the target position in the virtual scene, the terminal detects the pressing force of the touch operation and determines the prop's movement speed based on it, the movement speed being positively correlated with the pressing force. It should be noted that in some embodiments the terminal stores a correspondence between pressing force and movement speed, and after detecting the pressing force of the touch operation determines the prop's current movement speed directly from that correspondence. Optionally, the terminal stores a correspondence between pressing force and the strength of the target force, the two being positively correlated; after detecting the pressing force of the touch operation, the terminal adjusts the strength of the target force applied to the prop based on that correspondence, thereby adjusting the prop's movement speed. A sketch of one such mapping follows.
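A linear mapping is the simplest reading of this positive correlation; the constants below are illustrative assumptions:

```python
def speed_from_pressure(pressure, min_speed=2.0, max_speed=10.0,
                        max_pressure=1.0):
    """Map touch pressure to a movement speed (harder press, faster prop)."""
    ratio = max(0.0, min(pressure / max_pressure, 1.0))  # clamp to [0, 1]
    return min_speed + ratio * (max_speed - min_speed)
```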
It should be noted that steps 303 and 304 together implement controlling the virtual prop to move toward a target position in response to a touch operation on that position during the prop's movement along the target track. Fig. 6 is a schematic diagram of the effect of adjusting the motion track of a virtual prop: as shown in fig. 6 (a), when the user touches target position 601, the virtual prop moves toward target position 601; as shown in fig. 6 (b), when the user touches target position 602, the terminal controls the virtual prop to move toward target position 602. In the embodiment of the application, the user can flexibly control the motion track of the virtual prop through touch operations on the virtual scene, so that the prop hits its target more easily, improving human-computer interaction efficiency.
305. In response to the end of the touch operation on the virtual scene, the terminal determines the current movement direction and movement speed of the virtual prop and controls the prop to keep moving with that direction and speed.
In one possible implementation, in response to the end of the touch operation on the virtual scene, i.e., when the terminal detects that the user is no longer touching the scene, the terminal obtains the current movement direction and movement speed of the virtual prop and controls the prop to continue moving with that direction and speed until the prop hits a virtual object or virtual item in the scene, or until a touch operation on the scene is detected again, at which point the prop's motion track is again adjusted based on the operation position of the touch. It should be noted that in some embodiments the prop moves in a straight line while continuing with its current direction and speed; in other embodiments, the terminal simulates the effect of gravity on the prop in the virtual scene, and under gravity the prop's motion track becomes a parabola. Fig. 7 is a schematic diagram of the control method: as shown in fig. 7, while the prop is moving, the terminal checks whether the user is touching the screen; if so, it executes step 701, controlling the prop to move toward the touch point (the target position); if not, it executes step 702, controlling the prop to move with its current speed and direction. If the terminal detects that the user stops touching while the prop is moving toward the touch point, it executes step 702. In the embodiment of the application, when the user stops the touch operation the prop continues moving with its current direction and speed, so the user does not have to keep touching the virtual scene continuously; when the motion track needs adjusting again, the user simply touches any position in the scene again, which makes operation convenient. A sketch of this coasting behavior follows.
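A sketch of the coasting behavior after the touch ends; whether gravity applies is the per-embodiment choice described above, so both modes are covered by one flag (all names are assumptions):

```python
def coast_step(prop_pos, prop_vel, gravity=9.8, apply_gravity=True,
               dt=1.0 / 60.0):
    """Advance the prop with its current direction and speed; with gravity
    the free path bends into a parabola, without it the prop flies straight."""
    vx, vy = prop_vel
    if apply_gravity:
        vy -= gravity * dt               # gravity only alters the vertical speed
    new_pos = (prop_pos[0] + vx * dt, prop_pos[1] + vy * dt)
    return new_pos, (vx, vy)
```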
According to the technical scheme provided by the embodiments of the application, a touch operation performed by the user on the virtual scene is detected while the virtual prop is in motion, and the motion track of the virtual prop is adjusted based on the operation position of the touch operation, so that the virtual prop moves toward the touch point. The motion track is thus adjusted in real time in response to user operations, allowing the user to control it flexibly, which raises the rate at which the virtual prop hits its target and improves human-computer interaction efficiency.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
Fig. 8 is a schematic structural diagram of a control device for virtual props according to an embodiment of the present application, referring to fig. 8, the device includes:
a display module 801, configured to display a virtual scene;
the control module 802 is configured to control the virtual prop, in response to a triggering operation on the virtual prop, to move in the virtual scene along a target track, the target track being the original motion track of the virtual prop; and, during the movement of the virtual prop along the target track, to control the virtual prop to move toward a target position in response to a touch operation on that target position in the virtual scene.
In one possible implementation, the control module 802 is configured to:
during the movement of the virtual prop according to the target track, in response to detection of touch operation on any position in the virtual scene, detecting the touch operation on the virtual scene at a first reference frequency, and determining the operation position of the detected touch operation as the target position;
and controlling the virtual prop to move towards the target position.
In one possible implementation, the control module 802 is configured to:
in response to a touch operation on a target position in the virtual scene, the target position being within a target area, controlling the virtual prop to move toward the target position.
In one possible implementation, the control module 802 includes:
the determining submodule is used for responding to touch operation on a target position in the virtual scene during the movement of the virtual prop according to the target track, and determining the movement speed of the virtual prop;
and the control sub-module is used for controlling the virtual prop to move towards the target position according to the movement speed.
In one possible implementation, the determining submodule includes:
a force determination unit, configured to determine a target force in response to a touch operation on a target position in the virtual scene, the target force representing the attraction of the target position on the virtual prop, the movement speed of the virtual prop being positively correlated with the strength of the target force;
a speed determination unit, configured to determine the movement speed based on the target force.
In one possible implementation, the force determination unit is configured to:
in response to a touch operation on a target position in the virtual scene, obtaining limitation information on the acting direction of the force and strength information of the force;
determining at least two first reference positions in the virtual scene, centered on the target position, based on the target position and the limitation information on the acting direction, the at least two first reference positions indicating the acting direction of the target force;
and determining the strength of the target force based on the strength information of the force.
In one possible implementation, the control module 802 is configured to:
applying the target force to the virtual prop;
determining a second reference position among the at least two first reference positions according to a second reference frequency during the action of the target force;
and adjusting the acting direction of the target force based on the second reference position, and controlling the virtual prop to move under the action of the target force.
In one possible implementation, the strength of the target force is positively correlated with the distance between the virtual prop and the target location;
The control submodule is used for:
adjusting the strength of the target acting force based on the distance between the virtual prop and the target position during the movement of the virtual prop to the target position;
and controlling the virtual prop to move towards the target position according to the movement speed indicated by the target acting force.
In one possible implementation, the determining submodule is configured to:
detecting a pressing force of a touch operation in response to the touch operation on a target position in the virtual scene;
based on the pressing force, determining a movement speed of the virtual prop, wherein the movement speed is positively correlated with the pressing force.
In one possible implementation, the apparatus further includes:
the determining module is used for determining the current movement direction and movement speed of the virtual prop in response to the end of the touch operation on the virtual scene;
the control module 802 is configured to control the virtual prop to move according to the current movement direction and movement speed.
According to the device provided by the embodiment of the application, a touch operation performed by the user on the virtual scene is detected while the virtual prop is in motion, and the motion track of the virtual prop is adjusted based on the operation position of the touch operation, so that the virtual prop moves toward the touch point. The motion track is thus adjusted in real time in response to user operations, allowing the user to control it flexibly, which raises the rate at which the virtual prop hits its target and improves human-computer interaction efficiency.
It should be noted that the division into the functional modules above is merely illustrative of how the virtual prop is controlled; in practical applications, the functions may be allocated to different functional modules as needed, i.e., the internal structure of the device may be divided into different modules to complete all or part of the functions described above. In addition, the virtual prop control device provided in the above embodiment belongs to the same concept as the method embodiments; for its detailed implementation, refer to the method embodiments, which are not repeated here.
Fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present application. Illustratively, the terminal 900 is a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 900 may also be called a user device, a portable terminal, a laptop terminal, a desktop terminal, or another name.
In general, the terminal 900 includes: one or more processors 901 and one or more memories 902.
In one possible implementation, the processor 901 includes one or more processing cores, such as a 4-core processor or an 8-core processor. Optionally, the processor 901 is implemented in hardware as at least one of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). In one possible implementation, the processor 901 includes a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state; the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 901 integrates a GPU (Graphics Processing Unit) for rendering and drawing the content that the display screen needs to show. In some embodiments, the processor 901 further includes an AI (Artificial Intelligence) processor for computing operations related to machine learning.
In one possible implementation, the memory 902 includes one or more computer-readable storage media, which are, by way of example, non-transitory. The memory 902 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 902 is used to store at least one program code, which is executed by the processor 901 to implement the method for controlling a virtual prop provided by the method embodiments of the present application.
In some embodiments, the terminal 900 optionally further includes a peripheral interface 903 and at least one peripheral device. In one possible implementation, the processor 901, the memory 902, and the peripheral interface 903 are connected by a bus or signal line, and each peripheral device is connected to the peripheral interface 903 via a bus, signal line, or circuit board. Specifically, the peripheral device includes at least one of a radio frequency circuit 904, a display 905, a camera assembly 906, an audio circuit 907, and a power supply 909.
The peripheral interface 903 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 901 and the memory 902. In some embodiments, the processor 901, the memory 902, and the peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902, and the peripheral interface 903 are implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 904 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 904 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 904 is capable of communicating with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 also includes NFC (Near Field Communication) related circuits, which is not limited in the present application.
The display 905 is used to display a UI (User Interface). Illustratively, the UI includes graphics, text, icons, video, and any combination thereof. When the display 905 is a touch display, the display 905 also has the ability to capture touch signals at or above its surface. The touch signal can be input to the processor 901 as a control signal for processing. In this case, the display 905 is also used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there is one display 905, disposed on the front panel of the terminal 900; in other embodiments, there are at least two displays 905, disposed on different surfaces of the terminal 900 or in a folded design; in some embodiments, the display 905 is a flexible display disposed on a curved or folded surface of the terminal 900. The display 905 may even be arranged in an irregular, non-rectangular pattern, that is, an irregularly shaped screen. The display 905 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 906 is used to capture images or video. Optionally, the camera assembly 906 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, or the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, the camera assembly 906 also includes a flash. Optionally, the flash is a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
In some embodiments, the audio circuit 907 includes a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 901 for processing, or to the radio frequency circuit 904 for voice communication. Optionally, multiple microphones are disposed at different parts of the terminal 900 for stereo acquisition or noise reduction, or the microphone is an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. Optionally, the speaker is a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert an electrical signal not only into sound waves audible to humans but also into sound waves inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 907 further includes a headphone jack.
The power supply 909 is used to supply power to the various components in the terminal 900. Illustratively, the power supply 909 uses alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 909 includes a rechargeable battery, the rechargeable battery can support wired or wireless charging, and can also support fast-charge technology.
In some embodiments, terminal 900 can further include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyro sensor 912, pressure sensor 913, optical sensor 915, and proximity sensor 916.
In some embodiments, the acceleration sensor 911 can detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with respect to the terminal 900. For example, the acceleration sensor 911 is used to detect the components of gravitational acceleration on the three coordinate axes. In some embodiments, the processor 901 can control the display 905 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal acquired by the acceleration sensor 911. In some embodiments, the acceleration sensor 911 is also used to collect motion data of a game or of the user.
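As an illustration of how the gravity components might drive the landscape/portrait decision, the following minimal Python sketch compares the magnitudes of gravity along the two screen axes; the axis convention and the function name are assumptions, not part of this application.

def choose_orientation(gx: float, gy: float) -> str:
    # Gravity acting mostly along the x axis suggests the device is held
    # sideways; mostly along the y axis suggests it is held upright.
    return "landscape" if abs(gx) > abs(gy) else "portrait"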
In some embodiments, the gyro sensor 912 can detect the body direction and the rotation angle of the terminal 900, and the gyro sensor 912 can collect the 3D motion of the user on the terminal 900 in cooperation with the acceleration sensor 911. The processor 901 can realize the following functions according to the data collected by the gyro sensor 912: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
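One common way to combine the two sensors for this kind of tilt tracking is a complementary filter, sketched below for a single axis; the filter constant and the one-axis simplification are assumptions made purely for illustration.

import math

ALPHA = 0.98  # trust the gyroscope short-term, the accelerometer long-term

def update_tilt(prev_angle: float, gyro_rate: float,
                ax: float, ay: float, dt: float) -> float:
    # Integrate the gyroscope's angular rate, then pull the estimate
    # towards the gravity-based angle derived from the accelerometer.
    gyro_angle = prev_angle + gyro_rate * dt
    accel_angle = math.atan2(ax, ay)
    return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle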
In some embodiments, the pressure sensor 913 is disposed on a side frame of the terminal 900 and/or at a lower layer of the display 905. When the pressure sensor 913 is disposed on the side frame of the terminal 900, it can detect the user's grip signal on the terminal 900, and the processor 901 performs left/right hand recognition or quick operations according to the grip signal collected by the pressure sensor 913. When the pressure sensor 913 is disposed at the lower layer of the display 905, the processor 901 controls an operability control on the UI according to the user's pressure operation on the display 905. The operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 915 is used to collect the intensity of ambient light. In some embodiments, the processor 901 can control the display brightness of the display 905 based on the intensity of ambient light collected by the optical sensor 915. Specifically, when the ambient light intensity is high, the display brightness of the display 905 is increased; when the ambient light intensity is low, the display brightness of the display 905 is decreased. In another embodiment, the processor 901 is also able to dynamically adjust the shooting parameters of the camera assembly 906 based on the intensity of ambient light collected by the optical sensor 915.
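A minimal sketch of such a brightness mapping follows; the lux range and the linear ramp are assumptions for illustration only.

def brightness_from_lux(lux: float, lo: float = 10.0, hi: float = 10000.0) -> float:
    # Map ambient light intensity (lux) to a display brightness in [0, 1],
    # clamped so extreme readings stay within the valid range.
    t = (lux - lo) / (hi - lo)
    return max(0.0, min(1.0, t))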
The proximity sensor 916, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 900 and is used to collect the distance between the user and the front face of the terminal 900. In one embodiment, when the proximity sensor 916 detects that this distance gradually decreases, the processor 901 controls the display 905 to switch from the screen-on state to the screen-off state; when the proximity sensor 916 detects that the distance gradually increases, the processor 901 controls the display 905 to switch from the screen-off state to the screen-on state.
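The screen-state switching can be sketched as follows; the centimeter thresholds and the hysteresis band (used so small jitters in the reading do not toggle the screen) are assumptions, not values from this application.

NEAR_CM, FAR_CM = 3.0, 5.0  # hysteresis band to avoid flicker

def screen_state(distance_cm: float, currently_on: bool) -> bool:
    # Turn the screen off as the user approaches, back on as they move away.
    if currently_on and distance_cm < NEAR_CM:
        return False
    if not currently_on and distance_cm > FAR_CM:
        return True
    return currently_on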
Those skilled in the art will appreciate that the structure shown in Fig. 9 is not limiting, and that the terminal may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
Fig. 10 is a schematic diagram of a server according to an embodiment of the present application. The server 1000 may vary considerably in configuration or performance. In some embodiments, the server 1000 includes one or more processors (Central Processing Units, CPUs) 1001 and one or more memories 1002, where at least one program code is stored in the one or more memories 1002 and is loaded and executed by the one or more processors 1001 to implement the methods provided by the foregoing method embodiments. Of course, the server 1000 may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing device functions, which are not described here.
In an exemplary embodiment, a computer-readable storage medium, such as a memory comprising at least one program code, is also provided; the at least one program code is executable by a processor to perform the method for controlling a virtual prop in the above embodiments. For example, the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided, the computer program product comprising at least one computer program, the at least one computer program being stored in a computer readable storage medium. The processor of the computer device reads the at least one computer program from the computer-readable storage medium, and the processor executes the at least one computer program so that the computer device performs the control method of the virtual prop.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing description of the preferred embodiments of the present application is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements within the spirit and principles of the present application.

Claims (14)

1. A method for controlling a virtual prop, the method comprising:
displaying a virtual scene;
in response to a triggering operation on the virtual prop, controlling the virtual prop to move in the virtual scene according to a target track, wherein the target track is an original movement track of the virtual prop;
during the movement of the virtual prop according to the target track, in response to a touch operation on a target position in the virtual scene, detecting a pressing force of the touch operation; determining a movement speed of the virtual prop based on the pressing force, wherein the movement speed is positively correlated with the pressing force; and controlling the virtual prop to move to the target position according to the movement speed;
or, alternatively,
during the movement of the virtual prop according to the target track, in response to a touch operation on a target position in the virtual scene, determining a target acting force, wherein the target acting force is used for representing the attraction effect of the target position on the virtual prop, and the movement speed of the virtual prop is positively correlated with the strength of the target acting force; determining a movement speed of the virtual prop based on the target acting force; adjusting the strength of the target acting force based on a distance between the virtual prop and the target position during movement of the virtual prop to the target position, the strength of the target acting force being positively correlated with the distance between the virtual prop and the target position;
and controlling the virtual prop to move to the target position according to the movement speed indicated by the target acting force.
2. The method according to claim 1, wherein the method further comprises:
during the movement of the virtual prop according to the target track, in response to detection of a touch operation on any position in the virtual scene, detecting the touch operation on the virtual scene at a first reference frequency, and determining the operation position of the detected touch operation as the target position;
and controlling the virtual prop to move towards the target position.
3. The method according to claim 1, wherein the method further comprises:
in response to a touch operation on a target position in the virtual scene, the target position being in a target area, controlling the virtual prop to move towards the target position.
4. The method of claim 1, wherein the determining a target effort in response to a touch operation to a target location in the virtual scene comprises:
in response to a touch operation on a target position in the virtual scene, acquiring direction limitation information of the acting force and strength information of the acting force;
determining at least two first reference positions in the virtual scene, centered on the target position, based on the target position and the direction limitation information, wherein the at least two first reference positions are used to indicate the direction of the target acting force;
and determining the strength of the target acting force based on the strength information.
5. The method of claim 4, wherein the controlling the movement of the virtual prop to the target location comprises:
applying the target acting force to the virtual prop;
determining a second reference position among the at least two first reference positions according to a second reference frequency while the target acting force is applied;
and adjusting the acting direction of the target acting force based on the second reference position, and controlling the virtual prop to move in the acting direction of the target acting force.
6. The method of claim 1, wherein, after the controlling the virtual prop to move to the target position in the virtual scene in response to the touch operation on the target position during the movement of the virtual prop according to the target track, the method further comprises:
determining a current movement direction and movement speed of the virtual prop in response to a touch operation on the virtual scene;
and controlling the virtual prop to move according to the current movement direction and movement speed.
7. A control device for a virtual prop, the device comprising:
the display module is used for displaying the virtual scene;
the control module is used for controlling, in response to a triggering operation on the virtual prop, the virtual prop to move in the virtual scene according to a target track, wherein the target track is the original movement track of the virtual prop;
the control module further comprises a determination submodule and a control submodule;
the determining submodule is used for detecting, in response to a touch operation on a target position in the virtual scene during the movement of the virtual prop according to the target track, the pressing force of the touch operation, and determining a movement speed of the virtual prop based on the pressing force, wherein the movement speed is positively correlated with the pressing force;
the control submodule is used for controlling the virtual prop to move towards the target position according to the movement speed;
or, alternatively,
the determining submodule includes:
the acting force determining unit is used for determining a target acting force in response to a touch operation on a target position in the virtual scene during the movement of the virtual prop according to the target track, wherein the target acting force is used for representing the attraction exerted on the virtual prop by the target position, and the movement speed of the virtual prop is positively correlated with the strength of the target acting force;
a speed determining unit for determining a movement speed of the virtual prop based on the target acting force;
the control submodule is used for adjusting the strength of the target acting force based on the distance between the virtual prop and the target position during the movement of the virtual prop to the target position, the strength of the target acting force being positively correlated with the distance between the virtual prop and the target position; and controlling the virtual prop to move to the target position according to the movement speed indicated by the target acting force.
8. The apparatus of claim 7, wherein the control module is configured to:
during the movement of the virtual prop according to the target track, in response to detection of a touch operation on any position in the virtual scene, detecting the touch operation on the virtual scene at a first reference frequency, and determining the operation position of the detected touch operation as the target position;
and controlling the virtual prop to move towards the target position.
9. The apparatus of claim 7, wherein the control module is configured to:
in response to a touch operation on a target position in the virtual scene, the target position being in a target area, controlling the virtual prop to move towards the target position.
10. The apparatus according to claim 7, wherein the force determination unit is configured to:
in response to a touch operation on a target position in the virtual scene, acquiring direction limitation information of the acting force and strength information of the acting force;
determining at least two first reference positions in the virtual scene, centered on the target position, based on the target position and the direction limitation information, wherein the at least two first reference positions are used to indicate the direction of the target acting force;
and determining the strength of the target acting force based on the strength information.
11. The apparatus of claim 10, wherein the control module is configured to:
applying the target acting force to the virtual prop;
determining a second reference position among the at least two first reference positions according to a second reference frequency while the target acting force is applied;
and adjusting the acting direction of the target acting force based on the second reference position, and controlling the virtual prop to move in the acting direction of the target acting force.
12. The apparatus of claim 7, wherein the apparatus further comprises:
a determining module, used for determining the current movement direction and movement speed of the virtual prop in response to a touch operation on the virtual scene;
wherein the control module is further used for controlling the virtual prop to move according to the current movement direction and movement speed.
13. A computer device comprising one or more processors and one or more memories, the one or more memories having stored therein at least one computer program loaded and executed by the one or more processors to implement the operations performed by the method of controlling a virtual prop of any of claims 1 to 6.
14. A computer readable storage medium having stored therein at least one computer program loaded and executed by a processor to implement the operations performed by the method of controlling a virtual prop of any of claims 1 to 6.
CN202110553093.0A 2021-05-20 Virtual prop control method and device, computer equipment and storage medium, Active, CN113509729B (en)

Priority Application (1)

Application Number: CN202110553093.0A
Priority Date / Filing Date: 2021-05-20
Title: Virtual prop control method and device, computer equipment and storage medium

Publications (2)

CN113509729A (en), published 2021-10-19
CN113509729B (en), published 2023-10-03

Family ID: 78065069

Country Status (1): CN, CN113509729B (en)

Citations (7)

* Cited by examiner, † Cited by third party

Publication number, Priority date, Publication date, Assignee, Title:

CN105597315A *, 2015-12-17, 2016-05-25, NetEase (Hangzhou) Network Co., Ltd., Virtual object throwing control method and device
WO2017133601A1 *, 2016-02-01, 2017-08-10, Tencent Technology (Shenzhen) Co., Ltd., Method for determining a movement trace, and user equipment
CN108837507A *, 2018-05-29, 2018-11-20, NetEase (Hangzhou) Network Co., Ltd., Virtual item control method and device, electronic equipment, storage medium
WO2020024806A1 *, 2018-08-02, 2020-02-06, Tencent Technology (Shenzhen) Co., Ltd., Method and device for controlling interaction between virtual object and throwing object, and storage medium
WO2020024726A1 *, 2018-08-02, 2020-02-06, Tencent Technology (Shenzhen) Co., Ltd., Method for controlling movement of virtual prop, terminal, and storage medium
CN110917619A *, 2019-11-18, 2020-03-27, Tencent Technology (Shenzhen) Co., Ltd., Interactive property control method, device, terminal and storage medium
CN111265858A *, 2020-01-15, 2020-06-12, Tencent Technology (Shenzhen) Co., Ltd., Operation control method, operation control device, storage medium, and electronic device

Family Cites Families (1)

JP6751565B2 *, 2016-01-29, 2020-09-09, Nintendo Co., Ltd., Golf game device, golf game control program, golf game system and golf game control method


Similar Documents

Publication Number and Title
CN111589128B (en) Operation control display method and device based on virtual scene
CN109529319B (en) Display method and device of interface control and storage medium
CN111013142B (en) Interactive effect display method and device, computer equipment and storage medium
CN110141859B (en) Virtual object control method, device, terminal and storage medium
CN111603771B (en) Animation generation method, device, equipment and medium
CN111589127B (en) Control method, device and equipment of virtual role and storage medium
CN111589136B (en) Virtual object control method and device, computer equipment and storage medium
CN111672106B (en) Virtual scene display method and device, computer equipment and storage medium
CN111589146A (en) Prop operation method, device, equipment and storage medium based on virtual environment
CN111596838B (en) Service processing method and device, computer equipment and computer readable storage medium
CN113398572B (en) Virtual item switching method, skill switching method and virtual object switching method
CN113041620B (en) Method, device, equipment and storage medium for displaying position mark
CN111013137B (en) Movement control method, device, equipment and storage medium in virtual scene
CN111589116B (en) Method, device, terminal and storage medium for displaying function options
CN110833695B (en) Service processing method, device, equipment and storage medium based on virtual scene
CN112704876A (en) Method, device and equipment for selecting virtual object interaction mode and storage medium
CN113134232B (en) Virtual object control method, device, equipment and computer readable storage medium
CN111672115B (en) Virtual object control method and device, computer equipment and storage medium
CN112274936B (en) Method, device, equipment and storage medium for supplementing sub-props of virtual props
CN112755517B (en) Virtual object control method, device, terminal and storage medium
CN111752697B (en) Application program running method, device, equipment and readable storage medium
CN112121438B (en) Operation prompting method, device, terminal and storage medium
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN112306332A (en) Method, device and equipment for determining selected target and storage medium
CN113559494B (en) Virtual prop display method, device, terminal and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
REG: Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40053530)
GR01: Patent grant