US11513657B2 - Method and apparatus for controlling movement of virtual object, terminal, and storage medium - Google Patents

Method and apparatus for controlling movement of virtual object, terminal, and storage medium

Info

Publication number
US11513657B2
Authority
US
United States
Prior art keywords
joystick
virtual
virtual joystick
movable area
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/359,497
Other languages
English (en)
Other versions
US20210326027A1 (en
Inventor
Junxiang WANG
Yizhong Hu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Assigned to TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED reassignment TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HU, YIZHONG, WANG, Junxiang
Publication of US20210326027A1 publication Critical patent/US20210326027A1/en
Application granted granted Critical
Publication of US11513657B2 publication Critical patent/US11513657B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements for video game devices for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145: Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens

Definitions

  • Embodiments of this application relate to the field of computer technologies, and in particular, to a method and an apparatus for controlling movements of a virtual object, a terminal, and a storage medium.
  • a user controls a virtual object to move in the virtual environment based on a touch operation on a screen, such as by a drag operation on a virtual joystick in an interface.
  • Embodiments of this application provide a method and an apparatus for controlling movements of a virtual object, a terminal, and a storage medium.
  • the technical solutions are as follows:
  • the present disclosure provides a method for controlling movements of a virtual object.
  • the method includes displaying a target perspective picture of a target application, and superimposing a virtual joystick and a movable area of the virtual joystick on the target perspective picture; starting the virtual joystick when a trigger operation corresponding to the virtual joystick is received; adjusting a position of the virtual joystick in the movable area according to a position change of a touch point when the virtual joystick is in an activated state, the position of the virtual joystick and a position of the touch point changing synchronously in real time when the touch point moves within an effective touch range, the effective touch range being larger than the movable area; and controlling a virtual object to move according to the position of the virtual joystick.
  • the embodiments of this application provide an apparatus for controlling movements of a virtual object, including a display module, configured to: display a target perspective picture of a target application, and superimpose a virtual joystick and a movable area of the virtual joystick on the target perspective picture; a starting module, configured to start the virtual joystick when a trigger operation corresponding to the virtual joystick is received; an adjustment module, configured to adjust a position of the virtual joystick in the movable area according to a position change of a touch point when the virtual joystick is in an activated state, the position of the virtual joystick and a position of the touch point changing synchronously in real time when the touch point moves within an effective touch range, the effective touch range being larger than the movable area; and a control module, configured to control a virtual object to move according to the position of the virtual joystick.
  • an embodiment of this application provides a terminal, including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement: displaying a target perspective picture of a target application, and superimposing a virtual joystick and a movable area of the virtual joystick on the target perspective picture; starting the virtual joystick when a trigger operation corresponding to the virtual joystick is received; adjusting a position of the virtual joystick in the movable area according to a position change of a touch point when the virtual joystick is in an activated state, the position of the virtual joystick and a position of the touch point changing synchronously in real time when the touch point moves within an effective touch range, the effective touch range being larger than the movable area; and controlling a virtual object to move according to the position of the virtual joystick.
  • an embodiment of this application provides a non-transitory computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the method for controlling movements of a virtual object described in the foregoing aspect.
  • the position of the virtual joystick in the movable area is adjusted according to the position change of the touch point.
  • the position of the virtual joystick and the position of the touch point change synchronously in real time when the touch point moves within the effective touch range.
  • compared with the related art, the moving distance of the touch point in this application is smaller, which improves the operation efficiency of the user.
  • FIG. 1 is a flowchart of a method for controlling movements of a virtual object according to an embodiment of this application.
  • FIG. 2 is a schematic exemplary diagram of a method for controlling movements of a virtual object.
  • FIG. 3 is a flowchart of a method for controlling movements of a virtual object according to another embodiment of this application.
  • FIG. 4 is a schematic exemplary diagram of calculating an actual position of a virtual joystick at a second moment.
  • FIG. 5 is an exemplary flowchart of a method for controlling movements of a virtual object.
  • FIG. 6 is a schematic exemplary diagram of reverse movement of a touch point.
  • FIG. 7 is a schematic exemplary diagram of vertical movement of a touch point.
  • FIG. 8 is a block diagram of an apparatus for controlling movements of a virtual object according to an embodiment of this application.
  • FIG. 9 is a block diagram of an apparatus for controlling movements of a virtual object according to another embodiment of this application.
  • FIG. 10 is a structural block diagram of a terminal according to an embodiment of this application.
  • a virtual environment is displayed (or provided) by an application when run on a terminal.
  • the virtual environment may be a simulated environment of a real world, or may be a semi-simulated semi-fictional three-dimensional environment, or may be an entirely fictional three-dimensional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment. A description is made by using an example in which the virtual environment is a three-dimensional virtual environment in the following embodiment, but this application is not limited thereto.
  • a virtual object is a movable or non-movable object in a virtual environment.
  • the movable object may be at least one of a virtual character, a virtual animal, and a cartoon character.
  • when the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional model created based on a skeletal animation technology.
  • Each virtual object has a respective shape and size in the three-dimensional virtual environment, and occupies some space in the three-dimensional virtual environment.
  • a virtual joystick is a control configured to control a virtual object to move in a virtual environment.
  • a user may control movements of the virtual joystick by using a touch operation on a terminal screen, to further control movements of the virtual object.
  • the virtual joystick may be a circle.
  • the virtual joystick may alternatively be a triangle, a square, a hexagon, an octagon, or the like, or may be other irregular shapes, and this is not limited in this embodiment of this application.
  • the virtual joystick moves in a movable area.
  • a shape of the movable area may be the same as or different from a shape of the virtual joystick.
  • when the virtual joystick is a circle, the movable area is also a circle, and the two are concentric circles.
  • when the virtual joystick is a hexagon, the movable area is an octagon, and the centers of the two coincide.
  • the method for controlling movements of a virtual object provided in this application is applicable to a terminal.
  • the terminal may be a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, a video game console, a Moving Picture Experts Group Audio Layer IV (MP4) player, or the like.
  • the virtual environment includes a virtual object.
  • the application is an application supporting a three-dimensional virtual environment.
  • the application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, or a game application such as a TPS game, a first-person shooting (FPS) game, a multiplayer online battle arena (MOBA) game, or a multiplayer gunfight survival game.
  • the application may be a standalone application, such as a standalone 3D game application, or may be a network online application.
  • in the related art, a touch point of a user's finger on a screen can only move in the movable area of the virtual joystick, which is a relatively severe limitation.
  • a method is provided in the related art. That is, when the touch point is outside the movable area of the virtual joystick, the virtual joystick still maintains an activated state. In this case, a connecting line between a position of the touch point and the center of the movable area is obtained, and coordinates of an intersection of the connecting line and an edge of the movable area are determined as a position of the virtual joystick.
  • the position of the virtual joystick in the movable area is adjusted according to the position change of the touch point.
  • the position of the virtual joystick and the position of the touch point change synchronously in real time when the touch point moves within the effective touch range.
  • FIG. 1 is a flowchart of a method for controlling movements of a virtual object according to an embodiment of this application.
  • the method may include the following steps (101 to 104):
  • Step 101 Display a target perspective picture of a target application, and superimpose a virtual joystick and a movable area of the virtual joystick on the target perspective picture.
  • the target application may be an application based on a virtual environment.
  • the application may be a virtual reality application, a three-dimensional map application, a military simulation program, or a game application, and the game application may be any one of a TPS game, an FPS game, and a MOBA game.
  • the terminal runs the target application to display the target perspective picture.
  • the target perspective picture is a picture obtained by observing the virtual environment from a target perspective of the virtual object.
  • the target perspective is a perspective of observing the virtual environment by using a first-person perspective of the virtual object
  • alternatively, the target perspective is a perspective of observing the virtual environment by using a third-person perspective of the virtual object.
  • the virtual joystick and the movable area of the virtual joystick may be superimposed and displayed on the target perspective picture.
  • the terminal may further display at least one virtual object included in the virtual environment.
  • the virtual joystick is configured to control a virtual object to move in a virtual environment.
  • for the virtual joystick, reference is made to the brief introduction of the foregoing terms, and details are not repeated herein.
  • the movable area of the virtual joystick is used for limiting a moving range of the virtual joystick, so that the virtual joystick can only move in the movable area.
  • in an initial state, the virtual joystick is located at the center of the movable area.
  • a shape of the movable area may be the same as or different from a shape of the virtual joystick.
  • when the virtual joystick is a circle, the movable area is also a circle, and the two are concentric circles.
  • when the virtual joystick is a hexagon, the movable area is an octagon, and the centers of the two coincide.
  • Step 102 Start the virtual joystick when a trigger operation corresponding to the virtual joystick is obtained.
  • the terminal may detect whether the trigger operation that is performed by the user on the screen and is used for starting the virtual joystick is received.
  • the terminal starts the virtual joystick when obtaining the trigger operation corresponding to the virtual joystick.
  • the trigger operation may be a press operation, a single-click/tap operation, a swipe operation, or the like, and this is not limited in this embodiment of this application.
  • the terminal may detect a position of the trigger operation by using a touchscreen, that is, a position of a touch point of the user's finger on the screen, and determine whether the touch point is within a trigger range of the virtual joystick.
  • when the touch point is within the trigger range of the virtual joystick, the terminal starts the virtual joystick according to the trigger operation; and when the touch point is not within the trigger range of the virtual joystick, the terminal does not respond to the trigger operation.
  • the trigger range of the virtual joystick may be the movable area of the virtual joystick, or may be an area larger than the movable area, and this is not limited in this embodiment of this application.
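  • as a hedged sketch of this hit test (the function name and the choice of a circular trigger range are assumptions; the embodiment leaves the exact shape of the range open), the check reduces to a distance comparison:

```python
import math

def is_within_trigger_range(touch_x: float, touch_y: float,
                            center_x: float, center_y: float,
                            trigger_radius: float) -> bool:
    """Return True if the touch point lies inside the joystick's trigger range.

    trigger_radius may equal the radius of the movable area or be larger,
    matching the two options described above. All coordinates are absolute
    screen coordinates.
    """
    return math.hypot(touch_x - center_x, touch_y - center_y) <= trigger_radius
```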
  • when the virtual joystick is no longer in the activated state, the terminal turns off the virtual joystick.
  • Step 103 Adjust a position of the virtual joystick in the movable area according to a position change of a touch point when the virtual joystick is in an activated state, the position of the virtual joystick and a position of the touch point changing synchronously in real time when the touch point moves within an effective touch range, the effective touch range including and being larger than the movable area.
  • the terminal determines that the virtual joystick is in the activated state.
  • the user swipes on the terminal screen to change the position of the touch point.
  • the terminal may determine a position change of the touch point between two moments, and may further adjust the position of the virtual joystick in the movable area according to the position change of the touch point.
  • the position of the virtual joystick and the position of the touch point change synchronously in real time when the touch point moves within the effective touch range
  • the effective touch range includes and is larger than the movable area.
  • the effective touch range may be the entire screen of the terminal, or may be a partial screen area of the terminal, and this is not limited in this embodiment of this application.
  • when the touch point moves within the movable area, the position of the virtual joystick is the position of the touch point.
  • when the touch point moves outside the movable area, the terminal obtains the position of the virtual joystick through calculation according to the position of the touch point.
  • Step 104 Control a virtual object to move according to the position of the virtual joystick.
  • the terminal may control the virtual object to move according to the position, such as a direction and a distance, of the virtual joystick.
  • different marks such as an alert mark, a material mark, an attack mark, or a defense mark may be generated according to the position of the virtual joystick.
  • the alert mark is used for alerting the virtual object, or alerting the virtual object to a marked position.
  • the material mark is used for prompting the virtual object that there is material at the marked position.
  • the attack mark is used for prompting the virtual object to start attacking or prompting the virtual object to attack the marked position.
  • the defense mark is used for prompting the virtual object to perform defense.
  • FIG. 2 is a schematic exemplary diagram of a method for controlling movements of a virtual object.
  • An example in which the virtual joystick and the movable area of the virtual joystick are both circles and the touch point moves in a reverse direction is used.
  • a virtual joystick 10 is located at the center of a movable area 20.
  • When obtaining a trigger operation corresponding to the virtual joystick 10, the terminal starts the virtual joystick 10; when a touch point 30 moves in the movable area 20, a position of the touch point 30 is the same as a position of the virtual joystick 10; when the touch point 30 moves outside the movable area 20, the virtual joystick 10 moves on an edge of the movable area 20; and when the touch point 30 starts to move in a reverse direction, the virtual joystick 10 located on the edge of the movable area 20 also starts to move in a reverse direction, and the two change synchronously in real time.
  • the position of the virtual joystick in the movable area is adjusted according to the position change of the touch point.
  • the position of the virtual joystick and the position of the touch point change synchronously in real time when the touch point moves within the effective touch range.
  • FIG. 3 is a flowchart of a method for controlling movements of a virtual object according to another embodiment of this application.
  • the method may include the following steps:
  • Step 301 Display a target perspective picture of a target application, and superimpose a virtual joystick and a movable area of the virtual joystick on the target perspective picture.
  • in an initial state, the virtual joystick is located at the center of the movable area.
  • This step is the same as or similar to step 101 in the embodiment in FIG. 1 , and details are not repeated herein.
  • Step 302 Start the virtual joystick when a trigger operation corresponding to the virtual joystick is obtained.
  • This step is the same as or similar to step 102 in the embodiment in FIG. 1 , and details are not repeated herein.
  • the terminal may determine in real time whether the virtual joystick is still started. When it is determined that the virtual joystick is still started, the terminal continues to keep the virtual joystick in the activated state; and when it is determined that the virtual joystick is not started, the terminal turns off the virtual joystick.
  • when the virtual joystick is still in the activated state, step 303 is performed.
  • Step 303 Obtain a position change of a touch point from a first moment to a second moment.
  • the terminal may obtain a position of the touch point in real time, so that the terminal may obtain a position of the touch point at the first moment and a position of the touch point at the second moment, to further obtain the position change of the touch point from the first moment to the second moment.
  • the position change is a vector including a direction and a distance.
  • an interval between the first moment and the second moment is a target duration, and the target duration is not limited in this embodiment of this application.
  • the position of the touch point at the first moment may be represented as (X1, Y1), and the position of the touch point at the second moment may be represented as (X2, Y2).
  • Step 304 Calculate an estimated position of the virtual joystick at the second moment according to the position change and a position of the virtual joystick at the first moment.
  • the terminal may further obtain the position of the virtual joystick at the first moment.
  • when the position of the touch point changes, the position of the virtual joystick also changes.
  • assuming that the virtual joystick achieves the same position change as the touch point, the estimated position at the second moment is obtained, that is, (A, B) = (x1 + X0, y1 + Y0).
  • the position of the virtual joystick at the first moment may be represented as (x1, y1), and the position change of the touch point may be represented as (X0, Y0).
  • in the initial state, the position of the virtual joystick at the first moment may be represented as (0, 0).
  • the terminal may determine whether the virtual joystick is located in the movable area when the virtual joystick is at the estimated position at the second moment. The terminal then performs the following step 305 when the virtual joystick is located outside the movable area; and performs the following step 306 when the virtual joystick is located in the movable area.
  • the terminal may calculate a distance between the estimated position of the virtual joystick at the second moment and the center (that is, the origin) of the movable area. If the distance is greater than the radius of the movable area, it is considered that the estimated position is outside the movable area; and if the distance is less than or equal to the radius of the movable area, it is considered that the estimated position is in the movable area.
  • the estimated position of the virtual joystick at the second moment may be represented as (A, B).
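  • a minimal sketch of steps 303 and 304 and the subsequent in/out test, assuming joystick coordinates are expressed relative to the center O(0, 0) of the movable area (function names are illustrative; the notation follows the description above):

```python
import math

def estimate_second_moment(joystick_first: tuple[float, float],
                           touch_first: tuple[float, float],
                           touch_second: tuple[float, float]) -> tuple[float, float]:
    """Apply the touch point's position change (X0, Y0) between the two moments
    to the joystick position (x1, y1) at the first moment, yielding the
    estimated position (A, B) at the second moment."""
    x0 = touch_second[0] - touch_first[0]
    y0 = touch_second[1] - touch_first[1]
    return joystick_first[0] + x0, joystick_first[1] + y0

def inside_movable_area(a: float, b: float, radius: float) -> bool:
    """The estimated position (A, B) is inside the circular movable area when
    its distance r to the center O(0, 0) does not exceed the radius R."""
    return math.hypot(a, b) <= radius
```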
  • Step 305 Calculate an actual position of the virtual joystick at the second moment according to the estimated position and the movable area when the estimated position is outside the movable area.
  • because the virtual joystick can only move in the movable area, when the estimated position is outside the movable area, the estimated position needs to be mapped to the actual position.
  • step 305 may include the following two steps:
  • FIG. 4 is a schematic exemplary diagram of calculating an actual position of a virtual joystick at a second moment.
  • in FIG. 4, the position of the touch point at the first moment is M(X1, Y1), the position of the touch point at the second moment is N(X2, Y2), the position of the virtual joystick at the first moment is P(x1, y1), the estimated position of the virtual joystick at the second moment is Q1(A, B), and the actual position of the virtual joystick at the second moment is Q2(x2, y2).
  • the coordinates of the intersection of the connecting line between the estimated position Q1(A, B) and the center O(0, 0) of the movable area with the edge of the movable area are Q2(x2, y2).
  • x2 = R*sin θ and y2 = R*cos θ, where R is the radius of the movable area and θ is the angle between the connecting line OQ1 and the vertical axis.
  • because tan θ = A/B, θ = arctan(A/B), so that x2 = R*sin(arctan(A/B)) and y2 = R*cos(arctan(A/B)).
  • the coordinates of the intersection Q2(x2, y2) are determined as the actual position of the virtual joystick at the second moment.
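  • a hedged sketch of this mapping. The radial scaling below is algebraically equivalent to x2 = R*sin(arctan(A/B)), y2 = R*cos(arctan(A/B)) when B > 0, and additionally covers every quadrant and the B = 0 case (an implementation choice, not a formula stated in this application):

```python
import math

def actual_position(a: float, b: float, radius: float) -> tuple[float, float]:
    """Map the estimated position Q1(A, B) to the actual position Q2(x2, y2).

    If Q1 is inside the movable area it is used as-is (step 306); otherwise Q2
    is the intersection of the connecting line O-Q1 with the edge of the
    movable area, obtained by scaling Q1 back onto the circle of radius R.
    """
    r = math.hypot(a, b)            # distance from the center O(0, 0)
    if r <= radius:
        return a, b                 # step 306: estimated position is actual
    scale = radius / r              # step 305: project onto the edge
    return a * scale, b * scale
```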
  • Step 306 Determine the estimated position as the actual position of the virtual joystick at the second moment when the estimated position is in the movable area.
  • the terminal directly determines the estimated position as the actual position of the virtual joystick at the second moment when the estimated position is in the movable area.
  • Step 307 Calculate a direction and a distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick.
  • the terminal may calculate the direction and the distance of the virtual joystick relative to the center of the movable area after obtaining the foregoing actual position of the virtual joystick at the second moment.
  • Step 308 Determine a moving direction and a moving speed of a virtual object according to the direction and the distance.
  • the terminal determines the moving direction and the moving speed of the virtual object according to the direction and the distance of the virtual joystick.
  • the direction of the virtual joystick may be mapped to the moving direction of the virtual object, and the distance of the virtual joystick may be mapped to the moving speed of the virtual object. When the distance is larger, the moving speed of the virtual object is higher.
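  • one plausible reading of steps 307 and 308 (the linear speed scaling and the names below are assumptions; the embodiment states only that a larger distance maps to a higher speed):

```python
import math

def movement_from_joystick(x: float, y: float, radius: float,
                           max_speed: float) -> tuple[float, float]:
    """Derive a moving direction and a moving speed for the virtual object
    from the joystick position relative to the center of the movable area."""
    distance = math.hypot(x, y)                      # distance from the center
    direction = math.atan2(y, x)                     # direction, in radians
    speed = max_speed * min(distance / radius, 1.0)  # larger distance -> higher speed
    return direction, speed
```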
  • Step 309 Control the virtual object to move according to the moving direction and the moving speed.
  • the virtual object is controlled to move in the virtual environment, for example, walk, run, or jump.
  • in the technical solution provided in this embodiment of this application, it is determined whether the estimated position of the virtual joystick is in the movable area, and when the estimated position is outside the movable area, the coordinates of the intersection of the connecting line between the estimated position and the center of the movable area with the edge of the movable area are determined as the actual position of the virtual joystick at the second moment.
  • in this way, a moving distance of the touch point is smaller when the operation is completed, that is, a swipe distance of the user's finger is smaller, which further improves the operation efficiency of the user.
  • FIG. 5 is an exemplary flowchart of a method for controlling movements of a virtual object.
  • Step 501 Obtain a trigger operation corresponding to a virtual joystick, and start the virtual joystick.
  • Step 502 Determine whether the virtual joystick is still in an activated state.
  • When the virtual joystick is not in the activated state, the following step 503 is performed; and when the virtual joystick is still in the activated state, the following step 504 is performed.
  • Step 503 Turn off the virtual joystick.
  • Step 504 Obtain position coordinates (X, Y) of a touch point in real time.
  • Step 505 Obtain a position change (X0, Y0) of the touch point from a first moment to a second moment.
  • Step 506 Calculate coordinates (A, B) of an estimated position of the virtual joystick at the second moment according to the position change (X0, Y0) and coordinates (x1, y1) of the virtual joystick at the first moment.
  • Step 507 Calculate a distance r between the estimated position (A, B) of the virtual joystick at the second moment and the center (0, 0) of a movable area.
  • Step 508 Determine whether the distance r is greater than a radius R of the movable area.
  • Step 509 When the distance r is not greater than the radius R, determine the estimated position (A, B) as an actual position of the virtual joystick at the second moment.
  • Step 510 When the distance r is greater than the radius R, determine coordinates of an intersection of a connecting line between the estimated position and the center of the movable area with an edge of the movable area as the actual position of the virtual joystick at the second moment.
  • Step 511 Calculate a direction and a distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick.
  • Step 512 Determine a moving direction and a moving speed of a virtual object according to the direction and the distance.
  • Step 513 Control the virtual object to move according to the moving direction and the moving speed.
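  • tying steps 501 to 513 together, the following is a compact per-frame sketch under the same assumptions as above (the function and constant names are illustrative, and move_virtual_object is a stub for an engine call that this application does not specify):

```python
import math

RADIUS = 80.0     # assumed radius R of the movable area, in pixels
MAX_SPEED = 5.0   # assumed speed cap for the virtual object

def move_virtual_object(direction: float, speed: float) -> None:
    """Stub for the game-engine call that actually moves the virtual object."""
    print(f"move: direction={direction:.2f} rad, speed={speed:.2f}")

def on_touch_moved(state: dict, touch: tuple[float, float]) -> None:
    """One iteration of steps 502 to 513 for a joystick kept in `state`.

    `state` holds "active" (bool), "joy" (joystick position relative to the
    center O(0, 0)), and "touch" (touch position at the previous moment).
    """
    if not state["active"]:                            # steps 502-503
        return
    x0 = touch[0] - state["touch"][0]                  # steps 504-505: (X0, Y0)
    y0 = touch[1] - state["touch"][1]
    a = state["joy"][0] + x0                           # step 506: estimated (A, B)
    b = state["joy"][1] + y0
    r = math.hypot(a, b)                               # step 507: distance r
    if r > RADIUS:                                     # step 508
        a, b = a * RADIUS / r, b * RADIUS / r          # step 510: clamp to the edge
    state["joy"] = (a, b)                              # step 509 otherwise
    state["touch"] = touch
    direction = math.atan2(b, a)                       # step 511
    speed = MAX_SPEED * math.hypot(a, b) / RADIUS      # step 512
    move_virtual_object(direction, speed)              # step 513

# Example: joystick activated at the center; touch swipes right by 100 px.
state = {"active": True, "joy": (0.0, 0.0), "touch": (500.0, 300.0)}
on_touch_moved(state, (600.0, 300.0))   # joystick clamps to the edge at (80, 0)
```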
  • FIG. 6 is a schematic exemplary diagram of reverse movement of a touch point.
  • Parts (a), (b), and (c) in FIG. 6 are a technical solution provided in the related art; and parts (d) and (e) in FIG. 6 are a technical solution provided in this application.
  • as shown in parts (a), (b), and (c) in FIG. 6, when a touch point 30 outside a movable area 20 of a virtual joystick 10 moves in a reverse direction, a position of the virtual joystick 10 remains unchanged while the touch point 30 moves to an edge of the movable area 20; and only after the touch point 30 moves inside the edge of the movable area 20 does the virtual joystick 10 start to move in the reverse direction.
  • in the technical solution of this application, by contrast, the direction of the virtual joystick can be changed quickly without the need to compensate for the distance in the reverse direction, so that the virtual object can be moved quickly.
  • FIG. 7 is a schematic exemplary diagram of vertical movement of a touch point.
  • Parts (a) and (b) in FIG. 7 are a technical solution provided in the related art; and parts (c) and (d) in FIG. 7 are a technical solution provided in this application.
  • in this application, a moving distance of the touch point is smaller when the virtual joystick moves by the same distance, that is, a swipe distance of the user's finger is smaller, which improves the operation efficiency of the user.
  • a moving distance of the user's finger is smaller when the virtual joystick moves by the same distance. Therefore, the user may move the virtual joystick to a position that is convenient for finger operation, and perform touch operations such as rotation, swipe, and pressing at the position, to control movements of the virtual joystick. In this way, a problem of inconsistent operation comfort caused by different finger lengths of users can be alleviated.
  • FIG. 8 is a block diagram of an apparatus for controlling movements of a virtual object according to an embodiment of this application.
  • the apparatus has functions of implementing the foregoing method examples. The functions may be implemented by using hardware, or may be implemented by hardware executing corresponding software.
  • the apparatus may be the terminal described above, or may be disposed on the terminal.
  • the apparatus 800 may include: a display module 810 , a starting module 820 , an adjustment module 830 , and a control module 840 .
  • the display module 810 is configured to: display a target perspective picture of a target application, and superimpose a virtual joystick and a movable area of the virtual joystick on the target perspective picture.
  • the starting module 820 is configured to start the virtual joystick when a trigger operation corresponding to the virtual joystick is obtained.
  • the adjustment module 830 is configured to adjust a position of the virtual joystick in the movable area according to a position change of a touch point when the virtual joystick is in an activated state, the position of the virtual joystick and a position of the touch point changing synchronously in real time when the touch point moves within an effective touch range, the effective touch range including and being larger than the movable area.
  • the control module 840 is configured to control a virtual object to move according to the position of the virtual joystick.
  • the position of the virtual joystick in the movable area is adjusted according to the position change of the touch point.
  • the position of the virtual joystick and the position of the touch point change synchronously in real time when the touch point moves within the effective touch range.
  • the adjustment module 830 includes: a variation obtaining unit 831 , an estimated position calculation unit 832 , and an actual position calculation unit 833 .
  • the variation obtaining unit 831 is configured to obtain the position change of the touch point from a first moment to a second moment.
  • the estimated position calculation unit 832 is configured to calculate an estimated position of the virtual joystick at the second moment according to the position change and a position of the virtual joystick at the first moment.
  • the actual position calculation unit 833 is configured to calculate an actual position of the virtual joystick at the second moment according to the estimated position and the movable area when the estimated position is outside the movable area.
  • the movable area is a circle; and the actual position calculation unit 833 is configured to: calculate coordinates of an intersection of a connecting line between the estimated position and the center of the movable area with an edge of the movable area; and determine the coordinates of the intersection as the actual position of the virtual joystick at the second moment.
  • the apparatus 800 further includes a determining module 850 .
  • the determining module 850 is configured to determine the estimated position as the actual position of the virtual joystick at the second moment when the estimated position is in the movable area.
  • the control module 840 is configured to: calculate a direction and a distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick; determine a moving direction and a moving speed of the virtual object according to the direction and the distance; and control the virtual object to move according to the moving direction and the moving speed.
  • when the apparatus provided in the foregoing embodiments implements its functions, the division into the foregoing functional modules is merely used as an example for description.
  • in practice, the functions may be allocated to different functional modules as required, that is, the internal structure of the device is divided into different functional modules, to implement all or some of the functions described above.
  • the apparatus and method embodiments provided in the foregoing embodiments belong to the same concept. For the specific implementation process, reference may be made to the method embodiments, and details are not described herein again.
  • FIG. 10 is a structural block diagram of a terminal according to an embodiment of this application.
  • a terminal 1000 includes a processor 1001 and a memory 1002 .
  • the processor 1001 may include one or more processing cores, for example, a 4-core processor or an 8-core processor.
  • the processor 1001 may be implemented by using at least one hardware form of a digital signal processor (DSP), a field programmable gate array (FPGA), and a programmable logic array (PLA).
  • the processor 1001 may alternatively include a main processor and a coprocessor.
  • the main processor is a processor that is configured to process data in an awake state, and is also referred to as a central processing unit (CPU); the coprocessor is a low-power processor configured to process data in an idle state.
  • the processor 1001 may be integrated with a graphics processing unit (GPU).
  • the GPU is configured to be responsible for rendering and drawing content that a display needs to display.
  • the processor 1001 may further include an artificial intelligence (AI) processor.
  • the AI processor is configured to process a computing operation related to machine learning.
  • the memory 1002 may include one or more computer-readable storage media.
  • the computer-readable storage media may be non-transient.
  • the memory 1002 may further include a high-speed random access memory and a nonvolatile memory, for example, one or more disk storage devices or flash storage devices.
  • the non-transient computer-readable storage medium in the memory 1002 is configured to store at least one instruction. The at least one instruction is executed by the processor 1001 to perform the method for controlling movements of a virtual object provided in the method embodiment in this application.
  • the terminal 1000 may optionally include a peripheral device interface 1003 and at least one peripheral device.
  • the processor 1001 , the memory 1002 , and the peripheral device interface 1003 may be connected by using a bus or a signal cable.
  • Each peripheral device may be connected to the peripheral device interface 1003 by using a bus, a signal cable, or a circuit board.
  • the peripheral device may include: at least one of a communication interface 1004 , a display screen 1005 , an audio circuit 1006 , a camera component 1007 , a positioning component 1008 , and a power supply 1009 .
  • the structure shown in FIG. 10 constitutes no limitation on the terminal 1000, and the terminal may include more or fewer components than those shown in the figure, or some components may be combined, or a different component deployment may be used.
  • a terminal is further provided in an exemplary embodiment.
  • the terminal includes a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the foregoing method for controlling movements of a virtual object.
  • a computer-readable storage medium is further provided, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set, when executed by a processor, implementing the foregoing method for controlling movements of a virtual object.
  • a computer program product is further provided, the computer program product, when executed by a processor, being used for implementing the foregoing method for controlling movements of a virtual object.
  • the term module, and other similar terms such as unit, subunit, and submodule, in this disclosure may refer to a software unit, a hardware unit, or a combination thereof.
  • a software unit (e.g., a computer program) may be developed using a computer programming language.
  • a hardware unit may be implemented using processing circuitry and/or memory.
  • each unit can be implemented using one or more processors (or processors and memory).
  • likewise, a processor (or processors and memory) can be used to implement one or more units.
  • moreover, each unit can be part of an overall unit that includes the functionalities of the unit.
  • “Plurality of” mentioned in this specification means two or more. “And/or” describes an association relationship for associated objects and represents that three relationships may exist. For example, A and/or B may represent the following three cases: only A exists, both A and B exist, and only B exists. The character “/” in this specification generally indicates an “or” relationship between the associated objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
US17/359,497 2019-06-05 2021-06-25 Method and apparatus for controlling movement of virtual object, terminal, and storage medium Active US11513657B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201910487940.0 2019-06-05
CN201910487940.0A CN110096214B (zh) 2019-06-05 2019-06-05 虚拟对象的移动控制方法、装置、终端和存储介质
PCT/CN2020/092342 WO2020244421A1 (zh) 2019-06-05 2020-05-26 虚拟对象的移动控制方法、装置、终端和存储介质

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092342 Continuation WO2020244421A1 (zh) 2019-06-05 2020-05-26 虚拟对象的移动控制方法、装置、终端和存储介质

Publications (2)

Publication Number Publication Date
US20210326027A1 US20210326027A1 (en) 2021-10-21
US11513657B2 true US11513657B2 (en) 2022-11-29

Family

ID=67450455

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/359,497 Active US11513657B2 (en) 2019-06-05 2021-06-25 Method and apparatus for controlling movement of virtual object, terminal, and storage medium

Country Status (6)

Country Link
US (1) US11513657B2 (ja)
JP (1) JP7238143B2 (ja)
KR (1) KR102539606B1 (ja)
CN (1) CN110096214B (ja)
SG (1) SG11202108601XA (ja)
WO (1) WO2020244421A1 (ja)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110096214B (zh) 2019-06-05 2021-08-06 腾讯科技(深圳)有限公司 虚拟对象的移动控制方法、装置、终端和存储介质
CN111589112B (zh) * 2020-04-24 2021-10-22 腾讯科技(深圳)有限公司 界面显示方法、装置、终端及存储介质
CN111632372A (zh) * 2020-06-03 2020-09-08 深圳市瑞立视多媒体科技有限公司 虚拟对象的控制方法、装置、设备及存储介质
CN114489457B (zh) * 2022-01-27 2024-01-19 北京字跳网络技术有限公司 虚拟对象的控制方法、装置、可读介质和电子设备
CN117427332A (zh) * 2022-07-12 2024-01-23 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、设备、存储介质及程序产品
CN115129224B (zh) * 2022-07-26 2023-08-04 网易(杭州)网络有限公司 移动控制的方法、装置、存储介质及电子设备
CN115460543B (zh) * 2022-08-31 2024-04-19 中国地质大学(武汉) 一种分布式环形栅栏覆盖方法、设备及存储设备


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107042018B (zh) * 2016-02-05 2018-09-18 腾讯科技(深圳)有限公司 控制对象的空间位置确定方法和装置
KR101984305B1 (ko) * 2017-04-24 2019-05-30 주식회사 넥슨코리아 인터페이스 제공 방법 및 장치

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6515689B1 (en) * 1998-07-10 2003-02-04 Fuji Photo Optical Co., Ltd. Control apparatus
US20110172013A1 (en) * 2010-01-06 2011-07-14 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) User interface processing apparatus, method of processing user interface, and program for processing user interface
US20110285636A1 (en) * 2010-05-20 2011-11-24 Howard John W Touch screen with virtual joystick and methods for use therewith
JP2013061803A (ja) 2011-09-13 2013-04-04 Sony Computer Entertainment Inc 情報処理装置、情報処理方法、およびコンテンツファイルのデータ構造
US10146343B2 (en) * 2014-06-26 2018-12-04 GungHo Online Entertainment, Inc. Terminal device having virtual operation key
US20150378459A1 (en) * 2014-06-26 2015-12-31 GungHo Online Entertainment, Inc. Terminal device
JP2016009473A (ja) 2014-06-26 2016-01-18 ガンホー・オンライン・エンターテイメント株式会社 端末装置
CN105302453A (zh) 2014-06-26 2016-02-03 工合线上娱乐株式会社 终端装置
EP3267300A1 (en) 2015-10-10 2018-01-10 Tencent Technology (Shenzhen) Co., Ltd Information processing method and terminal, and computer storage medium
CN107577345A (zh) 2017-09-04 2018-01-12 苏州英诺迈医学创新服务有限公司 一种控制虚拟人物漫游的方法及装置
US10500493B2 (en) * 2017-09-28 2019-12-10 Netease (Hangzhou) Network Co., Ltd. Information processing method and apparatus, electronic device, and storage medium
US20190099669A1 (en) 2017-09-30 2019-04-04 Netease (Hangzhou) Network Co.,Ltd Information Processing Method and Apparatus, Electronic Device, and Storage Medium
JP2019067390A (ja) 2017-09-30 2019-04-25 ネットイース(ハンチョウ)ネットワーク カンパニーリミテッド 情報処理方法、装置、電子機器及び記憶媒体
CN108404408A (zh) 2018-02-01 2018-08-17 网易(杭州)网络有限公司 信息处理方法、装置、存储介质及电子设备
US20210086071A1 (en) * 2018-06-06 2021-03-25 Konami Digital Entertainment Co., Ltd. Recording medium and information processing apparatus
CN110096214A (zh) 2019-06-05 2019-08-06 腾讯科技(深圳)有限公司 虚拟对象的移动控制方法、装置、终端和存储介质

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Japan Patent Office (JPO) The Office Action For JP Application No. 2021-541498 dated Aug. 8, 2022 6 Pages (Translation Included).
The State Intellectual Property Office of the People's Republic of China (SIPO) Office Action 1 for for 201910487940.0 dated Dec. 23, 2020 9 Pages (including translation).
The World Intellectual Property Organization (WIPO) International Search Report for PCT/CN2020/092342 dated Aug. 26, 2020 6 Pages (including translation).
Wolf Funny Commentary, "Stimulation battlefield: you play with two fingers, see how the experts play," Tencent Video, Feb. 21, 2017 (Feb. 21, 2019), Retrieved from the Internet: URL: https://v.qq.com/x/page/o0840z19pnv.html. 1 page.

Also Published As

Publication number Publication date
KR102539606B1 (ko) 2023-06-01
CN110096214A (zh) 2019-08-06
CN110096214B (zh) 2021-08-06
JP7238143B2 (ja) 2023-03-13
SG11202108601XA (en) 2021-09-29
WO2020244421A1 (zh) 2020-12-10
KR20210103553A (ko) 2021-08-23
US20210326027A1 (en) 2021-10-21
JP2022518465A (ja) 2022-03-15

Similar Documents

Publication Publication Date Title
US11513657B2 (en) Method and apparatus for controlling movement of virtual object, terminal, and storage medium
US11376501B2 (en) Method and apparatus for displaying marker element in virtual scene, computer device, and computer-readable storage medium
US11992760B2 (en) Virtual object control method and apparatus, terminal, and storage medium
KR102592632B1 (ko) 가상 환경에서 마크 정보를 생성하는 방법 및 장치, 전자 장치 및 저장 매체
KR20210140747A (ko) 가상 객체 제어 방법 및 장치, 디바이스 및 매체
EP3970819B1 (en) Interface display method and apparatus, and terminal and storage medium
US20120154311A1 (en) Information storage medium, terminal, and input determination method
US20230059116A1 (en) Mark processing method and apparatus, computer device, storage medium, and program product
US20220051470A1 (en) Method and apparatus for displaying virtual environment picture, device, and storage medium
JP2015123244A (ja) プログラム及びゲーム装置
JP7186901B2 (ja) ホットスポットマップの表示方法、装置、コンピュータ機器および読み取り可能な記憶媒体
JP2022526512A (ja) インタラクティブオブジェクト駆動方法、装置、機器、及び記憶媒体
JP2021535824A (ja) 視角回転の方法、装置及びコンピュータプログラム
US12064689B2 (en) Method for selecting virtual objects, apparatus, terminal and storage medium
JP2023139033A (ja) 視点回転の方法、装置、端末およびコンピュータプログラム
CN108553895A (zh) 用户界面元素与三维空间模型关联的方法和装置
JP2024509064A (ja) 位置マークの表示方法及び装置、機器並びにコンピュータプログラム
JP7384521B2 (ja) 仮想オブジェクトの制御方法、装置、コンピュータ機器及びコンピュータプログラム
US11100723B2 (en) System, method, and terminal device for controlling virtual image by selecting user interface element
KR20240067252A (ko) 인터페이스 디스플레이 방법 및 장치, 단말, 저장 매체, 및 컴퓨터 프로그램 제품
KR102701092B1 (ko) 가상 객체 제어 방법과 장치, 단말기, 및 저장 매체
US11475609B2 (en) Computer program, server device, terminal device, and method for moving gift in virtual space
CN114146411A (zh) 游戏对象控制方法、装置、电子设备和存储介质
CN111260792A (zh) 虚拟内容显示方法、装置、终端设备及存储介质

Legal Events

Date Code Title Description
AS Assignment

Owner name: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED, CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, JUNXIANG;HU, YIZHONG;REEL/FRAME:056678/0861

Effective date: 20210602

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE