WO2020244421A1 - Method, apparatus, terminal, and storage medium for controlling movement of a virtual object - Google Patents

Method, apparatus, terminal, and storage medium for controlling movement of a virtual object

Info

Publication number
WO2020244421A1
WO2020244421A1 PCT/CN2020/092342 CN2020092342W
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
virtual joystick
movable area
joystick
touch point
Prior art date
Application number
PCT/CN2020/092342
Other languages
English (en)
French (fr)
Inventor
王俊翔
胡艺钟
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to JP2021541498A (JP7238143B2)
Priority to SG11202108601XA
Priority to KR1020217023113A (KR102539606B1)
Publication of WO2020244421A1
Priority to US17/359,497 (US11513657B2)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens

Definitions

  • the embodiments of the present application relate to the field of computer technology, and in particular, to a method, device, terminal, and storage medium for controlling the movement of virtual objects.
  • in an application based on a virtual environment, the user controls the movement of a virtual object in the virtual environment through touch operations on the screen, for example, a drag operation on a virtual joystick in the interface.
  • the embodiments of the present application provide a method, device, terminal, and storage medium for controlling the movement of a virtual object.
  • the technical solution is as follows:
  • an embodiment of the present application provides a method for controlling the movement of a virtual object, which is applied to a terminal, and the method includes:
  • when the virtual joystick is in the activated state, the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, wherein, when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area;
  • an embodiment of the present application provides a device for controlling movement of a virtual object, and the device includes:
  • the display module is used to display the target perspective screen of the target application program, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target perspective screen;
  • the activation module is configured to activate the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired;
  • the adjustment module is used to adjust the position of the virtual joystick in the movable area according to the position change of the touch point when the virtual joystick is in the activated state, wherein, when the touch point moves within the effective touch range, the position of the virtual joystick and the position of the touch point change synchronously in real time, and the effective touch range includes and is larger than the movable area;
  • the control module is used to control the virtual object to move according to the position of the virtual joystick.
  • an embodiment of the present application provides a terminal.
  • the terminal includes a processor and a memory.
  • the memory stores at least one instruction, at least one program, code set, or instruction set.
  • the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for controlling the movement of the virtual object as described in the foregoing aspect.
  • an embodiment of the present application provides a computer-readable storage medium that stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method for controlling the movement of the virtual object as described in the above aspect.
  • the embodiments of the present application provide a computer program product, which is used to implement the above-mentioned method for controlling movement of virtual objects when the computer program product is executed by a processor.
  • Fig. 1 is a flowchart of a method for controlling movement of a virtual object provided by an embodiment of the present application
  • Fig. 2 exemplarily shows a schematic diagram of a method for controlling movement of a virtual object
  • FIG. 3 is a flowchart of a method for controlling movement of a virtual object provided by another embodiment of the present application.
  • Fig. 4 exemplarily shows a schematic diagram of calculating the actual position of the virtual joystick at the second moment
  • Fig. 5 exemplarily shows a flow chart of a method for controlling movement of a virtual object
  • Fig. 6 exemplarily shows a schematic diagram of the touch point moving in the reverse direction
  • FIG. 7 exemplarily shows a schematic diagram of the vertical movement of the touch point
  • Fig. 8 is a block diagram of a device for controlling movement of a virtual object provided by an embodiment of the present application.
  • FIG. 9 is a block diagram of a device for controlling movement of a virtual object according to another embodiment of the present application.
  • FIG. 10 is a structural block diagram of a terminal provided by an embodiment of the present application.
  • Virtual environment: the virtual environment displayed (or provided) when the application is running on the terminal.
  • the virtual environment may be a simulation environment of the real world, a semi-simulation and semi-fictional three-dimensional environment, or a purely fictitious three-dimensional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the following embodiments take the virtual environment as a three-dimensional virtual environment as an example, but are not limited thereto.
  • Virtual object refers to the movable object in the virtual environment.
  • the movable object may be at least one of a virtual character, a virtual animal, and an animation character.
  • optionally, when the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional model created based on skeletal animation technology.
  • Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
  • Virtual joystick: refers to a control used to control the movement of a virtual object in the virtual environment.
  • the user can control the movement of the virtual joystick through the touch operation on the terminal screen, and further control the movement of the virtual object.
  • in the embodiments of the present application, the virtual joystick may be a circle; in some other embodiments, the virtual joystick may also be a triangle, a square, a hexagon, an octagon, or another irregular shape, which is not limited in the embodiments of the present application.
  • the aforementioned virtual joystick moves within the movable area.
  • the shape of the movable area may be the same as or different from the shape of the virtual joystick.
  • the virtual joystick is a circle, and the movable area is also a circle, and the two are concentric circles; for another example, the virtual joystick is a hexagon, and the movable area is an octagon, and the centers of the two coincide.
  • the method for controlling the movement of virtual objects can be applied to a terminal, which can be a laptop computer, a mobile phone, a tablet computer, an e-book reader, an electronic game machine, a Moving Picture Experts Group Audio Layer IV (MP4) player, and the like.
  • optionally, the application program is one that supports a three-dimensional virtual environment.
  • the application can be a virtual reality application, a three-dimensional map program, a military simulation program, or a game program, such as any one of a TPS game, a First-Person Shooting (FPS) game, a Multiplayer Online Battle Arena (MOBA) game, and a multiplayer gun-battle survival game.
  • the application program may be a stand-alone version application program, such as a stand-alone version of a 3D game program; it may also be a network online version application program.
  • in a conventional solution, the touch point of the user's finger on the screen can only move within the movable area of the virtual joystick, which is quite limiting.
  • the related technology provides a method, that is, when the touch point is outside the movable area of the virtual joystick, the virtual joystick remains activated. In this case, the line between the position of the touch point and the center of the movable area is acquired, and the coordinates of the intersection point between the line and the edge of the movable area are determined as the position of the virtual joystick.
  • the position of the virtual joystick in the movable area is adjusted according to the position change of the touch point, and when the touch point moves within the effective touch range, the virtual joystick The position of the joystick changes in real time with the position of the touch point.
  • compared with the related art, when the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
  • FIG. 1 shows a flowchart of a method for controlling movement of a virtual object provided by an embodiment of the present application.
  • in this embodiment, for illustration, the method is described as being applied to the terminal introduced above.
  • the method can include the following steps (101-104):
  • Step 101 Display a target perspective screen of the target application, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target perspective screen.
  • the above-mentioned target application may be an application based on a virtual environment.
  • the application program may be a virtual reality application program, a three-dimensional map application program, a military simulation program, a game application program, and the game application program may be at least one of a TPS game, an FPS game, and a MOBA game.
  • the terminal runs the target application to display the target perspective screen.
  • the above-mentioned target perspective picture is a picture obtained by observing the virtual environment from the target perspective of the virtual object.
  • the above-mentioned target angle of view is the angle of view for observing the virtual environment from the first-person perspective of the virtual object, or the above-mentioned target angle of view is the angle of view for observing the virtual environment from a third-person perspective of the virtual object.
  • the virtual joystick and the movable area of the virtual joystick can be superimposed and displayed on the above-mentioned target view screen.
  • the terminal may also display at least one virtual object included in the virtual environment.
  • the aforementioned virtual joystick is used to control the movement of virtual objects in a virtual environment.
  • for other details about the virtual joystick, please refer to the brief introduction of terms above; details are not repeated here.
  • the movable area of the virtual joystick is used to limit the moving range of the virtual joystick, so that the virtual joystick can only move within the movable area.
  • the virtual joystick is located in the center of the movable area.
  • the shape of the above movable area may be the same as or different from the shape of the virtual joystick.
  • the virtual joystick is a circle, and the movable area is also a circle, and the two are concentric circles; for another example, the virtual joystick is a hexagon, and the movable area is an octagon, and the centers of the two coincide.
  • Step 102 When a trigger operation corresponding to the virtual joystick is acquired, the virtual joystick is activated.
  • the terminal can detect whether a trigger operation for starting the virtual joystick is received by the user on the screen, and when the terminal obtains the trigger operation corresponding to the virtual joystick, the virtual joystick is activated.
  • the trigger operation may be a pressing operation, a single click operation, a sliding operation, etc., which is not limited in the embodiment of the present application.
  • the terminal may detect the position of the trigger operation through the touch screen, that is, the position of the touch point of the user's finger on the screen, and determine whether the touch point is within the trigger range of the virtual joystick.
  • when the touch point is within the trigger range, the virtual joystick is activated according to the trigger operation.
  • when the touch point is not within the trigger range, the terminal does not respond to the trigger operation.
  • the trigger range of the aforementioned virtual joystick may be the movable area of the virtual joystick, or may be an area larger than the movable area, which is not limited in the embodiment of the present application.
  • Step 103 When the virtual joystick is in the activated state, adjust the position of the virtual joystick in the movable area according to the position change of the touch point, wherein when the touch point moves within the effective touch range, the virtual joystick The position of the lever and the position of the touch point change synchronously in real time, and the effective touch range includes and is larger than the movable area.
  • if the terminal detects a continuous touch operation acting on the screen, it confirms that the virtual joystick is in the activated state.
  • the user swipes on the terminal screen to change the position of the touch point.
  • the terminal can determine the change in the position of the touch point between two moments, and can further adjust the position of the virtual joystick in the movable area according to the change in the position of the touch point.
  • the position of the virtual joystick and the position of the touch point change synchronously in real time
  • the effective touch range includes and is larger than the movable area.
  • the effective touch range may be the entire screen of the terminal, or part of the screen area of the terminal, which is not limited in the embodiment of the present application.
  • when the touch point moves within the movable area, the position of the virtual joystick is the position of the touch point.
  • when the touch point moves within the effective touch range but outside the movable area, the terminal obtains the position of the virtual joystick through a calculation based on the position of the touch point.
  • Step 104 Control the virtual object to move according to the position of the virtual joystick.
  • after determining the position of the virtual joystick, the terminal can control the movement of the virtual object, such as its direction and distance, according to the position of the virtual joystick.
  • different marks may be generated according to the position of the virtual joystick, such as alert marks, supply marks, attack marks, and defense marks, where the alert mark is used to prompt the virtual object to be on guard, or to prompt the virtual object to guard the marked position; the supply mark is used to prompt the virtual object that there are supplies at the marked position; the attack mark is used to prompt the virtual object to start attacking, or to prompt the virtual object to attack the marked position; and the defense mark is used to prompt the virtual object to defend.
  • FIG. 2 it exemplarily shows a schematic diagram of a method for controlling movement of a virtual object.
  • take as an example a case in which the virtual joystick and the movable area of the virtual joystick are both circular and the touch point moves in the reverse direction.
  • the virtual joystick 10 is located in the center of the movable area 20.
  • when the terminal obtains the trigger operation corresponding to the virtual joystick 10, the virtual joystick 10 is activated; when the touch point 30 moves within the movable area 20, the position of the touch point 30 is the same as the position of the virtual joystick 10; when the touch point 30 moves outside the movable area 20, the virtual joystick 10 moves on the edge of the movable area 20; when the touch point 30 starts to move in the reverse direction, the virtual joystick 10, located on the edge of the movable area 20, also starts to move in the reverse direction, the two being synchronized in real time.
  • the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, and when the touch point is When moving within the effective touch range, the position of the virtual joystick changes in real time with the position of the touch point.
  • compared with the related art, when the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
  • FIG. 3 shows a flowchart of a method for controlling movement of a virtual object according to another embodiment of the present application.
  • the method is mainly applied to the terminal introduced above for illustration.
  • the method can include the following steps:
  • Step 301 Display the target view screen of the target application, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target view screen.
  • the virtual joystick is located in the center of the movable area.
  • This step is the same as or similar to step 101 in the embodiment of FIG. 1, and will not be repeated here.
  • Step 302 When a trigger operation corresponding to the virtual joystick is acquired, the virtual joystick is activated.
  • This step is the same as or similar to step 102 in the embodiment of FIG. 1, and will not be repeated here.
  • the terminal can judge in real time whether the virtual joystick is still activated; if so, the terminal keeps the virtual joystick in the activated state; if not, the terminal turns off the virtual joystick.
  • when the virtual joystick is in the activated state, step 303 is executed.
  • Step 303 Obtain the position change amount of the touch point from the first moment to the second moment.
  • the terminal can obtain the position of the touch point in real time, thereby obtaining the position of the touch point at the first time and the position of the second time, and further obtain the position change of the touch point from the first time to the second time.
  • the amount of position change is a vector including direction and distance.
  • for example, the position of the touch point at the first moment may be expressed as (X1, Y1), and the position of the touch point at the second moment may be expressed as (X2, Y2); the position change of the touch point from the first moment to the second moment may then be expressed as (X0, Y0) = (X2 - X1, Y2 - Y1).
  • Step 304 Calculate the predicted position of the virtual joystick at the second moment according to the position change and the position of the virtual joystick at the first moment.
  • the terminal can also obtain the position of the virtual joystick at the first moment.
  • when the touch point moves from its position at the first moment to its position at the second moment, the position of the virtual joystick also changes; assuming the virtual joystick is not restricted by the movable area, it can realize the same position change as the touch point, and the resulting position is the predicted position at the second moment.
  • for example, the position of the virtual joystick at the first moment may be expressed as (x1, y1), so that the predicted position at the second moment is (A, B) = (x1 + X0, y1 + Y0); optionally, if the first moment is the initial state, the position of the virtual joystick at the first moment may be expressed as (0, 0).
  • the terminal may determine whether the predicted position of the virtual joystick at the second moment is within the movable area; when it is outside the movable area, the following step 305 is performed; when it is within the movable area, the following step 306 is performed.
  • the terminal may calculate the distance of the predicted position of the virtual joystick at the second moment relative to the center of the movable area (that is, the origin); if the distance is greater than the radius of the movable area, the predicted position is considered to be outside the movable area; if the distance is less than or equal to the radius, the predicted position is considered to be within the movable area.
  • taking a circular movable area of radius R as an example, the predicted position of the virtual joystick at the second moment can be expressed as (A, B), and the distance between the predicted position and the center of the movable area can be expressed as r = √(A² + B²); when r ≤ R, the predicted position is considered to be within the movable area, and when r > R, the predicted position is considered to be outside the movable area.
  • Step 305 If the predicted position is outside the movable area, calculate the actual position of the virtual joystick at the second moment according to the predicted position and the movable area.
  • step 305 may include the following two steps: (1) calculating the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area; and (2) determining the intersection coordinates as the actual position of the virtual joystick at the second moment.
  • Fig. 4 exemplarily shows a schematic diagram of calculating the actual position of the virtual joystick at the second moment, taking the movable area and the virtual joystick as circles with the radius of the movable area being R, where the position of the touch point at the first moment is M(X1, Y1), the position of the touch point at the second moment is N(X2, Y2), the position of the virtual joystick at the first moment is P(x1, y1), the predicted position of the virtual joystick at the second moment is Q1(A, B), and the actual position of the virtual joystick at the second moment is Q2(x2, y2), where x2 = R·sin(arctan(A/B)) and y2 = R·cos(arctan(A/B)).
  • Step 306 If the predicted position is within the movable area, the predicted position is determined as the actual position of the virtual joystick at the second moment.
  • when the predicted position is within the movable area, the terminal directly determines the predicted position as the actual position of the virtual joystick at the second moment.
  • Step 307 Calculate the direction and distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick.
  • after obtaining the actual position of the virtual joystick at the second moment, the terminal may calculate the direction and distance of the virtual joystick relative to the center of the movable area.
  • Step 308 Determine the moving direction and moving speed of the virtual object according to the direction and distance.
  • the terminal determines the moving direction and moving speed of the virtual object according to the direction and distance of the virtual joystick.
  • the direction of the virtual joystick may be mapped to the moving direction of the virtual object, and the distance of the virtual joystick may be mapped to the moving speed of the virtual object; a larger distance indicates a faster moving speed of the virtual object.
  • Step 309 Control the virtual object to move according to the moving direction and moving speed.
  • the virtual object is controlled to move in the virtual environment, such as walking, running, jumping, etc.
  • the moving distance of the touch point is smaller when the operation is completed, that is, the sliding distance of the user's finger is smaller, which further improves the operation efficiency of the user.
  • FIG. 5 it exemplarily shows a flowchart of a method for controlling movement of a virtual object.
  • Step 501 Acquire a trigger operation corresponding to the virtual joystick, and start the virtual joystick.
  • Step 502 Determine whether the virtual joystick is still in the activated state.
  • if the virtual joystick is not in the activated state, execute the following step 503; if the virtual joystick is still in the activated state, execute the following step 504.
  • Step 503 Turn off the virtual joystick.
  • Step 504 Acquire the position coordinates (X, Y) of the touch point in real time.
  • Step 505 Obtain the position change (X0, Y0) of the touch point from the first moment to the second moment.
  • Step 506 According to the position change (X0, Y0) and the coordinates (x1, y1) of the virtual joystick at the first moment, calculate the horizontal and vertical coordinates (A, B) of the predicted position of the virtual joystick at the second moment.
  • Step 507 Calculate the distance r between the horizontal and vertical coordinates (A, B) of the predicted position of the virtual joystick at the second moment and the center (0, 0) of the movable area.
  • Step 508 Determine whether the above-mentioned distance r is greater than the radius R of the movable area; when r ≤ R, the following step 509 is executed, and when r > R, the following step 510 is executed.
  • Step 509 Determine the predicted position (A, B) as the actual position of the virtual joystick at the second moment.
  • Step 510 Determine the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area, as the actual position of the virtual joystick at the second moment; in the intersection coordinates (x2, y2), x2 = R·sin(arctan(A/B)) and y2 = R·cos(arctan(A/B)).
  • Step 511 Calculate the direction and distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick.
  • Step 512 Determine the moving direction and moving speed of the virtual object according to the direction and distance.
  • Step 513 Control the virtual object to move according to the moving direction and moving speed.
  • FIG. 6 it exemplarily shows a schematic diagram of the touch point moving in the reverse direction.
  • part (a), part (b) and part (c) in Figure 6 are technical solutions provided by related technologies; part (d) and (e) in Figure 6 are technical solutions provided by this application.
  • as shown in part (a), part (b), and part (c) of FIG. 6, when the touch point 30 outside the movable area 20 of the virtual joystick 10 moves in the reverse direction, the position of the virtual joystick 10 remains unchanged while the touch point 30 moves toward the edge of the movable area 20; the virtual joystick 10 starts to move in the reverse direction only when the touch point 30 reaches the edge of the movable area 20.
  • as shown in part (d) and part (e) of FIG. 6, when the touch point 30 located outside the movable area 20 of the virtual joystick 10 moves in the reverse direction, the virtual joystick 10, located at the edge of the movable area 20, moves in the reverse direction at the same time.
  • the technical solution provided by the embodiments of the present application can quickly change the direction of the virtual joystick, without the need to reversely compensate the distance, and further quickly manipulate the movement of the virtual object.
  • FIG. 7 it exemplarily shows a schematic diagram of the vertical movement of the touch point.
  • part (a) and part (b) in FIG. 7 are technical solutions provided by related technologies
  • part (c) and part (d) in FIG. 7 are technical solutions provided by this application; comparing part (c) and part (d), it can be seen that when the touch point 30 outside the movable area 20 of the virtual joystick 10 moves upward and the virtual joystick 10 moves the same distance, the distance that the touch point 30 needs to move in the solution provided by the related art is greater than the distance that the touch point 30 needs to move in the solution provided by this application.
  • the technical solution provided by the embodiments of the present application when the virtual joystick moves the same distance, the moving distance of the touch point is smaller, that is, the sliding distance of the user's finger is smaller, which improves the operation efficiency of the user.
  • with the technical solutions provided in the embodiments of the present application, when the virtual joystick moves the same distance, the user's finger slides a smaller distance; therefore, the user can move the virtual joystick to a position convenient for finger operation and then perform touch operations at that position, such as rotating, sliding, and pressing, to control the movement of the virtual joystick, which effectively alleviates the problem of inconsistent operating comfort caused by differences in users' finger lengths.
  • FIG. 8 shows a block diagram of an apparatus for controlling movement of a virtual object according to an embodiment of the present application.
  • the device has the function of realizing the above method example, and the function can be realized by hardware, or by hardware executing corresponding software.
  • the device can be the terminal described above, or it can be set on the terminal.
  • the device 800 may include: a display module 810, an activation module 820, an adjustment module 830, and a control module 840.
  • the display module 810 is configured to display a target perspective screen of the target application program, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target perspective screen.
  • the activation module 820 is configured to activate the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired.
  • the adjustment module 830 is configured to adjust the position of the virtual joystick in the movable area according to the amount of change in the position of the touch point when the virtual joystick is in the activated state, wherein when the touch point When moving within the effective touch range, the position of the virtual joystick and the position of the touch point synchronously change in real time, and the effective touch range includes and is larger than the movable area.
  • the control module 840 is configured to control the virtual object to move according to the position of the virtual joystick.
  • the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, and when the touch point is When moving within the effective touch range, the position of the virtual joystick changes in real time with the position of the touch point.
  • compared with the related art, when the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
  • the adjustment module 830 includes: a variation acquisition unit 831, a predicted position calculation unit 832, and an actual position calculation unit 833.
  • Change acquisition unit 831 for acquiring the position change of the touch point from the first moment to the second moment.
  • the predicted position calculation unit 832 is configured to calculate the predicted position of the virtual joystick at the second moment according to the position change and the position of the virtual joystick at the first moment.
  • the actual position calculation unit 833 is configured to calculate the actual position of the virtual joystick at the second moment according to the estimated position and the movable area when the estimated position is outside the movable area .
  • the movable area is circular
  • the actual position calculation unit 833 is configured to calculate the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area, and to determine the intersection coordinates as the actual position of the virtual joystick at the second moment.
  • the device 800 further includes: a determining module 850.
  • the determining module 850 is configured to determine the estimated position as the actual position of the virtual joystick at the second moment when the estimated position is within the movable area.
  • the control module 840 is configured to calculate, according to the position of the virtual joystick, the direction and distance of the virtual joystick relative to the center of the movable area; determine the moving direction and moving speed of the virtual object according to the direction and distance; and control the virtual object to move according to the moving direction and moving speed.
  • when the device provided in the above embodiments implements its functions, the division into the above functional modules is merely used as an example for illustration; in practical applications, the above functions can be allocated to different functional modules as needed, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
  • the apparatus and method embodiments provided by the above-mentioned embodiments belong to the same concept, and the specific implementation process is detailed in the method embodiments, which will not be repeated here.
  • FIG. 10 shows a structural block diagram of a terminal provided by an embodiment of the present application.
  • the terminal 1000 includes a processor 1001 and a memory 1002.
  • the processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1001 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), PLA (Programmable Logic Array, Programmable Logic Array) .
  • the processor 1001 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1001 may be integrated with a GPU (Graphics Processing Unit), and the GPU is used to render and draw the content that needs to be displayed on the display screen.
  • the processor 1001 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1002 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1002 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1002 is used to store at least one instruction, and the at least one instruction is executed by the processor 1001 to implement the method for controlling the movement of a virtual object provided in the method embodiments of the present application.
  • the terminal 1000 may optionally further include: a peripheral device interface 1003 and at least one peripheral device.
  • the processor 1001, the memory 1002, and the peripheral device interface 1003 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1003 through a bus, a signal line or a circuit board.
  • the peripheral device may include: at least one of a communication interface 1004, a display screen 1005, an audio circuit 1006, a camera component 1007, a positioning component 1008, and a power supply 1009.
  • the structure shown in FIG. 10 does not constitute a limitation on the terminal 1000, and the terminal may include more or fewer components than shown in the figure, combine some components, or adopt a different component arrangement.
  • a terminal is also provided.
  • the terminal includes a processor and a memory, and the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the above-mentioned movement control method of the virtual object.
  • a computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set implements the above-mentioned method for controlling the movement of the virtual object when executed by a processor.
  • a computer program product is also provided, which is used to implement the above-mentioned method for controlling the movement of virtual objects when the computer program product is executed by a processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

This application provides a method, apparatus, terminal, and storage medium for controlling movement of a virtual object. The method includes: displaying a target-perspective picture of a target application, and superimposing and displaying a virtual joystick and a movable area of the virtual joystick on the target-perspective picture; activating the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired; adjusting, when the virtual joystick is in an activated state, the position of the virtual joystick in the movable area according to the amount of change in the position of a touch point, wherein, when the touch point moves within an effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area; and controlling the virtual object to move according to the position of the virtual joystick. Compared with the related art, when the touch point is outside the movable area of the virtual joystick and the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.

Description

Method, apparatus, terminal, and storage medium for controlling movement of a virtual object
This application claims priority to Chinese Patent Application No. 201910487940.0, filed on June 5, 2019 and entitled "Method, apparatus, terminal, and storage medium for controlling movement of a virtual object", the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The embodiments of this application relate to the field of computer technology, and in particular, to a method, apparatus, terminal, and storage medium for controlling the movement of a virtual object.
BACKGROUND
In applications based on a virtual environment, a user controls a virtual object to move in the virtual environment through touch operations on the screen, for example, a drag operation on a virtual joystick in the interface.
SUMMARY
The embodiments of this application provide a method, apparatus, terminal, and storage medium for controlling the movement of a virtual object. The technical solution is as follows:
In one aspect, an embodiment of this application provides a method for controlling movement of a virtual object, applied to a terminal, the method including:
displaying a target-perspective picture of a target application, and superimposing and displaying a virtual joystick and a movable area of the virtual joystick on the target-perspective picture;
activating the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired;
adjusting, when the virtual joystick is in an activated state, the position of the virtual joystick in the movable area according to the amount of change in the position of a touch point, wherein, when the touch point moves within an effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area; and
controlling the virtual object to move according to the position of the virtual joystick.
In another aspect, an embodiment of this application provides an apparatus for controlling movement of a virtual object, the apparatus including:
a display module, configured to display a target-perspective picture of a target application, and superimpose and display a virtual joystick and a movable area of the virtual joystick on the target-perspective picture;
an activation module, configured to activate the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired;
an adjustment module, configured to adjust, when the virtual joystick is in an activated state, the position of the virtual joystick in the movable area according to the amount of change in the position of a touch point, wherein, when the touch point moves within an effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area; and
a control module, configured to control the virtual object to move according to the position of the virtual joystick.
In still another aspect, an embodiment of this application provides a terminal, the terminal including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for controlling movement of a virtual object described in the above aspect.
In still another aspect, an embodiment of this application provides a computer-readable storage medium, the computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the method for controlling movement of a virtual object described in the above aspect.
In yet another aspect, an embodiment of this application provides a computer program product which, when executed by a processor, is used to implement the above method for controlling movement of a virtual object.
The technical solutions provided by the embodiments of this application may include the following beneficial effects:
After the virtual joystick is activated, the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, and when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time. Compared with the related art, when the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a flowchart of a method for controlling movement of a virtual object provided by an embodiment of this application;
Fig. 2 exemplarily shows a schematic diagram of a method for controlling movement of a virtual object;
Fig. 3 is a flowchart of a method for controlling movement of a virtual object provided by another embodiment of this application;
Fig. 4 exemplarily shows a schematic diagram of calculating the actual position of the virtual joystick at the second moment;
Fig. 5 exemplarily shows a flowchart of a method for controlling movement of a virtual object;
Fig. 6 exemplarily shows a schematic diagram of the touch point moving in the reverse direction;
Fig. 7 exemplarily shows a schematic diagram of the touch point moving vertically;
Fig. 8 is a block diagram of an apparatus for controlling movement of a virtual object provided by an embodiment of this application;
Fig. 9 is a block diagram of an apparatus for controlling movement of a virtual object provided by another embodiment of this application;
Fig. 10 is a structural block diagram of a terminal provided by an embodiment of this application.
DETAILED DESCRIPTION
To make the objectives, technical solutions, and advantages of this application clearer, the following further describes the implementations of this application in detail with reference to the accompanying drawings.
First, several terms involved in the embodiments of this application are briefly introduced.
Virtual environment: the virtual environment displayed (or provided) when an application is running on the terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment. The following embodiments take a three-dimensional virtual environment as an example, but are not limited thereto.
Virtual object: a movable object in the virtual environment. The movable object may be at least one of a virtual character, a virtual animal, and an animation character. Optionally, when the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in the three-dimensional virtual environment.
Virtual joystick: a control used to control a virtual object to move in the virtual environment. Through touch operations on the terminal screen, the user can control the movement of the virtual joystick and thereby control the virtual object to move. Optionally, in the embodiments of this application, the virtual joystick may be circular; in some other embodiments, the virtual joystick may also be a triangle, a square, a hexagon, an octagon, or another irregular shape, which is not limited in the embodiments of this application. The virtual joystick moves within a movable area. Optionally, the shape of the movable area may be the same as or different from the shape of the virtual joystick. For example, the virtual joystick is a circle and the movable area is also a circle, the two being concentric circles; for another example, the virtual joystick is a hexagon and the movable area is an octagon, the centers of the two coinciding.
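To make the geometry above concrete, the following is a minimal sketch, in Python, of the state such a circular virtual joystick might carry. It is an illustration only, not part of the patent; all names (JoystickState and its fields) are assumptions introduced here.

    from dataclasses import dataclass

    @dataclass
    class JoystickState:
        """Illustrative state for a circular virtual joystick and its movable area."""
        center_x: float       # screen coordinates of the movable area's center
        center_y: float
        radius: float         # R: radius of the movable area
        knob_x: float = 0.0   # joystick position relative to the center (origin)
        knob_y: float = 0.0   # initial state: the joystick sits at the center
        active: bool = False  # whether a trigger operation has activated it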
The method for controlling movement of a virtual object provided by this application can be applied to a terminal, and the terminal may be a laptop computer, a mobile phone, a tablet computer, an e-book reader, an electronic game machine, a Moving Picture Experts Group Audio Layer IV (MP4) player, or the like.
An application based on a virtual environment is installed on the terminal, and the virtual environment includes virtual objects. Optionally, the application is one that supports a three-dimensional virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, or a game program such as a TPS game, a First-Person Shooting (FPS) game, a Multiplayer Online Battle Arena (MOBA) game, or a multiplayer gun-battle survival game. Optionally, the application may be a stand-alone application, such as a stand-alone 3D game program, or a network online application.
In a conventional solution, the touch point of the user's finger on the screen can only move within the movable area of the virtual joystick, which is quite limiting. To solve this problem, the related art provides a method in which the virtual joystick remains activated when the touch point is outside the movable area of the virtual joystick. In this case, the line between the position of the touch point and the center of the movable area is acquired, and the coordinates of the intersection between this line and the edge of the movable area are determined as the position of the virtual joystick.
In the above related art, when a touch point outside the movable area of the virtual joystick moves in the reverse direction, the position of the virtual joystick remains unchanged while the touch point moves toward the edge of the movable area; the virtual joystick starts to move in the reverse direction only when the touch point reaches the edge of the movable area. This results in low operational efficiency for the user.
In the technical solution provided by this application, after the virtual joystick is activated, the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, and when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time. Compared with the related art, when the touch point is outside the movable area of the virtual joystick and the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
The technical solution of this application is described below through several embodiments.
Referring to Fig. 1, it shows a flowchart of a method for controlling movement of a virtual object provided by an embodiment of this application. In this embodiment, the method is described as being applied to the terminal introduced above. The method may include the following steps (101-104):
Step 101: Display a target-perspective picture of the target application, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target-perspective picture.
The target application may be an application based on a virtual environment. Optionally, the application may be a virtual reality application, a three-dimensional map application, a military simulation program, or a game application, and the game application may be at least one of a TPS game, an FPS game, a MOBA game, and the like.
The terminal runs the target application to display the target-perspective picture. The target-perspective picture is a picture obtained by observing the virtual environment from the target perspective of the virtual object. Optionally, the target perspective is a perspective for observing the virtual environment from the first-person perspective of the virtual object, or the target perspective is a perspective for observing the virtual environment from a third-person perspective of the virtual object.
The virtual joystick and the movable area of the virtual joystick may be superimposed and displayed on the target-perspective picture. In addition, the terminal may also display at least one virtual object included in the virtual environment.
The virtual joystick is used to control the virtual object to move in the virtual environment. For other details about the virtual joystick, refer to the brief introduction of terms above; details are not repeated here.
The movable area of the virtual joystick is used to limit the moving range of the virtual joystick, so that the virtual joystick can only move within the movable area. Optionally, in the initial state, the virtual joystick is located at the center of the movable area.
Optionally, the shape of the movable area may be the same as or different from the shape of the virtual joystick. For example, the virtual joystick is a circle and the movable area is also a circle, the two being concentric circles; for another example, the virtual joystick is a hexagon and the movable area is an octagon, the centers of the two coinciding.
Step 102: When a trigger operation corresponding to the virtual joystick is acquired, activate the virtual joystick.
The terminal can detect whether a trigger operation for activating the virtual joystick is received on the screen, and when the terminal acquires the trigger operation corresponding to the virtual joystick, the virtual joystick is activated.
Optionally, the trigger operation may be a press operation, a tap operation, a slide operation, or the like, which is not limited in the embodiments of this application.
Optionally, the terminal may detect the position of the trigger operation through the touchscreen, that is, the position of the touch point of the user's finger on the screen, and determine whether the touch point is within the trigger range of the virtual joystick. When it is within the trigger range, the virtual joystick is activated according to the trigger operation; when it is not within the trigger range, the terminal does not respond to the trigger operation. The trigger range of the virtual joystick may be the movable area of the virtual joystick, or may be an area larger than the movable area, which is not limited in the embodiments of this application. A minimal hit-test sketch is given below.
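As a minimal sketch of the hit test just described, reusing the JoystickState sketch above and assuming a circular trigger range whose radius may equal or exceed the movable-area radius R (the function name and the trigger_radius parameter are illustrative assumptions, not from the patent):

    import math

    def try_activate(state: JoystickState, touch_x: float, touch_y: float,
                     trigger_radius: float) -> bool:
        """Hit-test a trigger operation against the joystick's trigger range."""
        # Distance from the touch point to the center of the movable area.
        dist = math.hypot(touch_x - state.center_x, touch_y - state.center_y)
        if dist <= trigger_radius:   # within the trigger range: activate
            state.active = True
            return True
        return False                 # outside the range: do not respond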
In addition, when the terminal detects that the touch operation acting on the screen disappears, the virtual joystick is turned off.
Step 103: When the virtual joystick is in the activated state, adjust the position of the virtual joystick in the movable area according to the amount of change in the position of the touch point, wherein, when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area.
If the terminal detects a continuous touch operation acting on the screen, it confirms that the virtual joystick is in the activated state. In the activated state, the user slides on the terminal screen to change the position of the touch point; correspondingly, the terminal can determine the amount of change in the position of the touch point between two moments, and can further adjust the position of the virtual joystick in the movable area according to this amount of change.
In the embodiments of this application, when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area. Optionally, the effective touch range may be the entire screen of the terminal or part of the screen area of the terminal, which is not limited in the embodiments of this application. When the touch point moves within the effective touch range, the position of the virtual joystick changes at the same time as the position of the touch point changes. For example, when the position of the touch point reverses, the position of the virtual joystick also reverses, the two changing synchronously in real time.
It should be noted that when the touch point moves within the movable area, the position of the virtual joystick is the position of the touch point. When the touch point moves within the effective touch range but outside the movable area, the terminal obtains the position of the virtual joystick through a calculation based on the position of the touch point.
Step 104: Control the virtual object to move according to the position of the virtual joystick.
After determining the position of the virtual joystick, the terminal can control the movement of the virtual object, such as its direction and distance, according to the position of the virtual joystick.
In some other embodiments, different marks may also be generated according to the position of the virtual joystick, such as an alert mark, a supply mark, an attack mark, and a defense mark, where the alert mark is used to prompt the virtual object to be on guard, or to prompt the virtual object to guard the marked position; the supply mark is used to prompt the virtual object that there are supplies at the marked position; the attack mark is used to prompt the virtual object to start attacking, or to prompt the virtual object to attack the marked position; and the defense mark is used to prompt the virtual object to defend.
Referring to Fig. 2, it exemplarily shows a schematic diagram of a method for controlling movement of a virtual object. Take as an example a case in which the virtual joystick and its movable area are both circular and the touch point moves in the reverse direction. As shown in Fig. 2, in the initial state, the virtual joystick 10 is located at the center of the movable area 20. When the terminal acquires the trigger operation corresponding to the virtual joystick 10, the virtual joystick 10 is activated; when the touch point 30 moves within the movable area 20, the position of the touch point 30 is the same as the position of the virtual joystick 10; when the touch point 30 moves outside the movable area 20, the virtual joystick 10 moves on the edge of the movable area 20; when the touch point 30 starts to move in the reverse direction, the virtual joystick 10, located on the edge of the movable area 20, also starts to move in the reverse direction, the two being synchronized in real time.
In summary, in the technical solution provided by the embodiments of this application, after the virtual joystick is activated, the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, and when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time. Compared with the related art, when the touch point is outside the movable area of the virtual joystick and the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
Referring to Fig. 3, it shows a flowchart of a method for controlling movement of a virtual object provided by another embodiment of this application. In this embodiment, the method is described as being applied to the terminal introduced above. The method may include the following steps:
Step 301: Display the target-perspective picture of the target application, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target-perspective picture.
Optionally, in the initial state, the virtual joystick is located at the center of the movable area.
This step is the same as or similar to step 101 in the embodiment of Fig. 1 and is not repeated here.
Step 302: When a trigger operation corresponding to the virtual joystick is acquired, activate the virtual joystick.
This step is the same as or similar to step 102 in the embodiment of Fig. 1 and is not repeated here.
Optionally, the terminal may determine in real time whether the virtual joystick is still activated; if so, the terminal keeps the virtual joystick in the activated state; if not, the terminal turns off the virtual joystick.
When the virtual joystick is in the activated state, the following step 303 is executed.
Step 303: Acquire the amount of change in the position of the touch point from the first moment to the second moment.
The terminal can acquire the position of the touch point in real time, thereby obtaining the position of the touch point at the first moment and its position at the second moment, and further obtaining the amount of change in the position of the touch point from the first moment to the second moment. The amount of position change is a vector including a direction and a distance. Optionally, the first moment and the second moment are separated by a target duration, which is not limited in the embodiments of this application.
For example, the position of the touch point at the first moment may be expressed as (X1, Y1), and the position of the touch point at the second moment may be expressed as (X2, Y2); the amount of change in the position of the touch point from the first moment to the second moment, that is, the change in the horizontal coordinate and the change in the vertical coordinate, may then be expressed as (X0, Y0) = (X2 - X1, Y2 - Y1).
Step 304: Calculate the predicted position of the virtual joystick at the second moment according to the amount of position change and the position of the virtual joystick at the first moment.
The terminal can also acquire the position of the virtual joystick at the first moment. When the touch point moves from its position at the first moment to its position at the second moment, the position of the virtual joystick also changes. Assuming the virtual joystick is not restricted by the movable area, it can realize the same position change as the touch point; the resulting position is the predicted position at the second moment.
For example, the position of the virtual joystick at the first moment may be expressed as (x1, y1), and the amount of change in the position of the touch point may be expressed as (X0, Y0); the predicted position of the virtual joystick at the second moment may then be expressed as (A, B) = (x1 + X0, y1 + Y0). Optionally, if the first moment is the initial state, the position of the virtual joystick at the first moment may be expressed as (0, 0).
Optionally, the terminal may determine whether the predicted position of the virtual joystick at the second moment is within the movable area; when it is outside the movable area, the following step 305 is performed; when it is within the movable area, the following step 306 is performed.
For example, the terminal may calculate the distance of the predicted position of the virtual joystick at the second moment relative to the center of the movable area (that is, the origin). If this distance is greater than the radius of the movable area, the predicted position is considered to be outside the movable area; if this distance is less than or equal to the radius of the movable area, the predicted position is considered to be within the movable area.
Taking a circular movable area of radius R as an example, the predicted position of the virtual joystick at the second moment may be expressed as (A, B); the distance between the predicted position and the center of the movable area may then be expressed as r = √(A² + B²).
When r ≤ R, the predicted position is considered to be within the movable area; when r > R, the predicted position is considered to be outside the movable area.
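A minimal sketch of this containment test, under the same assumptions as the sketches above (the function name is illustrative):

    import math

    def predict_and_classify(x1: float, y1: float, dx: float, dy: float,
                             R: float) -> tuple[float, float, bool]:
        """Compute the predicted position (A, B) = (x1 + X0, y1 + Y0) and report
        whether it lies inside the circular movable area of radius R."""
        A, B = x1 + dx, y1 + dy
        r = math.hypot(A, B)     # r = sqrt(A^2 + B^2), distance to the center O
        return A, B, r <= R      # True: step 306 applies; False: step 305 applies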
Step 305: If the predicted position is outside the movable area, calculate the actual position of the virtual joystick at the second moment according to the predicted position and the movable area.
In practice, since the virtual joystick can only move within the movable area, when the predicted position is outside the movable area, the predicted position needs to be mapped to its actual position.
Optionally, step 305 may include the following two steps:
(1) calculating the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area;
(2) determining the intersection coordinates as the actual position of the virtual joystick at the second moment.
For example, referring to Fig. 4, it exemplarily shows a schematic diagram of calculating the actual position of the virtual joystick at the second moment. Take the movable area and the virtual joystick as circles, with the radius of the movable area being R, where the position of the touch point at the first moment is M(X1, Y1), the position of the touch point at the second moment is N(X2, Y2), the position of the virtual joystick at the first moment is P(x1, y1), the predicted position of the virtual joystick at the second moment is Q1(A, B), and the actual position of the virtual joystick at the second moment is Q2(x2, y2). From M(X1, Y1) and N(X2, Y2), the amount of change in the position of the touch point is obtained: the X axis changes by X0 = X2 - X1 and the Y axis changes by Y0 = Y2 - Y1. The predicted position of the virtual joystick at the second moment is Q1(A, B), where A = x1 + X0 and B = y1 + Y0. Correspondingly, the intersection of the line connecting the predicted position Q1(A, B) and the center O(0, 0) of the movable area with the edge of the movable area has coordinates Q2(x2, y2), where x2 = R·sin β and y2 = R·cos β; by geometry, β = α = arctan(A/B), that is, x2 = R·sin(arctan(A/B)) and y2 = R·cos(arctan(A/B)). Further, the intersection coordinates Q2(x2, y2) are determined as the actual position of the virtual joystick at the second moment.
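The sketch below implements this clamping step under the same assumptions. Note that the arctan form in the text presupposes B > 0; scaling the predicted vector by R/r reaches the same intersection point in every quadrant and avoids division by zero, so the sketch uses that equivalent form and says so in a comment:

    import math

    def clamp_to_edge(A: float, B: float, R: float) -> tuple[float, float]:
        """Map a predicted position Q1(A, B) outside the circular movable area to
        Q2(x2, y2), the intersection of line O-Q1 with the edge of the area."""
        r = math.hypot(A, B)
        if r <= R:               # already inside the area: keep Q1 unchanged
            return A, B
        # Scaling by R / r lands exactly on the edge; for B > 0 this equals
        # x2 = R*sin(arctan(A/B)) and y2 = R*cos(arctan(A/B)) from the text.
        return A * R / r, B * R / r

For example, with R = 1 and Q1 = (3, 4), r = 5 and the sketch returns Q2 = (0.6, 0.8), which lies on the edge along the ray from the center through Q1.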
Step 306: If the predicted position is within the movable area, determine the predicted position as the actual position of the virtual joystick at the second moment.
When the predicted position is within the movable area, the terminal directly determines the predicted position as the actual position of the virtual joystick at the second moment.
Step 307: Calculate the direction and distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick.
After acquiring the actual position of the virtual joystick at the second moment, the terminal can calculate the direction and distance of the virtual joystick relative to the center of the movable area.
Step 308: Determine the moving direction and moving speed of the virtual object according to the direction and distance.
Further, the terminal determines the moving direction and moving speed of the virtual object according to the direction and distance of the virtual joystick. The direction of the virtual joystick may be mapped to the moving direction of the virtual object, and the distance of the virtual joystick may be mapped to the moving speed of the virtual object; a larger distance indicates a faster moving speed of the virtual object.
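A minimal sketch of one possible mapping, assuming a linear speed ramp up to a max_speed parameter (the ramp and its parameters are illustrative assumptions; the text only requires that a larger distance map to a faster speed):

    import math

    def to_move_command(x2: float, y2: float, R: float,
                        max_speed: float) -> tuple[float, float]:
        """Map the joystick offset (x2, y2) from the area center to a movement
        command: a heading angle in radians and a speed."""
        dist = math.hypot(x2, y2)
        heading = math.atan2(y2, x2)             # direction relative to the center
        speed = max_speed * min(dist / R, 1.0)   # larger offset, faster movement
        return heading, speed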
Step 309: Control the virtual object to move according to the moving direction and moving speed.
After the moving direction and moving speed are acquired, the virtual object is controlled to move in the virtual environment, for example by walking, running, or jumping.
In summary, in the technical solution provided by the embodiments of this application, whether the predicted position of the virtual joystick is within the movable area is determined, and when the predicted position is outside the movable area, the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area, are determined as the actual position of the virtual joystick at the second moment. With this technical solution, the moving distance of the touch point when completing the operation is smaller, that is, the sliding distance of the user's finger is smaller, which further improves the user's operational efficiency.
Referring to Fig. 5, it exemplarily shows a flowchart of a method for controlling movement of a virtual object.
Step 501: Acquire a trigger operation corresponding to the virtual joystick, and activate the virtual joystick.
Step 502: Determine whether the virtual joystick is still in the activated state.
If the virtual joystick is not in the activated state, execute the following step 503; if the virtual joystick is still in the activated state, execute the following step 504.
Step 503: Turn off the virtual joystick.
Step 504: Acquire the position coordinates (X, Y) of the touch point in real time.
Step 505: Acquire the amount of change (X0, Y0) in the position of the touch point from the first moment to the second moment.
Step 506: According to the amount of position change (X0, Y0) and the coordinates (x1, y1) of the virtual joystick at the first moment, calculate the coordinates (A, B) of the predicted position of the virtual joystick at the second moment.
Step 507: Calculate the distance r between the predicted position (A, B) of the virtual joystick at the second moment and the center (0, 0) of the movable area.
Step 508: Determine whether the distance r is greater than the radius R of the movable area.
When r ≤ R, execute the following step 509; when r > R, execute the following step 510.
Step 509: Determine the predicted position (A, B) as the actual position of the virtual joystick at the second moment.
Step 510: Determine the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area, as the actual position of the virtual joystick at the second moment.
In the intersection coordinates (x2, y2), x2 = R·sin(arctan(A/B)) and y2 = R·cos(arctan(A/B)).
Step 511: Calculate the direction and distance of the virtual joystick relative to the center of the movable area according to the position of the virtual joystick.
Step 512: Determine the moving direction and moving speed of the virtual object according to the direction and distance.
Step 513: Control the virtual object to move according to the moving direction and moving speed. The steps above are drawn together in the sketch that follows.
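Drawing the helper sketches above together (JoystickState, predict_and_classify, clamp_to_edge, and to_move_command), the following is a minimal per-frame sketch of the Fig. 5 flow; it is an illustration under the stated assumptions, not the patent's implementation:

    def on_touch_frame(state: JoystickState, X1: float, Y1: float,
                       X2: float, Y2: float, max_speed: float):
        """One iteration of the Fig. 5 flow: touch positions at two moments in,
        movement command (heading, speed) out."""
        if not state.active:             # step 502: not activated (step 503,
            return None                  # turning it off, is left to the caller)
        X0, Y0 = X2 - X1, Y2 - Y1        # step 505: position change of the touch point
        A, B, inside = predict_and_classify(state.knob_x, state.knob_y,
                                            X0, Y0, state.radius)   # steps 506-508
        # Steps 509-510: keep the predicted position if inside, else clamp it
        # to the intersection with the edge of the movable area.
        state.knob_x, state.knob_y = (A, B) if inside else clamp_to_edge(
            A, B, state.radius)
        # Steps 511-513: map the joystick offset to the object's movement.
        return to_move_command(state.knob_x, state.knob_y, state.radius, max_speed)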
Below, the technical solution provided by this application is compared with that provided by the related art to further illustrate its beneficial effects.
Referring to Fig. 6, it exemplarily shows a schematic diagram of the touch point moving in the reverse direction. Parts (a), (b), and (c) of Fig. 6 show the technical solution provided by the related art; parts (d) and (e) of Fig. 6 show the technical solution provided by this application. As shown in parts (a), (b), and (c) of Fig. 6, when the touch point 30 outside the movable area 20 of the virtual joystick 10 moves in the reverse direction, the position of the virtual joystick 10 remains unchanged while the touch point 30 moves toward the edge of the movable area 20; the virtual joystick 10 starts to move in the reverse direction only when the touch point 30 reaches the edge of the movable area 20. As shown in parts (d) and (e) of Fig. 6, when the touch point 30 outside the movable area 20 of the virtual joystick 10 moves in the reverse direction, the virtual joystick 10, located on the edge of the movable area 20, moves in the reverse direction at the same time.
Therefore, compared with the related art, the technical solution provided by the embodiments of this application can quickly change the direction of the virtual joystick without reversely compensating the distance, and can thus quickly manipulate the movement of the virtual object.
Referring to Fig. 7, it exemplarily shows a schematic diagram of the touch point moving vertically. Parts (a) and (b) of Fig. 7 show the technical solution provided by the related art; parts (c) and (d) of Fig. 7 show the technical solution provided by this application. Comparing parts (c) and (d), it can be seen that when the touch point 30 outside the movable area 20 of the virtual joystick 10 moves upward and the virtual joystick 10 moves the same distance, the distance the touch point 30 needs to move in the related-art solution is greater than the distance it needs to move in the solution provided by this application.
Therefore, compared with the related art, in the technical solution provided by the embodiments of this application, when the virtual joystick moves the same distance, the moving distance of the touch point is smaller, that is, the sliding distance of the user's finger is smaller, which improves the user's operational efficiency.
In addition, since the position of the virtual joystick changes synchronously with the position of the touch point in real time, for example during reverse movement and vertical movement, compared with the related art, the user's finger slides a smaller distance when the virtual joystick moves the same distance. Therefore, the user can move the virtual joystick to a position convenient for finger operation and then perform touch operations there, such as rotating, sliding, and pressing, to control the movement of the virtual joystick. This effectively alleviates the problem of inconsistent operating comfort caused by differences in users' finger lengths.
The following are apparatus embodiments of this application, which can be used to execute the method embodiments of this application. For details not disclosed in the apparatus embodiments, refer to the method embodiments of this application.
Referring to Fig. 8, it shows a block diagram of an apparatus for controlling movement of a virtual object provided by an embodiment of this application. The apparatus has the function of implementing the above method examples, and the function may be implemented by hardware or by hardware executing corresponding software. The apparatus may be the terminal introduced above, or may be provided on the terminal. The apparatus 800 may include: a display module 810, an activation module 820, an adjustment module 830, and a control module 840.
The display module 810 is configured to display the target-perspective picture of the target application, and superimpose and display the virtual joystick and the movable area of the virtual joystick on the target-perspective picture.
The activation module 820 is configured to activate the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired.
The adjustment module 830 is configured to adjust, when the virtual joystick is in the activated state, the position of the virtual joystick in the movable area according to the amount of change in the position of the touch point, wherein, when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area.
The control module 840 is configured to control the virtual object to move according to the position of the virtual joystick.
In summary, in the technical solution provided by the embodiments of this application, after the virtual joystick is activated, the position of the virtual joystick in the movable area is adjusted according to the amount of change in the position of the touch point, and when the touch point moves within the effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time. Compared with the related art, when the touch point is outside the movable area of the virtual joystick and the virtual joystick moves the same distance, the moving distance of the touch point in this application is smaller, which improves the user's operational efficiency.
In some possible designs, the adjustment module 830 includes: a change acquisition unit 831, a predicted position calculation unit 832, and an actual position calculation unit 833.
The change acquisition unit 831 is configured to acquire the amount of change in the position of the touch point from the first moment to the second moment.
The predicted position calculation unit 832 is configured to calculate the predicted position of the virtual joystick at the second moment according to the amount of position change and the position of the virtual joystick at the first moment.
The actual position calculation unit 833 is configured to calculate, when the predicted position is outside the movable area, the actual position of the virtual joystick at the second moment according to the predicted position and the movable area.
In some possible designs, the movable area is circular;
the actual position calculation unit 833 is configured to calculate the coordinates of the intersection between the line connecting the predicted position and the center of the movable area, and the edge of the movable area, and to determine the intersection coordinates as the actual position of the virtual joystick at the second moment.
In some possible designs, the apparatus 800 further includes: a determining module 850.
The determining module 850 is configured to determine, when the predicted position is within the movable area, the predicted position as the actual position of the virtual joystick at the second moment.
In some possible designs, the control module 840 is configured to: calculate, according to the position of the virtual joystick, the direction and distance of the virtual joystick relative to the center of the movable area; determine the moving direction and moving speed of the virtual object according to the direction and distance; and control the virtual object to move according to the moving direction and moving speed.
It should be noted that when the apparatus provided in the above embodiments implements its functions, the division into the above functional modules is merely used as an example for description; in practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus embodiments and the method embodiments provided in the above embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which is not repeated here.
Referring to Fig. 10, it shows a structural block diagram of a terminal provided by an embodiment of this application. Generally, the terminal 1000 includes a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1001 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor. The main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory and non-volatile memory, such as one or more disk storage devices and flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1002 is used to store at least one instruction, and the at least one instruction is executed by the processor 1001 to implement the method for controlling movement of a virtual object provided by the method embodiments of this application.
In some embodiments, the terminal 1000 may optionally further include a peripheral device interface 1003 and at least one peripheral device. The processor 1001, the memory 1002, and the peripheral device interface 1003 may be connected through a bus or a signal line. Each peripheral device may be connected to the peripheral device interface 1003 through a bus, a signal line, or a circuit board. Specifically, the peripheral device may include at least one of a communication interface 1004, a display screen 1005, an audio circuit 1006, a camera component 1007, a positioning component 1008, and a power supply 1009.
A person skilled in the art can understand that the structure shown in Fig. 10 does not constitute a limitation on the terminal 1000, which may include more or fewer components than shown in the figure, combine some components, or adopt a different component arrangement.
In an exemplary embodiment, a terminal is further provided. The terminal includes a processor and a memory, and the memory stores at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the above method for controlling movement of a virtual object.
In an exemplary embodiment, a computer-readable storage medium is further provided. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set implementing the above method for controlling movement of a virtual object when executed by a processor.
In an exemplary embodiment, a computer program product is further provided, which, when executed by a processor, is used to implement the above method for controlling movement of a virtual object.
It should be understood that "a plurality of" mentioned herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate the following three cases: A exists alone, both A and B exist, and B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
The foregoing descriptions are merely exemplary embodiments of this application and are not intended to limit this application. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of this application shall fall within the protection scope of this application.

Claims (12)

  1. A method for controlling movement of a virtual object, applied to a terminal, the method comprising:
    displaying a target-perspective picture of a target application, and superimposing and displaying a virtual joystick and a movable area of the virtual joystick on the target-perspective picture;
    activating the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired;
    adjusting, when the virtual joystick is in an activated state, a position of the virtual joystick in the movable area according to an amount of change in a position of a touch point, wherein, when the touch point moves within an effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area; and
    controlling the virtual object to move according to the position of the virtual joystick.
  2. The method according to claim 1, wherein the adjusting the position of the virtual joystick in the movable area according to the amount of change in the position of the touch point comprises:
    acquiring the amount of change in the position of the touch point from a first moment to a second moment;
    calculating a predicted position of the virtual joystick at the second moment according to the amount of position change and a position of the virtual joystick at the first moment; and
    if the predicted position is outside the movable area, calculating an actual position of the virtual joystick at the second moment according to the predicted position and the movable area.
  3. The method according to claim 2, wherein the movable area is circular; and
    the calculating the actual position of the virtual joystick at the second moment according to the predicted position and the movable area comprises:
    calculating coordinates of an intersection between a line connecting the predicted position and a center of the movable area, and an edge of the movable area; and
    determining the intersection coordinates as the actual position of the virtual joystick at the second moment.
  4. The method according to claim 2, wherein, after the calculating the predicted position of the virtual joystick at the second moment according to the amount of position change and the position of the virtual joystick at the first moment, the method further comprises:
    if the predicted position is within the movable area, determining the predicted position as the actual position of the virtual joystick at the second moment.
  5. The method according to any one of claims 1 to 4, wherein the controlling the virtual object to move according to the position of the virtual joystick comprises:
    calculating, according to the position of the virtual joystick, a direction and a distance of the virtual joystick relative to the center of the movable area;
    determining a moving direction and a moving speed of the virtual object according to the direction and the distance; and
    controlling the virtual object to move according to the moving direction and the moving speed.
  6. An apparatus for controlling movement of a virtual object, the apparatus comprising:
    a display module, configured to display a target-perspective picture of a target application, and superimpose and display a virtual joystick and a movable area of the virtual joystick on the target-perspective picture;
    an activation module, configured to activate the virtual joystick when a trigger operation corresponding to the virtual joystick is acquired;
    an adjustment module, configured to adjust, when the virtual joystick is in an activated state, a position of the virtual joystick in the movable area according to an amount of change in a position of a touch point, wherein, when the touch point moves within an effective touch range, the position of the virtual joystick changes synchronously with the position of the touch point in real time, and the effective touch range includes and is larger than the movable area; and
    a control module, configured to control the virtual object to move according to the position of the virtual joystick.
  7. The apparatus according to claim 6, wherein the adjustment module comprises:
    a change acquisition unit, configured to acquire the amount of change in the position of the touch point from a first moment to a second moment;
    a predicted position calculation unit, configured to calculate a predicted position of the virtual joystick at the second moment according to the amount of position change and a position of the virtual joystick at the first moment; and
    an actual position calculation unit, configured to calculate, when the predicted position is outside the movable area, an actual position of the virtual joystick at the second moment according to the predicted position and the movable area.
  8. The apparatus according to claim 7, wherein the movable area is circular; and
    the actual position calculation unit is configured to:
    calculate coordinates of an intersection between a line connecting the predicted position and a center of the movable area, and an edge of the movable area; and
    determine the intersection coordinates as the actual position of the virtual joystick at the second moment.
  9. The apparatus according to claim 7, wherein the apparatus further comprises:
    a determining module, configured to determine, when the predicted position is within the movable area, the predicted position as the actual position of the virtual joystick at the second moment.
  10. The apparatus according to any one of claims 6 to 9, wherein the control module is configured to:
    calculate, according to the position of the virtual joystick, a direction and a distance of the virtual joystick relative to the center of the movable area;
    determine a moving direction and a moving speed of the virtual object according to the direction and the distance; and
    control the virtual object to move according to the moving direction and the moving speed.
  11. A terminal, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method according to any one of claims 1 to 5.
  12. A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the method according to any one of claims 1 to 5.
PCT/CN2020/092342 2019-06-05 2020-05-26 Method, apparatus, terminal, and storage medium for controlling movement of a virtual object WO2020244421A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2021541498A 2019-06-05 2020-05-26 Method for controlling movement of a virtual object, apparatus thereof, terminal, and computer program
SG11202108601XA SG11202108601XA (en) 2019-06-05 2020-05-26 Method and apparatus for controlling movement of virtual object, and terminal and storage medium
KR1020217023113A 2019-06-05 2020-05-26 Method and apparatus for controlling movement of a virtual object, and terminal and storage medium
US17/359,497 US11513657B2 (en) 2019-06-05 2021-06-25 Method and apparatus for controlling movement of virtual object, terminal, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910487940.0A 2019-06-05 2019-06-05 Method, apparatus, terminal, and storage medium for controlling movement of a virtual object
CN201910487940.0 2019-06-05

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/359,497 Continuation US11513657B2 (en) 2019-06-05 2021-06-25 Method and apparatus for controlling movement of virtual object, terminal, and storage medium

Publications (1)

Publication Number Publication Date
WO2020244421A1 true WO2020244421A1 (zh) 2020-12-10

Family

ID=67450455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092342 WO2020244421A1 (zh) Method, apparatus, terminal, and storage medium for controlling movement of a virtual object 2019-06-05 2020-05-26

Country Status (6)

Country Link
US (1) US11513657B2 (zh)
JP (1) JP7238143B2 (zh)
KR (1) KR102539606B1 (zh)
CN (1) CN110096214B (zh)
SG (1) SG11202108601XA (zh)
WO (1) WO2020244421A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024012010A1 (zh) * 2022-07-12 2024-01-18 腾讯科技(深圳)有限公司 Virtual object control method, apparatus, device, storage medium, and program product

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110096214B (zh) * 2019-06-05 2021-08-06 腾讯科技(深圳)有限公司 Method, apparatus, terminal, and storage medium for controlling movement of a virtual object
CN111589112B (zh) * 2020-04-24 2021-10-22 腾讯科技(深圳)有限公司 Interface display method and apparatus, terminal, and storage medium
CN111632372A (zh) * 2020-06-03 2020-09-08 深圳市瑞立视多媒体科技有限公司 Virtual object control method and apparatus, device, and storage medium
CN114489457B (zh) * 2022-01-27 2024-01-19 北京字跳网络技术有限公司 Virtual object control method and apparatus, readable medium, and electronic device
CN115129224B (zh) * 2022-07-26 2023-08-04 网易(杭州)网络有限公司 Movement control method and apparatus, storage medium, and electronic device
CN115460543B (zh) * 2022-08-31 2024-04-19 中国地质大学(武汉) Distributed ring fence coverage method, device, and storage device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285636A1 (en) * 2010-05-20 2011-11-24 Howard John W Touch screen with virtual joystick and methods for use therewith
CN105302453A (zh) * 2014-06-26 2016-02-03 工合线上娱乐株式会社 Terminal device
CN107577345A (zh) * 2017-09-04 2018-01-12 苏州英诺迈医学创新服务有限公司 Method and apparatus for controlling roaming of a virtual character
CN108404408A (zh) * 2018-02-01 2018-08-17 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
CN110096214A (zh) * 2019-06-05 2019-08-06 腾讯科技(深圳)有限公司 Method, apparatus, terminal, and storage medium for controlling movement of a virtual object

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4032404B2 (ja) * 1998-07-10 2008-01-16 フジノン株式会社 Operating device
JP4932010B2 (ja) * 2010-01-06 2012-05-16 株式会社スクウェア・エニックス User interface processing apparatus, user interface processing method, and user interface processing program
JP5669697B2 (ja) * 2011-09-13 2015-02-12 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus, information processing method, and data structure of a content file
CN105335065A (zh) * 2015-10-10 2016-02-17 腾讯科技(深圳)有限公司 Information processing method, terminal, and computer storage medium
CN107042018B (zh) * 2016-02-05 2018-09-18 腾讯科技(深圳)有限公司 Method and apparatus for determining the spatial position of a controlled object
KR101984305B1 (ko) * 2017-04-24 2019-05-30 주식회사 넥슨코리아 Interface providing method and apparatus
CN107837531B (zh) * 2017-09-28 2018-11-23 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
CN107754309B (zh) * 2017-09-30 2019-03-08 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
WO2019235180A1 (ja) * 2018-06-06 2019-12-12 株式会社コナミデジタルエンタテインメント Recording medium and information processing apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285636A1 (en) * 2010-05-20 2011-11-24 Howard John W Touch screen with virtual joystick and methods for use therewith
CN105302453A (zh) * 2014-06-26 2016-02-03 工合线上娱乐株式会社 Terminal device
CN107577345A (zh) * 2017-09-04 2018-01-12 苏州英诺迈医学创新服务有限公司 Method and apparatus for controlling roaming of a virtual character
CN108404408A (zh) * 2018-02-01 2018-08-17 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
CN110096214A (zh) * 2019-06-05 2019-08-06 腾讯科技(深圳)有限公司 Method, apparatus, terminal, and storage medium for controlling movement of a virtual object

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024012010A1 (zh) * 2022-07-12 2024-01-18 腾讯科技(深圳)有限公司 Virtual object control method, apparatus, device, storage medium, and program product

Also Published As

Publication number Publication date
KR20210103553A (ko) 2021-08-23
JP7238143B2 (ja) 2023-03-13
CN110096214B (zh) 2021-08-06
KR102539606B1 (ko) 2023-06-01
US20210326027A1 (en) 2021-10-21
SG11202108601XA (en) 2021-09-29
US11513657B2 (en) 2022-11-29
JP2022518465A (ja) 2022-03-15
CN110096214A (zh) 2019-08-06

Similar Documents

Publication Publication Date Title
WO2020244421A1 (zh) Method, apparatus, terminal, and storage medium for controlling movement of a virtual object
JP7331124B2 (ja) Virtual object control method, apparatus, terminal, and storage medium
JP2023076494A (ja) Method, apparatus, electronic device, and storage medium for generating mark information in a virtual environment
EP3970819B1 (en) Interface display method and apparatus, and terminal and storage medium
KR20210140747A (ko) Virtual object control method and apparatus, device, and medium
WO2019154255A1 (zh) Viewing angle adjustment method and apparatus, electronic apparatus, and computer-readable storage medium
JP6438198B2 (ja) Program and game device
JP7451563B2 (ja) Virtual character control method, computer device, computer program, and virtual character control apparatus
WO2021203904A1 (zh) Method, apparatus, device, and storage medium for displaying a virtual environment picture
US20230059116A1 (en) Mark processing method and apparatus, computer device, storage medium, and program product
JP7137719B2 (ja) Virtual object selection method, apparatus, terminal, and program
CN110613933A (zh) Method, apparatus, storage medium, and processor for controlling skill release in a game
WO2023020125A1 (zh) Method, apparatus, terminal, medium, and program product for displaying a virtual environment picture
US20230082928A1 (en) Virtual aiming control
JP7384521B2 (ja) Virtual object control method, apparatus, computer device, and computer program
WO2023071808A1 (zh) Graphic display method and apparatus based on a virtual scene, device, and medium
JP2024536009A (ja) Screen display method, apparatus, terminal, and computer program
US20240316455A1 (en) Processing information for virtual environment
JP2024538553A (ja) Virtual object switching method, apparatus, computer device, and computer program
WO2024001504A1 (zh) Picture display method and apparatus, device, storage medium, and program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20818723

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021541498

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217023113

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20818723

Country of ref document: EP

Kind code of ref document: A1