WO2019153837A1 - Virtual object control method and apparatus, electronic device, and storage medium - Google Patents

Virtual object control method and apparatus, electronic device, and storage medium

Info

Publication number
WO2019153837A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
acceleration
virtual
touch
viewing angle
Prior art date
Legal status
Ceased
Application number
PCT/CN2018/117034
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
邓杨
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to JP2020541985A, granted as JP7166708B2 (ja)
Publication of WO2019153837A1
Priority to US16/893,773, granted as US11161041B2 (en)

Classifications

    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/573: Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/537: Controlling the output signals based on the game progress using indicators, e.g. showing the condition of a game character on screen
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/807: Special adaptations for gliding or sliding on surfaces, e.g. using skis, skates or boards
    • A63F13/837: Special adaptations for shooting of targets
    • A63F13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F2300/6045: Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F2300/65: Methods for processing data by generating or executing the game program for computing the condition of a game character
    • A63F2300/8005: Features specially adapted for executing a specific type of game: athletics
    • A63F2300/8076: Features specially adapted for executing a specific type of game: shooting
    • A63F2300/8082: Features specially adapted for executing a specific type of game: virtual reality

Definitions

  • the present invention relates to the field of computer technologies, and in particular, to a virtual object control method, apparatus, electronic device, and storage medium.
  • simulation technology refers to an experimental method in which a model of a research object is built according to similarity principles and the regularities of the prototype are studied indirectly through the model.
  • with simulation technology, a user can control a virtual object in a virtual scene to move, jump, shoot, skydive, and the like, so as to simulate human motion in reality.
  • the virtual object control method in the related art generally sets a virtual joystick area, determines the moving direction of the virtual object by detecting a touch operation on the virtual joystick area, controls the virtual object to move in that direction, and shows the change in position of the virtual object from a fixed viewing angle, where the moving direction is usually limited to up, down, left, and right.
  • the embodiments of the present invention provide a virtual object control method, apparatus, electronic device, and storage medium, which can solve the problem that the actions of a real person cannot be truly simulated.
  • the technical solution is as follows:
  • a virtual object control method, comprising: determining a viewing angle of the virtual scene when a touch operation on a control area is detected, where the control area is used to control a moving manner of the virtual object in the virtual scene; acquiring an acceleration of the virtual object during a falling process based on the touch operation on the control area, the viewing angle, and a gravitational acceleration; and controlling the virtual object to fall in the virtual scene according to the acceleration.
  • a virtual object control apparatus comprising:
  • a determining module configured to determine a viewing angle of the virtual scene when the touch operation on the control area is detected, where the control area is used to control a moving manner of the virtual object in the virtual scene;
  • an acquiring module configured to acquire an acceleration of the virtual object during a falling process based on the touch operation on the control area, the viewing angle, and a gravitational acceleration;
  • a control module configured to control the virtual object to fall in the virtual scene according to the acceleration.
  • an electronic device, comprising: a processor; and a memory for storing a computer program; where the processor is configured to execute the computer program stored in the memory to implement the method steps of the foregoing virtual object control method.
  • a computer readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the method steps of the foregoing virtual object control method.
  • by detecting the touch operation on the control area and combining the touch operation, the viewing angle, and the gravitational acceleration, the acceleration of the virtual object is determined and movement control of the virtual object is implemented; because the direction of the acceleration is not fixed and its magnitude is not zero, the virtual object can move in any direction and can be accelerated or decelerated, so that the actions of a real person can be truly simulated.
  • FIG. 1 is a schematic diagram of a terminal interface according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of orientation of a virtual object according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of orientation of a virtual object according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a virtual object control method according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a body posture of a virtual object according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a body posture of a virtual object according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a body posture of a virtual object according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a body posture of a virtual object according to an embodiment of the present invention.
  • FIG. 9 is a flowchart of a virtual object control method according to an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of a virtual object control apparatus according to an embodiment of the present invention.
  • FIG. 11 is a schematic structural diagram of an electronic device 1100 according to an embodiment of the present invention.
  • the terminal may simulate a virtual scene based on the acquired scene data, and the virtual scene may be a virtual reality scene, an electronic game scene, or a simulation scene in fields such as national defense science and physics teaching.
  • the user can perform a touch operation on the terminal, and the terminal can determine the game data corresponding to the touch operation according to the detected touch operation, and render and display the game data.
  • the data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
  • the virtual scene in the embodiment of the present invention may be used to simulate a virtual space, and the virtual space may be an open space.
  • the virtual scene may be a three-dimensional virtual scene or a two-dimensional virtual scene, which is not limited in the embodiment of the present invention.
  • the user can control the virtual object to move in the virtual scene, where the virtual object can be a virtual avatar representing the user in the virtual scene; the virtual object has its own shape and volume in the virtual scene and occupies part of the space of the virtual scene.
  • the avatar may be in any form, for example, a person, an animal, or the like, which is not limited by the embodiment of the present invention.
  • the three-dimensional virtual space may include a sky area and a non-sky area, and the non-sky area may be a land area or a sea area.
  • when the virtual object is located in the sky area, the movement of the virtual object under gravity is a falling process.
  • the user can perform touch operations during the falling of the virtual object to change the falling speed and the offset direction of the virtual object, so that the user can choose among different landing points.
  • the method can also be used when diving in the sea area; in this case, the acceleration of the virtual object when diving can be adjusted based on the touch operation, thereby changing the diving speed and moving direction.
  • the method for determining that the virtual object is located in the sky area may be a ray detection method: the terminal may emit a ray vertically downward from a target part of the virtual object and detect the distance between the virtual object and the ground or an object below it; when the distance is not zero, the terminal can determine that the virtual object is located in the sky area.
  • the target part may be the foot or the head of the virtual object, which is not specifically limited in the embodiment of the present invention.
  • the terminal may invoke a ray detection function to perform the ray detection step, and determine whether the virtual object is in contact with the ground or an object on the ground according to whether the ray collides with the ground or an object on the ground.
  • the terminal may also use a ray to detect, in the world coordinate system, the coordinates of the object directly below the virtual object in the vertical direction, and determine, based on the coordinates of that object and the coordinates of the virtual object, whether the virtual object is in contact with the object, thereby determining whether the virtual object is in the sky area.
  • the terminal may also determine whether the virtual object is located in the sky area by determining whether the magnitude of the ray vector from the virtual object to the ground or to an object on the ground is zero; when the magnitude of the ray vector is not zero, it can be determined that the virtual object is located in the sky area.
  • the terminal can also project the ground or an object on the ground and the virtual object onto a plane, and calculate the distance between the virtual object and the object or the ground on that plane; the plane may be a plane perpendicular to the ground. Which ray detection method is used to determine that the virtual object is located in the sky area is not specifically limited in the embodiment of the present invention. (An illustrative sketch of such a distance check follows.)
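  • As an illustrative aid only, and not the claimed implementation, the vertical distance check might look like the following minimal sketch; the heightmap-style ground query, the function names, and the epsilon threshold are assumptions made for this sketch:

```python
# Minimal sketch of the vertical distance check; the heightmap-style
# ground query and the epsilon threshold are illustrative assumptions.

def ground_height(x: float, z: float) -> float:
    """Hypothetical scene query: height of the ground (or of the object
    on the ground) directly below the horizontal position (x, z)."""
    return 0.0  # flat ground, for the sketch only

def is_in_sky_area(position) -> bool:
    """Conceptually cast a ray straight down from the target part (e.g.
    the feet) and report whether the measured distance is non-zero."""
    x, y, z = position
    distance = y - ground_height(x, z)
    return distance > 1e-4  # non-zero distance: the object is in the sky

print(is_in_sky_area((0.0, 120.0, 0.0)))  # True: still falling
print(is_in_sky_area((0.0, 0.0, 0.0)))    # False: landed
```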
  • when the virtual object is located in the non-sky area, the virtual object may be subjected, in addition to gravity, to the support force of the ground in the vertical direction, or to the vertically upward buoyancy in the sea, so the virtual object no longer falls; the user can control the virtual object to move, run, jump, and advance on land, or float and swim in the sea.
  • when the areas in which the virtual object is located are different, the way the terminal controls the virtual object based on the touch operation to simulate actions may also be different. For details about how the terminal controls the movement of the virtual object based on the touch operation, refer to the embodiments described below.
  • the user can control the moving speed of the virtual object through the touch operation, and can also control the moving direction of the virtual object through the touch operation.
  • because the virtual object is a virtual avatar of the user, taking the first-person perspective as an example, the virtual scene seen by the user is usually the virtual scene observed from the perspective of the virtual object; in reality, when a person moves, the viewing direction is usually directly in front of the person.
  • therefore, a control manner can be provided: the moving direction of the virtual object is changed by adjusting the viewing angle of the virtual scene.
  • the terminal may further provide a control area, where the control area is used to control the moving manner of the virtual object in the virtual scene, and the terminal may determine the moving direction and moving speed of the virtual object according to the viewing angle of the virtual scene and the touch operation on the control area.
  • the control area may be a virtual joystick area or a virtual control area.
  • the control area may also be the area where the joystick control of a physical joystick device is located, which is not specifically limited in the embodiment of the present invention.
  • the touch operation of the control area by the user may be a sliding operation or a click operation, which is not limited by the embodiment of the present invention.
  • in a possible implementation, the virtual control area is an area that can be captured by a camera; the touch operation on the virtual control area may be a gesture operation, and the terminal may control the virtual object to move according to the gesture operation captured by the camera.
  • the touch operation of the control area by the user may be a pressing operation, a toggle operation, or the like.
  • the shape of the virtual joystick area may be circular or semicircular; the shape of the virtual joystick area is not specifically limited in the embodiment of the present invention.
  • the terminal can determine the direction and speed at which the user wants to control the movement of the virtual object according to the relative position between the end point of the touch operation of the user and the origin.
  • FIG. 1 is a schematic diagram of a terminal interface according to an embodiment of the present invention.
  • for example, the shape of the virtual joystick area is circular, and the touch operation on the virtual joystick area is a sliding operation.
  • there may be four directional arrow icons in the virtual joystick area, pointing up, down, left, and right, which are used to indicate forward, backward, leftward, and rightward movement, respectively.
  • the directional arrow icons are only used to make the directions of the virtual joystick area more intuitive, thereby guiding the user in determining the direction corresponding to an operation.
  • in practical applications, the four directional arrow icons may not be displayed in the virtual joystick area, and the user may gradually become familiar with the touch operations of the virtual joystick area through actual use.
  • the touch operations corresponding to the four directions may be referred to as a first touch operation, a second touch operation, a third touch operation, and a fourth touch operation, respectively.
  • the terminal may determine the magnitude of the acceleration corresponding to the touch operation according to the distance between the end point of the touch operation and the origin of the virtual joystick area, where the acceleration refers to the acceleration applied to the virtual object by the touch operation and is used to control the moving direction and moving speed of the virtual object. For example, the magnitude of the acceleration corresponding to the touch operation may be positively correlated with the distance between the end point of the touch operation and the origin; that is, the greater the distance between the end point of the touch operation and the origin of the virtual joystick area, the greater the acceleration corresponding to the touch operation.
  • the terminal may also determine the magnitude of the acceleration corresponding to the touch operation according to the pressure value of the touch operation or the like, and the magnitude of the acceleration may be positively correlated with the pressure value of the touch operation; this is not specifically limited in the embodiment of the present invention.
  • in a possible implementation, the terminal may determine that the touch operation includes two sub-touch operations. For example, when the end point of the touch operation detected by the terminal is located in the virtual joystick area between the direction of the first touch operation and the direction of the third touch operation, the terminal may consider that the touch operation includes a first touch operation and a third touch operation, and may determine, according to the relative position between the end point of the touch operation and the origin of the virtual joystick area, the magnitudes of the accelerations corresponding to the first touch operation and the third touch operation respectively. In another possible implementation, the terminal may directly determine the direction and magnitude of the acceleration corresponding to the touch operation according to the relative position between the end point of the touch operation and the origin of the virtual joystick area.
  • the above is an exemplary description of the operation of the virtual joystick area; the specific shape and operation settings of the virtual joystick area may be adjusted according to actual operation requirements, which is not specifically limited in the embodiment of the present invention. (A minimal sketch of one possible distance-to-magnitude mapping follows.)
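  • The following sketch illustrates one way a joystick touch could be mapped to an acceleration command; the embodiments only require the magnitude to be positively correlated with the distance, so the linear mapping and both constants here are assumptions:

```python
import math

# Sketch of mapping a joystick touch to an acceleration command; the linear
# distance-to-magnitude mapping and both constants are assumptions.
MAX_RADIUS = 100.0    # joystick radius in screen pixels (assumed)
MAX_MAGNITUDE = 10.0  # acceleration at full deflection (assumed units)

def joystick_to_acceleration(origin, end_point):
    """Return the steering direction and a magnitude that grows with the
    distance between the touch end point and the joystick origin."""
    dx = end_point[0] - origin[0]
    dy = end_point[1] - origin[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0), 0.0
    direction = (dx / dist, dy / dist)
    magnitude = MAX_MAGNITUDE * min(dist, MAX_RADIUS) / MAX_RADIUS
    return direction, magnitude

print(joystick_to_acceleration((0, 0), (30, 40)))    # half deflection
print(joystick_to_acceleration((0, 0), (120, 160)))  # clamped at full
```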
  • the viewing angle is the angle at which the user observes the virtual scene; according to different viewing angles, the terminal may display different regions of the same virtual scene.
  • the terminal can provide a default viewing angle and a default orientation of the virtual object.
  • the orientation of the virtual object may include a first orientation and a second orientation, where the first orientation refers to the yaw angle among the attitude angles of the virtual object, and the second orientation refers to the pitch angle among the attitude angles of the virtual object.
  • the yaw angle and the pitch angle are the yaw coordinate and the pitch coordinate in a preset coordinate system whose origin is the center of the virtual object; the pitch coordinate ranges from -90° to 0°, and the yaw coordinate ranges from -180° to 180°.
  • when the initial location of the virtual object is in the sky area of the virtual scene, the terminal can provide a first default viewing angle, which may point 45° downward from the front of the virtual object.
  • when the initial location of the virtual object is in the non-sky area of the virtual scene, the terminal may provide a second default viewing angle, which is directly in front of the virtual object when it is in the standing posture.
  • the default viewing angles are only exemplary descriptions, and the setting of the default viewing angle is not specifically limited in the embodiment of the present invention.
  • for example, the default first orientation of the virtual object may be -90°, the default second orientation of the virtual object may be -90°, and the yaw angle of the first default viewing angle may be -90° with a pitch angle of -45°. (A sketch converting such angle pairs into direction vectors follows.)
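  • To make the angle conventions concrete, the following sketch converts a (yaw, pitch) pair into a 3D direction vector, using the conventions stated in the example later in this description (yaw 0° points to the left side of the virtual object and pitch 0° points vertically downward); the world-axis assignment (x = right, y = up, z = forward) is an assumption made for the sketch:

```python
import math

def direction_from_angles(yaw_deg: float, pitch_deg: float):
    """Unit direction under the stated conventions: yaw 0 = the object's
    left, yaw -90 = forward; pitch 0 = straight down, pitch -90 =
    horizontal. Axes x = right, y = up, z = forward are assumed."""
    yaw = math.radians(yaw_deg)
    tilt = math.radians(-pitch_deg)  # 0 = straight down, 90 = horizontal
    heading = (-math.cos(yaw), 0.0, -math.sin(yaw))  # horizontal component
    return (math.sin(tilt) * heading[0],
            -math.cos(tilt),
            math.sin(tilt) * heading[2])

# Default first viewing angle (yaw -90, pitch -45): forward and 45 deg down.
print(direction_from_angles(-90.0, -45.0))  # -> (0.0, -0.707..., 0.707...)
```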
  • FIG. 4 is a flowchart of a virtual object control method according to an embodiment of the present invention.
  • the virtual object control method includes the following steps:
  • Step 401: The terminal determines the viewing angle of the virtual scene.
  • the terminal can provide the above-mentioned virtual joystick area and the viewing angle of the virtual scene, and the user can change the displayed virtual scene by adjusting the viewing angle.
  • because the viewing angle can be changed by the user, the terminal needs to determine the viewing angle of the virtual scene in order to determine how the user wants to control the movement of the virtual object.
  • the user may perform a viewing angle adjustment operation on the terminal to adjust the viewing angle.
  • the terminal may adjust the viewing angle according to the viewing angle adjustment operation.
  • the viewing angle adjustment operation may be a sliding operation or a click operation.
  • for example, the viewing angle adjustment operation is a sliding operation: the user performs a sliding operation on the terminal screen, and the terminal adjusts the viewing angle according to the sliding direction of the sliding operation, such that the moving direction of the viewing angle is the same as the sliding direction.
  • the angle by which the viewing angle moves may be proportional to the sliding distance of the sliding operation; that is, the larger the sliding distance, the larger the angle by which the viewing angle moves.
  • in a possible implementation, the terminal may also provide another virtual joystick area, which is used by the terminal to detect the user's touch operation and implement the viewing angle adjustment step. When the terminal detects a touch operation in this virtual joystick area, the terminal determines the direction and angle of the corresponding viewing angle adjustment according to the relative position between the end point of the touch operation and the origin of the virtual joystick area: the viewing angle is adjusted toward the direction of the end point of the touch operation relative to the origin, and the adjustment angle is positively correlated with the distance between the end point of the touch operation and the origin; that is, the larger the distance, the larger the viewing angle adjustment angle. (A minimal sketch of a slide-driven viewing angle update follows.)
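  • The sketch below shows one possible slide-driven viewing angle update; only the proportionality between slide distance and rotation angle is taken from the text, so the sensitivity constant is an assumption:

```python
# Sketch of a slide-driven viewing angle update; only the proportionality
# is from the text, so the sensitivity constant is an assumption.
SENSITIVITY = 0.3  # degrees of rotation per pixel of sliding (assumed)

def adjust_view(yaw_deg, pitch_deg, slide_dx, slide_dy):
    """Rotate the viewing angle in the slide direction by an angle
    proportional to the slide distance, keeping yaw in [-180, 180] and
    pitch in [-90, 0] per the stated coordinate ranges."""
    yaw = yaw_deg + SENSITIVITY * slide_dx
    yaw = (yaw + 180.0) % 360.0 - 180.0      # wrap yaw into [-180, 180]
    pitch = pitch_deg + SENSITIVITY * slide_dy
    pitch = max(-90.0, min(0.0, pitch))      # clamp pitch into [-90, 0]
    return yaw, pitch

print(adjust_view(-90.0, -45.0, 150.0, 0.0))  # slide right: yaw -90 -> -45
```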
  • in addition, the terminal may adjust the first orientation of the virtual object according to the adjusted viewing angle, such that the yaw angle of the virtual object remains the same as the yaw angle of the viewing angle; this simulates the scene in which a person's body turns as the viewing direction changes. For example, when the terminal detects the viewing angle adjustment operation, the terminal adjusts the yaw angle of the viewing angle from -90° to -45° and adjusts the first orientation of the virtual object to -45°, so that the virtual object remains facing the currently displayed virtual scene.
  • during the falling process, the user can select a landing point based on the surrounding environment, or observe whether there are other virtual objects nearby; the terminal can also provide an observation viewing angle function button. When the terminal detects a touch operation on the observation viewing angle function button, the terminal may adjust the viewing angle according to the operation direction of the touch operation on the button, without adjusting the orientation of the virtual object according to that touch operation.
  • the touch operation may be performed in a target area centered on the observation viewing angle function button.
  • the shape of the target area may be a circle or another shape; the shape and size of the target area are not specifically limited in the embodiment of the present invention.
  • the style of the observation viewing angle function button may be an eye shape, or another shape, which is not limited in the embodiment of the present invention. It should be noted that when the terminal detects that the touch operation on the observation viewing angle function button ends, the viewing angle may be adjusted back to the viewing angle before the touch operation on the button was performed.
  • for example, the specific operation of the observation viewing angle function button by the user may be: first pressing and holding the button, and then sliding or dragging near it.
  • when such an operation is detected, the terminal may adjust the viewing angle based on the user operation and provide the virtual scene according to the adjusted viewing angle, so that the user can observe the surrounding environment of the virtual object, conveniently select a landing point, and decide the next operation based on the surroundings.
  • it should be noted that the viewing angle of the virtual scene determined by the terminal in step 401 is not the viewing angle adjusted through the touch operation on the observation viewing angle function button; that is, if the terminal detects a touch operation on the virtual joystick area while the viewing angle is being adjusted through the observation viewing angle function button, the viewing angle of the virtual scene determined by the terminal is the viewing angle before that adjustment.
  • the terminal when detecting the touch operation, may detect a location of the virtual object in the virtual scene, and when detecting that the virtual object is located in a sky region of the virtual scene, the virtual object is in the virtual object In the gliding state, the user can control the falling point and the falling speed of the virtual object through the virtual joystick area and the perspective of the virtual scene, and the terminal can perform the step of determining the angle of view of the virtual scene in the step; and when the virtual When the object is located in the non-sky area of the virtual scene, the virtual object is in a state of standing on the land or swimming in the ocean, and the user can directly control the virtual object to move to the periphery by performing a touch operation on the virtual joystick area, according to The moving direction indicated by the touch operation is different, and the virtual object can be moved in any of the 360° directions.
  • the terminal may perform step 401 without performing the step 401, and control the virtual based on the touch operation of the user.
  • the object is moved.
  • certainly, the user can also change the direction in which the virtual object faces and moves forward by adjusting the viewing angle of the virtual scene, so as to simulate scenes such as a real person changing the forward direction while moving.
  • Step 402: The terminal receives a touch instruction triggered by a touch operation on the virtual joystick area.
  • the touch operations detected by the terminal may be different; accordingly, the touch instructions received by the terminal may also be different.
  • corresponding to the first touch operation, the second touch operation, the third touch operation, and the fourth touch operation, the terminal can receive a first touch instruction, a second touch instruction, a third touch instruction, and a fourth touch instruction, respectively. When the end point of the touch operation mentioned in the foregoing embodiment is located between the directions indicated by two adjacent directional arrow icons, the terminal may acquire the two touch instructions corresponding to the directions indicated by the two adjacent directional arrow icons; alternatively, the terminal may acquire a single touch instruction triggered by the touch operation, where the virtual object movement effect indicated by that touch instruction is the superimposed effect of the two touch instructions. Which of the two manners is specifically adopted is not limited in the embodiment of the present invention.
  • Step 403: The terminal obtains a first acceleration according to the touch instruction and the viewing angle.
  • the terminal may determine the direction of the first acceleration according to the direction indicated by the touch instruction and the viewing angle direction, and determine, according to the touch instruction, whether the magnitude of the first acceleration is a first preset threshold or a second preset threshold.
  • the touch instruction may include the four types of touch instructions described above, and the first acceleration obtained by the terminal may accordingly include the following four cases:
  • when the touch instruction is the first touch instruction, the terminal may determine, according to the "forward" direction indicated by the first touch instruction and the viewing angle direction, that the direction of the first acceleration is the viewing angle direction, and determine, according to the first touch instruction, that the magnitude of the first acceleration is the first preset threshold.
  • when the touch instruction is the second touch instruction, the terminal may determine, according to the "backward" direction indicated by the second touch instruction and the viewing angle direction, that the direction of the first acceleration is opposite to the viewing angle direction, and determine, according to the second touch instruction, that the magnitude of the first acceleration is the first preset threshold.
  • that is, the magnitude of the first acceleration corresponding to the first touch instruction and the second touch instruction is the first preset threshold.
  • the first preset threshold may be set by a technician. The user may trigger the first touch instruction through the first touch operation, and the terminal controls the virtual object to accelerate forward according to the first touch instruction; the user may trigger the second touch instruction through the second touch operation, and the terminal controls the virtual object to decelerate while moving forward according to the second touch instruction, thereby achieving the purpose of controlling the moving speed of the virtual object. It should be noted that the second touch operation is only used to indicate that the virtual object decelerates; the virtual object does not move backward in the sky area.
  • in a specific example, the virtual object controlled by the user may be located in an aircraft together with other virtual objects, and the aircraft moves in a preset direction at a preset speed; the user may perform a related operation on the terminal to control the virtual object to leave the aircraft and fall.
  • in this example, the initial position of the virtual object may be in the sky area; the default first orientation of the virtual object may be -90°, where the direction of the left side of the virtual object is 0°; the default second orientation of the virtual object may be -90°, where the vertically downward direction is 0°; and the yaw angle of the first default viewing angle may be -90° with a pitch angle of -45°.
  • when the user performs the first touch operation, the terminal may receive the first touch instruction triggered by the first touch operation and determine the direction of the first acceleration corresponding to the first touch instruction as follows: the yaw angle is the same as the first orientation of the virtual object, namely -90°, and the pitch angle is the same as the pitch angle of the viewing angle, namely -45°.
  • when the touch instruction is the third touch instruction, the terminal may determine, according to the "leftward" direction indicated by the third touch instruction and the viewing angle direction, that the direction of the first acceleration is the left side of the viewing angle direction, and determine, according to the third touch instruction, that the magnitude of the first acceleration is the second preset threshold.
  • when the touch instruction is the fourth touch instruction, the terminal may determine, according to the "rightward" direction indicated by the fourth touch instruction and the viewing angle direction, that the direction of the first acceleration is the right side of the viewing angle direction, and determine, according to the fourth touch instruction, that the magnitude of the first acceleration is the second preset threshold.
  • that is, the magnitude of the first acceleration corresponding to the third touch instruction and the fourth touch instruction is the second preset threshold.
  • the second preset threshold may be set by a technician, and its value is generally small. The user may trigger the third touch instruction or the fourth touch instruction through the third touch operation or the fourth touch operation, and the terminal controls the virtual object to move slightly to the left or to the right according to the corresponding instruction, so as to finely correct the moving direction of the virtual object in the left-right direction and thereby adjust the moving direction more accurately.
  • in a possible implementation, when the touch operation triggers two of the four touch instructions, the terminal may perform vector summation on the two sub-first accelerations corresponding to the two touch instructions to obtain the first acceleration corresponding to the user's touch operation; that is, the first acceleration may be the vector sum of the sub-first accelerations corresponding to the two touch instructions triggered by the touch operation. In another possible implementation, the terminal may directly receive a single touch instruction triggered by the touch operation, where the indicated virtual object movement effect is the superimposed effect of the two touch instructions, and determine the direction and magnitude of the first acceleration according to that touch instruction. Which of the two implementations is specifically adopted is not limited in the embodiment of the present invention. (A sketch of this computation follows.)
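  • The following sketch assembles the first acceleration from the triggered instruction(s) and the viewing angle, including the vector summation of two sub-first accelerations; the threshold values and the axis conventions (x = right, y = up, z = forward) are illustrative assumptions:

```python
import math

# Sketch of step 403: building the first acceleration from the touch
# instruction(s) and the viewing angle; thresholds and axes are assumed.
FIRST_THRESHOLD = 8.0   # forward/backward magnitude (assumed value)
SECOND_THRESHOLD = 2.0  # small left/right correction magnitude (assumed)

def view_direction(yaw_deg, pitch_deg):
    """Viewing direction under the stated conventions (yaw 0 = left of
    the object, pitch 0 = straight down); axes are an assumption."""
    yaw, tilt = math.radians(yaw_deg), math.radians(-pitch_deg)
    heading = (-math.cos(yaw), 0.0, -math.sin(yaw))
    return (math.sin(tilt) * heading[0], -math.cos(tilt),
            math.sin(tilt) * heading[2])

def first_acceleration(instructions, yaw_deg, pitch_deg):
    """Vector-sum the sub-first accelerations of the triggered
    instructions ('forward', 'backward', 'left', 'right')."""
    view = view_direction(yaw_deg, pitch_deg)
    hx = -math.cos(math.radians(yaw_deg))
    hz = -math.sin(math.radians(yaw_deg))
    left = (-hz, 0.0, hx)  # horizontal left of the viewing direction
    total = [0.0, 0.0, 0.0]
    for inst in instructions:
        if inst == "forward":
            sub = [FIRST_THRESHOLD * c for c in view]
        elif inst == "backward":
            sub = [-FIRST_THRESHOLD * c for c in view]
        elif inst == "left":
            sub = [SECOND_THRESHOLD * c for c in left]
        else:  # "right"
            sub = [-SECOND_THRESHOLD * c for c in left]
        total = [t + s for t, s in zip(total, sub)]
    return tuple(total)

print(first_acceleration({"forward"}, -90.0, -45.0))
print(first_acceleration({"forward", "left"}, -90.0, -45.0))
```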
  • Step 404: The terminal performs vector summation on the first acceleration and the gravitational acceleration to obtain the acceleration of the virtual object during the falling process.
  • the terminal can obtain the acceleration of the virtual object by performing vector summation on the first acceleration and the gravitational acceleration.
  • specifically, the terminal may determine the direction of the vector obtained by summing the first acceleration and the gravitational acceleration as the direction of the acceleration of the virtual object, and determine the magnitude of that vector as the magnitude of the acceleration of the virtual object.
  • the foregoing steps 402 to 404 are the process of acquiring the acceleration of the virtual object during the falling process based on the touch operation on the virtual joystick area, the viewing angle, and the gravitational acceleration. If the user performs no touch operation on the control area, that is, the user wants the virtual object to fall freely and does not interfere with the movement of the virtual object, and the virtual object is located in the sky area of the virtual scene, the virtual object is subjected to gravity alone, and the terminal can use the gravitational acceleration as the acceleration of the virtual object. (A short sketch of this summation follows.)
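  • A short sketch of step 404 under the same assumptions as above; the gravity magnitude is an assumed value, since the description does not fix units:

```python
# Sketch of step 404: vector summation with gravity; the gravity
# magnitude is an assumed value.
GRAVITY = (0.0, -9.8, 0.0)

def falling_acceleration(first_acc=None):
    """With no touch operation the virtual object free-falls, so gravity
    alone is used; otherwise the first acceleration and gravity are
    vector-summed."""
    if first_acc is None:
        return GRAVITY
    return tuple(f + g for f, g in zip(first_acc, GRAVITY))

print(falling_acceleration())                  # free fall: gravity only
print(falling_acceleration((0.0, -5.7, 5.7)))  # forward-down first accel.
```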
  • Step 405: The terminal controls the virtual object to fall in the virtual scene according to the acceleration.
  • during the falling process, the user may want to control the falling trajectory of the virtual object and select a more suitable landing point, so as to gain an advantage in subsequent competition.
  • for example, competitive content between virtual objects may be set in an electronic game.
  • the user may want to control the virtual object to fall to the location of a building; the user may also want to increase the falling speed of the virtual object so that it lands quickly to seize resources, or to slow the fall to gain more time to choose a suitable landing point.
  • the user may perform touch operations multiple times during the falling process, so that each time the terminal detects a touch operation, it can determine the direction and magnitude of the acceleration of the virtual object and thus control the movement of the virtual object based on the touch operation.
  • for example, the virtual object is originally performing a free fall motion; when the user performs the first touch operation, the terminal may control the virtual object to accelerate toward the front lower direction.
  • the terminal may also adjust the second orientation of the virtual object during the falling process.
  • optionally, the terminal may adjust the second orientation according to the direction of the first acceleration. Specifically, when the direction of the first acceleration is the viewing angle direction, the terminal adjusts the second orientation of the virtual object during the falling process so that the pitch angle of the virtual object is the same as the pitch angle of the viewing angle, which can more realistically simulate the scene in which a real person changes body orientation when swooping down; when the direction of the first acceleration is opposite to the viewing angle direction, the terminal adjusts the second orientation of the virtual object during the falling process so that the pitch angle of the virtual object is 0°, which can more realistically simulate the scene in which a real person adjusts body orientation during a fall. It should be noted that the pitch angle of the virtual object may range from -90° to 0°, and the virtual object does not face above the horizontal direction; that is, the virtual object does not move upward in the sky area.
  • for example, the terminal originally controls the virtual object to perform a free fall motion; based on a touch operation, the terminal controls the virtual object to face the front lower direction and accelerate forward and downward. If the user then stops performing the touch operation on the virtual object, the acceleration of the virtual object returns to the gravitational acceleration; the terminal controls the virtual object to continue moving forward and downward along a parabolic trajectory, the horizontal speed of the virtual object no longer changes, and the second orientation of the virtual object is adjusted back to 0°.
  • the range of the pitch angle of the virtual object may also be other ranges, which is not limited by the embodiment of the present invention.
  • in a possible implementation, the terminal may further adjust the body posture of the virtual object according to the touch operation, so that during the falling process the virtual object can simulate body postures that might appear in the air in a real scene, making the simulation result more realistic.
  • when the touch operations are different, the body postures corresponding to the touch operations are also different.
  • when the terminal detects the first touch operation, the terminal controls the virtual object to be in a dive posture; the direction of the first acceleration corresponding to the first touch instruction triggered by the first touch operation is the direction in which the head of the virtual object points forward, that is, the direction of the first acceleration is the same as the orientation of the virtual object. The acceleration of the virtual object further includes the gravitational acceleration, whose direction is vertically downward.
  • when the terminal detects the second touch operation, the terminal controls the virtual object to be in a backward-leaning posture; the direction of the first acceleration corresponding to the second touch instruction triggered by the second touch operation is opposite to the viewing angle direction. The acceleration of the virtual object further includes the gravitational acceleration, whose direction is vertically downward.
  • when the terminal detects the third touch operation, the terminal controls the virtual object to tilt to the left; the direction of the first acceleration corresponding to the third touch instruction triggered by the third touch operation is leftward. The acceleration of the virtual object further includes the gravitational acceleration, whose direction is vertically downward.
  • when the terminal detects the fourth touch operation, the terminal controls the virtual object to tilt to the right; the direction of the first acceleration corresponding to the fourth touch instruction triggered by the fourth touch operation is rightward. The acceleration of the virtual object further includes the gravitational acceleration, whose direction is vertically downward.
  • the rightward-tilting posture and the leftward-tilting posture of the virtual object are mirror-symmetrical with respect to the line connecting the head and the feet of the virtual object.
  • for example, when the terminal detects the first touch operation, the virtual object is controlled to move forward while falling; when the terminal then detects the fourth touch operation, the virtual object can keep moving forward while slowly shifting to the right.
  • in a possible implementation, when controlling the virtual object to perform a free fall motion, the terminal may adjust the body posture of the virtual object to a horizontal falling posture.
  • for example, if the user wants the virtual object to travel as far as possible in the horizontal direction, the user can adjust the viewing angle to the horizontal direction, that is, a pitch angle of -90°, and then perform the first touch operation. When the terminal detects the first touch operation, the terminal determines that the direction of the first acceleration corresponding to the first touch instruction triggered by the first touch operation is horizontally forward, while the gravitational acceleration is vertically downward; the terminal can therefore control the virtual object to move forward and downward along a trajectory that may be a parabola. For the same magnitude of the first acceleration, a horizontally forward first acceleration moves the virtual object farthest in the horizontal direction, achieving a "shallow dive" effect.
  • if the user wants the virtual object to land as quickly as possible, the user can adjust the viewing angle to the vertical direction, that is, a pitch angle of 0°, and then perform the first touch operation. When the terminal detects the first touch operation, the terminal determines that the direction of the first acceleration corresponding to the first touch instruction triggered by the first touch operation is the same as the direction of the gravitational acceleration, vertically downward; the terminal can therefore control the virtual object to move vertically downward and control the pitch angle of the virtual object to be 0°, achieving a vertical dive and the fastest landing. (A worked numeric comparison of the two dives follows.)
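  • The following worked comparison quantifies the two dives under simplifying assumptions (the object starts from rest, the accelerations stay constant, and the magnitude A, the fall height h, and g = 9.8 are illustrative values):

```python
import math

# Worked comparison of the "shallow dive" and the vertical dive, assuming
# a start from rest and constant accelerations; A, h, g are illustrative.
g, A, h = 9.8, 8.0, 500.0  # gravity, first-acceleration magnitude, height

# Shallow dive: first acceleration horizontal (viewing angle pitch -90).
t_shallow = math.sqrt(2 * h / g)       # gravity alone sets the fall time
x_shallow = 0.5 * A * t_shallow ** 2   # horizontal distance covered

# Vertical dive: first acceleration straight down (viewing angle pitch 0).
t_vertical = math.sqrt(2 * h / (g + A))  # both accelerations point down

print(f"shallow dive: lands after {t_shallow:.1f} s, {x_shallow:.0f} m forward")
print(f"vertical dive: lands after {t_vertical:.1f} s, 0 m forward")
```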
  • the foregoing steps 401 to 405 are a process of dynamically controlling the falling of the virtual object.
  • the terminal may perform steps 401 to 405 in each frame: after obtaining the acceleration of the virtual object, the terminal calculates the position of the virtual object in the next frame, and then repeats the above acquisition and calculation process based on the user operation in the next frame, until the position of the virtual object switches from the sky area to the non-sky area, after which the following steps 406 and 407 can be performed.
  • the time interval between two adjacent frames may be determined according to the performance parameters of the user's device.
  • the acceleration of the virtual object is obtained in real time in each frame, the position of the virtual object in the next frame is calculated according to the real-time acceleration, and the terminal renders and displays the next frame based on the calculated position; in this way, the falling process of the virtual object presented by the terminal is more realistic and accurate. (A per-frame update sketch follows.)
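  • A sketch of the per-frame update under the same assumptions as above, using simple Euler integration; the frame interval, the start height, and the use of gravity alone in the loop are illustrative:

```python
# Sketch of the per-frame loop (steps 401-405 repeated each frame) with
# Euler integration; the frame interval and values are illustrative.
DT = 1.0 / 30.0               # assumed frame interval
GRAVITY = (0.0, -9.8, 0.0)

def step(position, velocity, acceleration, dt=DT):
    """Advance one frame: integrate acceleration into velocity, then
    velocity into position."""
    velocity = tuple(v + a * dt for v, a in zip(velocity, acceleration))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity

pos, vel = (0.0, 500.0, 0.0), (0.0, 0.0, 0.0)
while pos[1] > 0.0:           # loop until the sky area is left (landing)
    # A real implementation would re-read the touch operation and the
    # viewing angle here to recompute the acceleration (steps 401-404).
    pos, vel = step(pos, vel, GRAVITY)
print(f"landed with vertical speed {abs(vel[1]):.1f} (assumed units)")
```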
  • Step 406: When detecting that the virtual object is located in the non-sky area of the virtual scene, the terminal acquires the acceleration corresponding to the touch operation based on the touch operation on the virtual joystick area.
  • when the virtual object is located in the non-sky area, the terminal may provide the second default viewing angle, where the second default viewing angle is directly in front of the virtual object when it is in the standing posture.
  • the terminal may control the initial posture of the virtual object to be a standing posture. Since the virtual object is not in the sky area, the forces on the virtual object are balanced in the vertical direction, and the acceleration corresponding to the touch operation on the virtual joystick area detected by the terminal is the acceleration of the virtual object. Specifically, the terminal can receive the touch instruction triggered by the touch operation and determine the direction indicated by the touch instruction as the direction of the acceleration of the virtual object. For example, if the direction indicated by the touch instruction corresponding to the touch operation is forward, the direction of the acceleration is the front of the virtual object; if the indicated direction is backward, the direction of the acceleration is the rear of the virtual object; if the indicated direction is the left front, the direction of the acceleration is the left front of the virtual object. The direction indicated by the touch instruction may be any direction within 360°, which is not enumerated one by one in the embodiment of the present invention.
  • Step 407: The terminal controls the virtual object to move in the virtual scene according to the acceleration.
  • certainly, the terminal may also control the virtual object to run, crawl, swim, and the like in the virtual scene according to the acceleration, and details are not described herein.
  • FIG. 9 is a flowchart of a virtual object control method according to an embodiment of the present invention.
  • the terminal may detect in each frame whether the virtual object has landed. When the virtual object has not landed, the terminal may determine the current orientation of the virtual object, calculate the acceleration vector corresponding to the current touch operation on the virtual object, combine that vector with the gravitational acceleration to obtain the final acceleration of the virtual object, and determine the position of the virtual object in the next frame according to the final acceleration and the motion state and position of the virtual object in the current frame. Then, in the next frame, the terminal again detects whether the virtual object has landed; once it detects that the virtual object has landed, the terminal may stop the calculation of the orientation, acceleration, and the like of the virtual object in the sky area.
  • the terminal further controls the orientation of the virtual object during the falling process, so that the virtual object adjusts its body orientation according to different motion states, more realistically simulating the different orientations a real person may present in the air. Further, the terminal controls the body posture of the virtual object during the falling process, so that the virtual object assumes a corresponding body posture in different scenes, more realistically simulating the actions of a real person in a real scene.
  • FIG. 10 is a schematic structural diagram of a virtual object control apparatus according to an embodiment of the present invention.
  • the device includes:
  • the determining module 1001 is configured to determine a viewing angle of the virtual scene when the touch operation on the control area is detected, where the control area is used to control a moving manner of the virtual object in the virtual scene;
  • the obtaining module 1002 is configured to acquire an acceleration of the virtual object during the falling process based on the touch operation of the control area, the viewing angle, and the acceleration of gravity;
  • the control module 1003 is configured to control the virtual object to fall in the virtual scene according to the acceleration.
  • the apparatus further comprises:
  • a detecting module configured to detect a location of the virtual object in the virtual scene
  • an execution module configured to perform the step of determining a perspective of the virtual scene when the virtual object is detected to be in a sky region in the virtual scene.
  • the acquiring module 1002 is further configured to: when detecting that the virtual object is located in a non-sky area in the virtual scene, acquire an acceleration corresponding to the touch operation based on a touch operation of the control area;
  • the control module 1003 is further configured to control the virtual object to move according to the acceleration in the virtual scene.
  • the apparatus further comprises:
  • an adjustment module configured to, when a viewing angle adjustment operation is detected, adjust the viewing angle according to the viewing angle adjustment operation, where the viewing angle adjustment operation is used to adjust the viewing angle of the virtual scene;
  • the adjustment module is further configured to adjust a first orientation of the virtual object according to the adjusted viewing angle, where the first orientation of the virtual object refers to the yaw angle in the attitude angle of the virtual object.
  • the obtaining module 1002 is configured to: obtain a first acceleration of the virtual object based on the touch operation on the control area and the viewing angle, and vector-sum the first acceleration and the gravitational acceleration to obtain the acceleration of the virtual object.
  • the obtaining module 1002 is configured to use the gravitational acceleration as the acceleration of the virtual object during the falling process when no touch operation on the control area is detected.
  • the obtaining module 1002 is configured to: when the touch command is a first touch command, obtain a first acceleration whose magnitude is a first preset threshold and whose direction is the viewing direction, where the first touch command is used to indicate that the virtual object is to accelerate.
  • the obtaining module 1002 is configured to: when the touch command is a second touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction is opposite to the viewing direction, where the second touch command is used to indicate that the virtual object is to decelerate.
  • the obtaining module 1002 is configured to: when the touch command is a third touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction points to the left side of the virtual object, where the third touch command is used to indicate that the virtual object is to move to the left.
  • the obtaining module 1002 is configured to: when the touch command is a fourth touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction points to the right side of the virtual object, where the fourth touch command is used to indicate that the virtual object is to move to the right.
  • control module 1003 is configured to adjust a second orientation of the virtual object during the falling process, where the second orientation of the virtual object refers to the pitch angle in the attitude angle of the virtual object.
  • control module 1003 is configured to: adjust the second orientation of the virtual object during the falling process so that the adjusted pitch angle of the virtual object is the same as the pitch angle of the viewing angle; or adjust the second orientation of the virtual object during the falling process so that the adjusted pitch angle of the virtual object is 0°.
  • control module 1003 is further configured to adjust a body posture of the virtual object according to the touch operation during the falling process.
  • the device provided by this embodiment of the present invention obtains the acceleration of the virtual object by detecting the touch operation on the virtual joystick area, and combines the touch operation, the viewing angle, and the gravitational acceleration to control the movement of the virtual object. The direction of the acceleration is not fixed and its magnitude is not zero, so the virtual object can move in any direction and can accelerate or decelerate, truly simulating the actions of a real person.
  • the virtual object control device provided in the above embodiment is illustrated only by the division of the above functional modules when controlling the virtual object. In actual applications, the functions may be assigned to different functional modules as needed; that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
  • the virtual object control device provided by the foregoing embodiment and the virtual object control method embodiment belong to the same concept; for the specific implementation process, refer to the method embodiment, and details are not described herein again.
  • the electronic device 1100 can be provided as a terminal.
  • the electronic device 1100 may vary considerably due to different configurations or performance, and may include one or more central processing units (CPU) 1101 and one or more memories 1102, where the memory 1102 stores at least one instruction that is loaded and executed by the processor 1101 to implement the following steps:
  • the processor 1101 is further configured to perform the step of determining the viewing angle of the virtual scene.
  • the processor 1101 is further configured to:
  • when a viewing angle adjustment operation is detected, the viewing angle is adjusted according to the viewing angle adjustment operation, where the viewing angle adjustment operation is used to adjust the viewing angle of the virtual scene;
  • the first orientation of the virtual object is adjusted according to the adjusted viewing angle, where the first orientation of the virtual object refers to the yaw angle in the attitude angle of the virtual object.
  • the processor 1101 is configured to: obtain a first acceleration of the virtual object based on the touch operation on the control area and the viewing angle, and vector-sum the first acceleration and the gravitational acceleration to obtain the acceleration of the virtual object.
  • the processor 1101 is further configured to: when no touch operation on the control area is detected, use the gravitational acceleration as the acceleration of the virtual object during the falling process.
  • the processor 1101 is configured to: when the touch command is a first touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction is the viewing direction, where the first touch command is used to indicate that the virtual object is to accelerate.
  • the processor 1101 is configured to: when the touch command is a second touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction is opposite to the viewing direction, where the second touch command is used to indicate that the virtual object is to decelerate.
  • the processor 1101 is configured to: when the touch command is a third touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction points to the left side of the virtual object, where the third touch command is used to indicate that the virtual object is to move to the left.
  • the processor 1101 is configured to: when the touch command is a fourth touch command, obtain a first acceleration whose magnitude is the first preset threshold and whose direction points to the right side of the virtual object, where the fourth touch command is used to indicate that the virtual object is to move to the right.
  • the processor 1101 is configured to perform:
  • the second orientation of the virtual object is adjusted during the falling process, where the second orientation of the virtual object refers to the pitch angle in the attitude angle of the virtual object.
  • the processor 1101 is configured to: adjust the second orientation of the virtual object during the falling process so that the adjusted pitch angle of the virtual object is the same as the pitch angle of the viewing angle; or adjust the second orientation of the virtual object during the falling process so that the adjusted pitch angle of the virtual object is 0°.
  • the processor 1101 is further configured to: during a falling process, adjust a body posture of the virtual object according to the touch operation.
  • the electronic device 1100 may also include components such as a wired or wireless network interface, a keyboard, and input/output devices.
  • the electronic device 1100 can also include other components for implementing the functions of the device, and details are not described herein.
  • a computer-readable storage medium storing a computer program is further provided, for example, a memory storing a computer program, where the computer program, when executed by a processor, implements the virtual object control method described above.
  • the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • the storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
PCT/CN2018/117034 2018-02-09 2018-11-22 虚拟对象控制方法、装置、电子装置及存储介质 Ceased WO2019153837A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020541985A JP7166708B2 (ja) 2018-02-09 2018-11-22 仮想オブジェクト制御方法、装置、電子機器、及び記憶媒体
US16/893,773 US11161041B2 (en) 2018-02-09 2020-06-05 Virtual object control method and apparatus, electronic apparatus, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810132647.8A CN108245887A (zh) 2018-02-09 2018-02-09 虚拟对象控制方法、装置、电子装置及存储介质
CN201810132647.8 2018-02-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/893,773 Continuation US11161041B2 (en) 2018-02-09 2020-06-05 Virtual object control method and apparatus, electronic apparatus, and storage medium

Publications (1)

Publication Number Publication Date
WO2019153837A1 true WO2019153837A1 (zh) 2019-08-15

Family

ID=62744880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117034 Ceased WO2019153837A1 (zh) 2018-02-09 2018-11-22 虚拟对象控制方法、装置、电子装置及存储介质

Country Status (4)

Country Link
US (1) US11161041B2 (en)
JP (1) JP7166708B2 (en)
CN (1) CN108245887A (en)
WO (1) WO2019153837A1 (en)


Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108245887A (zh) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 虚拟对象控制方法、装置、电子装置及存储介质
CN108744507B (zh) * 2018-05-18 2023-03-24 腾讯科技(深圳)有限公司 虚拟对象下落控制方法、装置、电子装置及存储介质
CN109542222B (zh) * 2018-11-13 2021-12-14 深圳市创凯智能股份有限公司 三维视角控制方法、装置、设备以及可读存储介质
CN110354489B (zh) 2019-08-08 2022-02-18 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、终端及存储介质
CN111298440A (zh) * 2020-01-20 2020-06-19 腾讯科技(深圳)有限公司 虚拟环境中的虚拟角色控制方法、装置、设备及介质
CN111298441A (zh) * 2020-01-21 2020-06-19 腾讯科技(深圳)有限公司 虚拟道具的使用方法、装置、设备及存储介质
JP7233399B2 (ja) * 2020-06-23 2023-03-06 任天堂株式会社 ゲームプログラム、ゲーム装置、ゲームシステム、およびゲーム処理方法
CN111913645B (zh) * 2020-08-17 2022-04-19 广东申义实业投资有限公司 三维图像展示的方法、装置、电子设备及存储介质
CN112121417B (zh) * 2020-09-30 2022-04-15 腾讯科技(深圳)有限公司 虚拟场景中的事件处理方法、装置、设备及存储介质
US11534681B2 (en) * 2020-10-29 2022-12-27 Google Llc Virtual console gaming controller
CN112717410B (zh) * 2021-01-21 2023-03-14 腾讯科技(深圳)有限公司 虚拟对象控制方法、装置、计算机设备及存储介质
CN113101667B (zh) * 2021-05-13 2023-02-28 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、设备及计算机可读存储介质
CN113926187A (zh) * 2021-10-20 2022-01-14 腾讯科技(深圳)有限公司 虚拟场景中的对象控制方法、装置及终端设备
CN114367107B (zh) * 2021-11-02 2025-11-07 腾讯科技(深圳)有限公司 虚拟场景的交互控制方法、装置、摇杆设备及电子设备
CN114082189B (zh) * 2021-11-18 2025-06-10 腾讯科技(深圳)有限公司 虚拟角色控制方法、装置、设备、存储介质及产品
CN115155060B (zh) * 2022-06-20 2025-11-25 网易(杭州)网络有限公司 游戏中虚拟角色的控制方法、装置、终端设备及存储介质
CN115220576B (zh) * 2022-06-21 2024-10-01 北京字跳网络技术有限公司 画面视角控制的方法、装置、设备和存储介质
CN117942565A (zh) * 2024-01-19 2024-04-30 网易(杭州)网络有限公司 游戏控制方法、装置和电子设备
US20250306675A1 (en) * 2024-03-29 2025-10-02 Beijing Zitiao Network Technology Co., Ltd. Method and apparatus for moving virtual object, electronic device, and storage medium


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5622337A (en) * 1995-06-08 1997-04-22 Unruh; Peter J. Method and reference point apparatus for training free fall parachutists
US8277316B2 (en) * 2006-09-14 2012-10-02 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
JP2014208258A (ja) 2014-06-12 2014-11-06 株式会社スクウェア・エニックス ビデオゲーム処理装置、およびビデオゲーム処理プログラム
JP2017035215A (ja) 2015-08-07 2017-02-16 株式会社あかつき 情報処理装置、情報処理システム、及びキャラクタ移動制御プログラム
KR101810834B1 (ko) * 2015-10-27 2017-12-20 (주)아레스 가상현실기반 스카이다이빙 체감 시뮬레이터 시스템
US20170354887A1 (en) * 2016-06-08 2017-12-14 Bruce Bollermann Systems & methods for parachute flight simulation
US20180067547A1 (en) * 2016-09-06 2018-03-08 Russell-Hampson, Inc. Virtual reality motion simulation system
CN107506122B (zh) * 2017-08-18 2019-12-17 网易(杭州)网络有限公司 虚拟对象调控的方法、装置及电子设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101089041B1 (ko) * 2011-04-21 2011-12-07 주식회사인스타 이동형 낙하 훈련 시뮬레이터 및 그 방법
CN105460223A (zh) * 2015-12-08 2016-04-06 中国人民解放军空军空降兵学院 跳伞模拟训练系统及其模拟训练方法
CN206672404U (zh) * 2016-12-07 2017-11-24 深圳天网虚拟现实科技开发有限公司 基于虚拟现实的跳伞体验模拟装置
CN206597301U (zh) * 2017-03-01 2017-10-31 蓝色智库(北京)科技发展有限公司 一种基于虚拟现实的实战伞降作战仿真模拟系统
CN107472543A (zh) * 2017-08-29 2017-12-15 深圳威阿科技有限公司 一种跳伞训练模拟系统
CN108245887A (zh) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 虚拟对象控制方法、装置、电子装置及存储介质

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112263830A (zh) * 2020-09-28 2021-01-26 上海米哈游天命科技有限公司 一种虚拟目标锁定方法、装置、设备和介质
CN112263830B (zh) * 2020-09-28 2023-04-07 上海米哈游天命科技有限公司 一种虚拟目标锁定方法、装置、设备和介质
JP2023507686A (ja) * 2020-11-19 2023-02-27 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 仮想オブジェクトの制御方法、装置、機器、記憶媒体及びコンピュータプログラム製品
US11803301B2 (en) 2020-11-19 2023-10-31 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, device, storage medium, and computer program product
JP7391448B2 (ja) 2020-11-19 2023-12-05 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 仮想オブジェクトの制御方法、装置、機器、記憶媒体及びコンピュータプログラム製品
US12366956B2 (en) 2020-11-19 2025-07-22 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, device, storage medium, and computer program product

Also Published As

Publication number Publication date
US11161041B2 (en) 2021-11-02
JP2021512418A (ja) 2021-05-13
US20200298121A1 (en) 2020-09-24
CN108245887A (zh) 2018-07-06
JP7166708B2 (ja) 2022-11-08

Similar Documents

Publication Publication Date Title
WO2019153837A1 (zh) 虚拟对象控制方法、装置、电子装置及存储介质
KR102565710B1 (ko) 가상 장면 디스플레이 방법, 전자 장치 및 저장 매체
US11224813B2 (en) Method and apparatus for controlling falling of virtual object, electronic device, and storage medium
US12179104B2 (en) Method and apparatus for displaying virtual environment picture, device, and storage medium
US20240269555A1 (en) Viewing angle adjustment method and device, electronic device, and computer-readable storage medium
KR102592632B1 (ko) 가상 환경에서 마크 정보를 생성하는 방법 및 장치, 전자 장치 및 저장 매체
US20220032191A1 (en) Virtual object control method and apparatus, device, and medium
CN110665230B (zh) 虚拟世界中的虚拟角色控制方法、装置、设备及介质
US12165254B2 (en) Method and apparatus for displaying virtual environment picture, device, and storage medium
KR20200091897A (ko) 가상 객체 이동 제어 방법 및 장치, 전자 장치, 그리고 저장 매체
WO2019153836A1 (zh) 虚拟环境中虚拟对象的姿态确定方法、装置及介质
WO2023160068A1 (zh) 虚拟对象控制方法、装置、设备及介质
CN110141850B (zh) 动作控制方法、装置、电子设备及存储介质
HK40054046B (zh) 虚拟技能的控制方法、装置、设备及计算机可读存储介质
HK40017681A (en) Method, apparatus, device, and medium for controlling virtual character in virtual world
HK40017681B (en) Method, apparatus, device, and medium for controlling virtual character in virtual world
HK40027382A (en) Method and apparatus for displaying virtual environment screen, device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18905865

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020541985

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18905865

Country of ref document: EP

Kind code of ref document: A1