WO2020173256A1 - Virtual scene display method, electronic device and storage medium - Google Patents

Virtual scene display method, electronic device and storage medium

Info

Publication number
WO2020173256A1
WO2020173256A1 (application PCT/CN2020/072853; CN2020072853W)
Authority
WO
WIPO (PCT)
Prior art keywords
target
virtual object
aiming point
rotation speed
viewing angle
Prior art date
Application number
PCT/CN2020/072853
Other languages
English (en)
French (fr)
Inventor
吴硕桓 (Wu Shuohuan)
刘智洪 (Liu Zhihong)
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to JP2021517776A (granted as JP7299312B2)
Priority to SG11202104916WA
Priority to KR1020217013177A (granted as KR102565710B1)
Publication of WO2020173256A1
Priority to US17/244,446 (granted as US11883751B2)
Priority to US18/523,979 (published as US20240091654A1)

Classifications

    • G06T 15/20 — 3D image rendering; geometric effects; perspective computation
    • A63F 13/53 — Controlling the output signals based on the game progress, involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/2145 — Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/219 — Input arrangements for aiming at specific areas on the display, e.g. light-guns
    • A63F 13/24 — Constructional details of input arrangements, e.g. game controllers with detachable joystick handles
    • A63F 13/422 — Mapping input signals into game commands automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
    • A63F 13/426 — Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/52 — Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/5258 — Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 13/803 — Special adaptations for driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A63F 13/837 — Special adaptations for shooting of targets
    • G06T 19/003 — Navigation within 3D models or images
    • A63F 13/92 — Video game devices specially adapted to be hand-held while playing

Definitions

  • This application relates to the field of computer technology, and in particular to a virtual scene display method, an electronic device, and a storage medium.
  • Background Art
  • Shooting games are a popular genre of electronic game.
  • This type of game usually displays an aiming point in the center of the terminal screen. By adjusting the viewing angle of the virtual scene, the user can change the area targeted by the aiming point and the currently displayed virtual scene.
  • In related virtual scene display methods, when a target virtual object is detected, a preset range around the target virtual object is used as an adsorption area, and when the aiming point is located in the adsorption area, the aiming point is moved directly onto the target virtual object.
  • In this approach, the user's operation intention is not considered, and the aiming point is moved directly onto the target virtual object.
  • The strength of such assisted aiming is very high. If the user does not want to aim at the target virtual object, or wants to move the aiming point away from it, the excessive drag of the above method makes it difficult to move the aiming point off the target virtual object. The displayed virtual scene is therefore decoupled from the user's operation, cannot meet user needs, and has a poor display effect.
  • The embodiments of the present application provide a virtual scene display method, an electronic device, and a storage medium, which can solve the problems in the related art of being decoupled from user operations, failing to meet user needs, and producing a poor display effect.
  • the technical solution is as follows:
  • a virtual scene display method which is applied to an electronic device, and the method includes:
  • when a viewing angle adjustment operation is detected, acquiring an adsorption area of a target virtual object; when the aiming point is located in the adsorption area of the target virtual object, acquiring a target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing angle adjustment operation;
  • displaying a target virtual scene based on the target rotation speed of the viewing angle.
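The claimed flow (detect the adjustment operation, check whether the aiming point lies in the adsorption area, derive a target rotation speed, then redraw the scene) can be sketched as follows. The function names, the linear attraction rule, and the blending weight are illustrative assumptions, not the application's actual formula.

```python
import math

def view_update(aim_pos, target_pos, absorb_radius, user_speed, assist_weight=0.5):
    """Sketch of the claimed flow: if the aiming point lies inside the
    adsorption area of the target virtual object, blend the user's
    rotation speed with an attraction speed toward the target;
    otherwise use the user's rotation speed unchanged."""
    dx = target_pos[0] - aim_pos[0]
    dy = target_pos[1] - aim_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= absorb_radius:  # aiming point inside the adsorption area
        # attraction grows as the aiming point nears the target (assumed rule)
        attract = (1.0 - dist / absorb_radius) * user_speed
        return user_speed + assist_weight * attract
    return user_speed

# inside the adsorption area: assisted speed exceeds the raw user speed
assert view_update((0, 0), (3, 4), 10.0, 2.0) == 2.5
# outside the adsorption area: the user's speed is used unchanged
assert view_update((0, 0), (30, 40), 10.0, 2.0) == 2.0
```

The key property matching the claim is that assistance only applies while the aiming point is inside the adsorption area; otherwise the viewing angle follows the user's operation alone.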
  • a virtual scene display method which is applied to an electronic device, and the method includes:
  • when the display mode of the virtual scene is switched from a first display mode to a second display mode, acquiring a target area corresponding to the aiming point, the second display mode being a sight-based display mode and the first display mode being a display mode other than the second display mode;
  • when a target virtual object is detected in the target area, displaying a target virtual scene in which the aiming point is located in the area where the target virtual object is located.
  • a virtual scene display device includes:
  • an acquiring module, configured to acquire the adsorption area of the target virtual object when a viewing angle adjustment operation is detected;
  • the acquiring module being further configured to, when the aiming point is located in the adsorption area of the target virtual object, obtain the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing angle adjustment operation;
  • a display module, configured to display the target virtual scene based on the target rotation speed of the viewing angle.
  • A virtual scene display apparatus includes:
  • an acquiring module, configured to acquire a target area corresponding to the aiming point when the display mode of the virtual scene is switched from a first display mode to a second display mode, the second display mode being a sight-based display mode;
  • the first display mode being a display mode other than the second display mode; and
  • a display module, configured to, when a target virtual object is detected in the target area, display a target virtual scene in which the aiming point is located in the area where the target virtual object is located.
  • An electronic device includes one or more processors and one or more memories, the one or more memories storing at least one instruction; the instruction is loaded and executed by the one or more processors to implement the operations performed by the virtual scene display method.
  • a computer-readable storage medium is provided, and at least one instruction is stored in the computer-readable storage medium, and the instruction is loaded and executed by a processor to implement operations performed by the virtual scene display method.
  • FIG. 1 is a schematic diagram of a terminal interface in which a virtual scene is not in a sight-based display mode according to an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a terminal interface in which a virtual scene is in a sight-based display mode according to an embodiment of the present application;
  • FIG. 3 is a flowchart of a virtual scene display method according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of the relationship between the size of the adsorption area and the distance between virtual objects according to an embodiment of the present application;
  • FIG. 5 is a schematic diagram of a terminal interface with a relatively close distance between an aiming point and a target virtual object according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of three kinds of adsorption areas provided by an embodiment of the present application;
  • FIG. 7 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a terminal interface on which a target virtual object is moving according to an embodiment of the present application;
  • FIG. 9 is a flowchart of viewing angle adjustment provided by an embodiment of the present application;
  • FIG. 10 is a flowchart of a virtual scene display method provided by an embodiment of the present application;
  • FIG. 11 is a schematic diagram of a target area corresponding to an aiming point according to an embodiment of the present application;
  • FIG. 12 is a schematic diagram of a target virtual scene provided by an embodiment of the present application;
  • FIG. 13 is a schematic structural diagram of a virtual scene display apparatus provided by an embodiment of the present application;
  • FIG. 14 is a schematic structural diagram of a virtual scene display apparatus provided by an embodiment of the present application;
  • FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the embodiments of this application mainly relate to electronic games or simulated training scenarios.
  • the user can perform operations on the terminal in advance. After the terminal detects the user's operation, it can download the game configuration file of the electronic game.
  • the configuration file may include the electronic game application, interface display data or virtual scene data, etc., so that the user can call the game configuration file when logging in the electronic game on the terminal to render and display the electronic game interface.
  • the user can perform a touch operation on the terminal. After the terminal detects the touch operation, it can obtain game data corresponding to the touch operation, and render and display the game data.
  • The game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
  • the virtual scene involved in this application can be used to simulate a three-dimensional virtual space, or can be used to simulate a two-dimensional virtual space, and the three-dimensional virtual space or the two-dimensional virtual space can be an open space.
  • The virtual scene can be used to simulate a real-world environment; for example, the virtual scene may include sky, land, and ocean, the land may include environmental elements such as deserts and cities, and the user can control virtual objects to move in the virtual scene.
  • the virtual object may be a virtual avatar used to represent the user, and the avatar may be in any form, for example, a person, an animal, etc., which is not limited in this application.
  • the virtual scene may also include other virtual objects, that is, the virtual scene may include multiple virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
  • The user can control the virtual object to fall freely in the sky of the virtual scene, glide, or open a parachute to fall; to run, jump, crawl, or bend forward on land; and to swim, float, or dive in the ocean.
  • the user can also control the virtual object to take a vehicle to move in the virtual scene.
  • The virtual props may be cold weapons (melee weapons) or hot weapons (firearms), which is not specifically limited in the embodiments of the present application.
  • the terminal screen may display the perspective picture of the virtual object controlled by the current terminal, and the terminal screen may also display the aiming point of the virtual object controlled by the current terminal.
  • The aiming point may be used to mark the aiming target in the viewpoint picture of the virtual object controlled by the current terminal, and the position of the aiming point in the virtual scene can be used as the attack point of the virtual object controlled by the current terminal.
  • the aiming point may be displayed at the center position of the terminal screen.
  • the aiming point may also be displayed at other positions, which is not specifically limited in the embodiment of the present application.
  • the display style of the aiming point may include multiple types, and the display style of the aiming point may be displayed by the system default, or it may be adjusted according to the user's setting.
  • When the user sees the aiming point displayed on the terminal, he can judge whether the position in the virtual scene corresponding to the current aiming point is the area he wants to aim at; if not, the user can adjust the viewing angle of the virtual scene through a viewing angle adjustment operation, thereby adjusting the area targeted by the aiming point.
  • The user usually wants to quickly and accurately move the aiming point onto the body of another virtual object in the virtual scene, so as to shoot, slash, or punch that virtual object.
  • the viewing angle adjustment operation may be multiple types of operations.
  • the viewing angle adjustment operation may be an operation for changing the position of the virtual object, that is, controlling the movement of the virtual object, so that the viewing angle is changed.
  • the user directly performs a viewing angle adjustment operation to change the viewing angle.
  • the embodiments of this application do not limit this.
  • the viewing angle adjustment operation may also include multiple operation modes.
  • The viewing angle adjustment operation may be a sliding operation; when the terminal detects the sliding operation, it may determine the rotation of the viewing angle based on the sliding direction, sliding distance, and sliding speed of the sliding operation.
  • the sliding direction of the sliding operation may correspond to the rotation direction of the viewing angle
  • the sliding distance of the sliding operation may be positively related to the rotation angle of the viewing angle.
  • The sliding speed of the sliding operation may also be positively related to the rotation speed of the viewing angle.
  • the viewing angle adjustment operation may also be a pressing operation.
  • A viewing angle adjustment area may be preset on the terminal, and the user may perform a pressing operation in that area. When the terminal detects a pressing operation in the viewing angle adjustment area, it can obtain the rotation direction, rotation speed, and rotation angle of the viewing angle corresponding to the pressing operation based on the position of the pressing operation relative to the viewing angle adjustment area, the pressing force of the pressing operation, and the pressing duration.
  • the direction of the pressing operation relative to the center of the viewing angle adjustment area may correspond to the rotation direction of the viewing angle
  • the pressing force of the pressing operation may be positively correlated with the rotation speed of the viewing angle
  • The pressing duration of the pressing operation may be positively correlated with the rotation angle of the viewing angle.
  • the viewing angle adjustment operation may also be a rotation operation on the terminal.
  • The terminal may detect the rotation operation through an angular velocity sensor, for example, a gyroscope.
  • the rotation direction of the rotation operation may be the rotation direction of the viewing angle
  • the rotation angle of the rotation operation may be positively related to the rotation angle of the viewing angle
  • the rotation speed of the rotation operation may be positively related to the rotation speed of the viewing angle.
  • the viewing angle adjustment operation may also be a key operation, a drag operation on the virtual joystick area, or a toggle operation on the real joystick device, etc., which is not specifically limited in this application.
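The sliding-operation mapping described above (sliding direction to rotation direction, sliding distance positively related to rotation angle, sliding speed positively related to rotation speed) can be sketched as below. The sensitivity constant `angle_per_px` and the function name are illustrative assumptions.

```python
def slide_to_rotation(dx, dy, duration_s, angle_per_px=0.25, min_duration=1e-6):
    """Map a sliding operation to a view rotation:
    sliding direction -> rotation direction (sign of dx, dy),
    sliding distance -> rotation angle (linear in pixels),
    sliding speed   -> rotation speed (angle over duration)."""
    yaw = dx * angle_per_px            # horizontal swipe turns the view left/right
    pitch = -dy * angle_per_px         # vertical swipe tilts the view up/down
    distance = (dx * dx + dy * dy) ** 0.5
    speed = distance * angle_per_px / max(duration_s, min_duration)
    return yaw, pitch, speed

yaw, pitch, speed = slide_to_rotation(dx=100, dy=0, duration_s=0.5)
assert yaw == 25.0 and pitch == 0.0   # 100 px right -> 25 degrees of yaw
assert speed == 50.0                  # 25 degrees over 0.5 s -> 50 deg/s
```

The pressing and device-rotation operations listed above would map onto the same (direction, angle, speed) triple through their own parameters (pressing force and duration, or gyroscope readings).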
  • For example, when the user's viewing angle adjustment operation is a sliding operation, the terminal may detect the pressing force of the operation during the sliding, and determine whether to shoot based on whether the pressing force is greater than a preset pressing force.
  • the virtual object can usually control the virtual props to fight with other virtual objects.
  • Some firearms can also be equipped with sights to observe the virtual scene based on the sights.
  • the sights can be mechanical sights, which refer to the observation equipment that is originally equipped on the firearms.
  • The sight may also be a sighting device subsequently mounted on the firearm prop, for example, a scope.
  • The sight may have a magnification, which may be 1 or a value greater than 1.
  • The sight may be a red dot sight, a holographic sight, a 2x scope, a 4x scope, an 8x scope, etc. The red dot sight and the holographic sight have a magnification of 1, while the 2x, 4x, and 8x scopes have magnifications greater than 1.
  • The magnification of the scope can also take other values; for example, the scope may be a 3x, 6x, or 15x scope.
  • The magnification of the scope is not specifically limited in the embodiments of the present application.
  • The sight is used to assist the virtual object in aiming and shooting. Therefore, when the virtual object controls a virtual prop to aim or shoot, the display mode of the virtual scene can be switched to the sight-based display mode, which makes it convenient to aim and shoot at enemy virtual objects more accurately.
  • As shown in FIG. 1, the virtual scene is not in the sight-based display mode.
  • When the user wants to control the virtual object to accurately shoot another virtual object appearing in the virtual scene, the display mode of the virtual scene can be switched to the sight-based display mode, as shown in FIG. 2, in which the virtual object observes the virtual scene through the sight on the virtual prop.
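In engine terms, switching to a sight-based display mode with magnification greater than 1 typically narrows the camera's field of view. A minimal sketch, assuming the common tangent-based zoom model (the application does not specify a formula):

```python
import math

def zoomed_fov(base_fov_deg, magnification):
    """Field of view in the sight-based display mode: scale the tangent of
    the half-angle by 1/magnification, so a magnification of 1 (red dot or
    holographic sight) leaves the view unchanged, while 2x/4x/8x scopes
    progressively narrow it."""
    half = math.radians(base_fov_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half) / magnification))

assert round(zoomed_fov(90.0, 1), 6) == 90.0   # 1x sight: no zoom
assert round(zoomed_fov(90.0, 4), 2) == 28.07  # 4x scope narrows the view
```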
  • Fig. 3 is a flowchart of a method for displaying a virtual scene provided by an embodiment of the present application. Referring to Fig. 3, the method may include the following steps:
  • 301. The terminal detects whether a target virtual object is included in the virtual scene; when the virtual scene includes the target virtual object, step 302 is performed, and when it does not, step 305 is performed.
  • the user can perform a viewing angle adjustment operation on the terminal to adjust the viewing angle of the virtual scene, so that the position in the virtual scene corresponding to the aiming point is changed, so that the viewing angle adjustment operation can change the aiming of the virtual object currently controlled by the terminal Location and attack point.
  • the above content has already explained the operation mode and type of the viewing angle adjustment operation, and the embodiments of the present application will not repeat them here, and the operation mode and type of the viewing angle adjustment operation are not limited.
  • an assisted aiming service can be provided to assist the user to quickly move the aiming point to the virtual object to be aimed at, so as to reduce the user's operation difficulty. Therefore, when the viewing angle adjustment operation is detected, the terminal can detect whether the target virtual object is included in the virtual scene, so as to preliminarily determine whether an auxiliary aiming service needs to be provided.
  • If the virtual scene does not include the target virtual object, that is, if no other virtual object exists in the field of view of the virtual object controlled by the current terminal, then the virtual object has no target to aim at or shoot, and the viewing angle adjustment operation may only be the user adjusting the viewing angle without aiming. In that case there is no need to provide the assisted aiming service, and the following step 305 may be executed to adjust the viewing angle directly based on the viewing angle adjustment operation.
  • the virtual scene includes the target virtual object, that is, if there are other virtual objects in the field of view of the virtual object controlled by the current terminal, the virtual object may want to target the other virtual object, or may not want to target the other virtual object.
  • the target virtual object may be any virtual object other than the virtual object controlled by the current terminal.
  • The virtual object controlled by the current terminal may also team up with other virtual objects as members of the same team. Since virtual objects in the same team do not need to be aimed at or shot, the target virtual object may also be any virtual object that does not belong to the team of the virtual object controlled by the current terminal.
  • the embodiment of the present application does not limit the specific judgment method of the target virtual object.
  • 302. The terminal acquires the adsorption area of the target virtual object. After determining that a target virtual object exists in the field of view of the virtual object controlled by the current terminal, the terminal may further determine whether the conditions for assisted aiming are met. When judging whether to provide the assisted aiming service, the distance between the aiming point and the target virtual object can be considered: if the aiming point is close to the target virtual object, the assisted aiming service can be provided for this viewing angle adjustment operation; if the aiming point is far from the target virtual object, the service need not be provided. In this way, the assisted aiming service reduces the complexity of user operations while ensuring the fairness of the electronic game.
  • an adsorption area may be set for the target virtual object, and the adsorption area is the location of the aiming point that can provide auxiliary aiming services when aiming the target virtual object. That is, when the aiming point is located in the adsorption area, it can assist the user to adjust the viewing angle, and move the position of the aiming point to the target virtual object, so as to achieve rapid aiming.
  • The adsorption area may be an area around the target virtual object, for example, an area centered on the target virtual object whose size is a target size.
  • the target size may be preset by relevant technicians.
  • the target size may also be obtained based on the distance between the virtual object controlled by the current terminal and the target virtual object.
  • this step 302 may be: the terminal obtains the adsorption area of the target virtual object according to the distance between the virtual object controlled by the current terminal and the target virtual object, and the size of the adsorption area is positively correlated with the distance. The larger the distance, the larger the size of the adsorption area, and the smaller the distance, the smaller the size of the adsorption area.
  • In FIG. 4, the display size of the first virtual object is very small, yet the display size of its adsorption area is not too small, so it is still relatively easy for the user to adjust the viewing angle through a control operation and move the aiming point into the adsorption area of the target virtual object, thereby obtaining the effect of assisted aiming.
  • The adsorption area of a farther target virtual object is larger in actual size, but its display size is smaller because of the distance; the adsorption area of a closer target virtual object is smaller in actual size, but its display size is larger because of the proximity. Thus the edge of the adsorption area of the farther target virtual object is farther from that object, while the edge of the adsorption area of the closer target virtual object is closer to that object: the actual sizes of the two adsorption areas are opposite to their current display effect.
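The positive correlation between adsorption-area size and distance described above can be sketched as a simple capped linear rule; the constants and the cap are illustrative tuning assumptions, not values from the application.

```python
def adsorption_radius(distance, base_radius=1.0, growth=0.0625, cap=5.0):
    """Adsorption-area radius grows with the distance between the
    controlled virtual object and the target virtual object, so that
    far-away targets (small on screen) remain easy to catch with the
    aiming point. A cap keeps the area from growing without bound."""
    return min(base_radius + growth * distance, cap)

assert adsorption_radius(10.0) == 1.625    # near target: small area
assert adsorption_radius(60.0) == 4.75     # far target: larger area
assert adsorption_radius(1000.0) == 5.0    # capped at the maximum radius
```

Because perspective projection shrinks distant objects faster than this radius grows, the on-screen (display) size of the far target's adsorption area can still be smaller, which is exactly the inversion noted above.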
  • the shape of the adsorption area may also be preset by relevant technicians.
  • The adsorption area may be a circular area centered on the target virtual object, or a polygonal area, such as a quadrilateral area, centered on the target virtual object.
  • The adsorption area may also be a spherical area centered on the target virtual object, or a cylindrical or polyhedral area centered on the target virtual object, which is not limited in the embodiments of the present application.
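Containment tests for two of the 3D adsorption-area shapes mentioned above can be sketched as follows, assuming (x, y, z) coordinates with y as the vertical axis (a convention not stated in the application).

```python
import math

def in_spherical_area(aim, center, radius):
    """Spherical adsorption area centered on the target virtual object."""
    return math.dist(aim, center) <= radius

def in_cylindrical_area(aim, center, radius, half_height):
    """Cylindrical adsorption area: circular in the horizontal (x, z)
    plane and bounded along the vertical (y) axis."""
    horizontal = math.hypot(aim[0] - center[0], aim[2] - center[2])
    return horizontal <= radius and abs(aim[1] - center[1]) <= half_height

assert in_spherical_area((1, 1, 1), (0, 0, 0), 2.0)
assert not in_spherical_area((3, 0, 0), (0, 0, 0), 2.0)
assert in_cylindrical_area((1, 0.5, 0), (0, 0, 0), 2.0, 1.0)
assert not in_cylindrical_area((0, 2.0, 0), (0, 0, 0), 2.0, 1.0)
```

A cylinder is a common practical choice for humanoid targets, since it roughly matches an upright character model better than a sphere.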
  • 303. The terminal obtains the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing angle adjustment operation.
  • After acquiring the adsorption area of the target virtual object, the terminal can determine the positional relationship between the aiming point and the adsorption area, and thereby determine, according to that positional relationship, whether the auxiliary aiming service needs to be provided. Specifically, the terminal can determine whether the aiming point is located in the adsorption area of the target virtual object; if so, the terminal can perform step 303 to provide the auxiliary aiming service; if not, the terminal can perform the following step 305 and adjust the viewing angle directly based on the viewing angle adjustment operation.
  • the terminal can comprehensively consider the position of the aiming point, the position of the target virtual object, and the viewing angle adjustment operation to obtain the target rotation speed of the viewing angle of the virtual scene. Specifically, the terminal may obtain the target rotation speed of the perspective of the virtual scene based on the first rotation speed corresponding to the viewing angle adjustment operation and the second rotation speed corresponding to the position of the aiming point and the target virtual object.
  • the process of obtaining the target rotation speed of the terminal in step 303 can be implemented through the following steps 1 to 3:
  • Step 1 The terminal obtains the first rotation speed of the viewing angle of the virtual scene according to the viewing angle adjustment operation.
  • the terminal may obtain the first rotation speed corresponding to the viewing angle adjustment operation.
  • the terminal may obtain the first rotation speed in a different manner.
  • If the viewing angle adjustment operation is a sliding operation, the terminal may obtain the first rotation speed of the viewing angle of the virtual scene according to the operating direction and operating distance of the operation; if the viewing angle adjustment operation is a pressing operation, the terminal may obtain the first rotation speed of the viewing angle of the virtual scene according to the pressing position and the pressing force or pressing duration of the operation; if the viewing angle adjustment operation is a rotation operation on the terminal, the terminal may obtain the first rotation speed of the viewing angle of the virtual scene according to the rotation speed of the terminal, or obtain the first rotation speed according to the rotation angle and rotation direction of the terminal.
  • the viewing angle adjustment operation may also be another type of operation, which is not limited in the embodiment of the present application.
  • the first rotation speed acquired by the terminal may be different.
  • the first rotation speed may be positively correlated with the distance between the aiming point and the center of the adsorption area. The greater the distance between the aiming point and the center of the adsorption area, the greater the first rotation speed; the smaller the distance between the aiming point and the center of the adsorption area, the lower the first rotation speed.
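One way to realize a first rotation speed that is positively correlated with the aiming point's distance from the center of the adsorption area is sketched below. The drag-distance mapping, the linear scaling factor, and the function name are all assumptions for illustration, not the source's implementation.

```python
def first_rotation_speed(drag_dx, drag_dy, sensitivity,
                         dist_to_center, area_radius):
    """First rotation speed (degrees per second) for a drag operation.

    Base speed is proportional to the drag distance (one possible
    mapping for a sliding operation; mapping and constants assumed).
    The result is positively correlated with the aiming point's
    distance from the center of the adsorption area: full speed at
    the edge, reduced speed near the center.
    """
    base = sensitivity * (drag_dx ** 2 + drag_dy ** 2) ** 0.5
    factor = min(1.0, dist_to_center / area_radius)
    return base * factor
```

With this sketch, the same drag produces a lower speed when the aiming point is near the area's center than when it is near the edge.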
  • In one possible implementation, the target virtual object may include multiple adsorption areas; it can also be understood that the adsorption area includes multiple sub-areas. For different adsorption areas, the first rotation speed corresponding to the viewing angle adjustment operation is different; that is, when the aiming point is located in different adsorption areas, the terminal obtains different first rotation speeds according to the viewing angle adjustment operation.
  • the target virtual object may include a first adsorption area, a second adsorption area, and a third adsorption area. Accordingly, the process of acquiring the first rotation speed by the terminal may include the following three situations:
  • Case 1 When the aiming point is located in the first adsorption area of the target virtual object, the terminal obtains the first preset rotation speed of the viewing angle of the virtual scene according to the viewing angle adjustment operation.
  • the first preset rotation speed is the normal rotation speed corresponding to the viewing angle adjustment operation when the assisted aiming service is not provided.
  • the terminal may not adjust the rotation speed corresponding to the viewing angle adjustment operation.
  • the terminal can obtain the first preset rotation speed.
  • Case 2 When the aiming point is located in the second adsorption area of the target virtual object, the terminal obtains the second preset rotation speed of the viewing angle of the virtual scene according to the viewing angle adjustment operation, and the second preset rotation speed is less than the first The preset rotation speed.
  • the first adsorption area surrounds the second adsorption area. That is, the size of the second adsorption area is smaller than the first adsorption area, and the second adsorption area is closer to the target virtual object than the first adsorption area.
  • Based on the viewing angle adjustment operation, the terminal obtains a rotation speed that is less than the first preset rotation speed, so that when the aiming point approaches the target virtual object, the rotation speed of the viewing angle can be reduced. This assists the user in moving the aiming point onto the target virtual object more easily and keeping it there, and makes it harder to accidentally move the aiming point past the target virtual object.
  • the aiming point is very close to the target virtual object, and the rotation speed of the viewing angle needs to be reduced to facilitate the aiming point to stay on the target virtual object more easily.
  • In one possible implementation, different sensitivities may be set for the viewing angle adjustment operation. The sensitivity may refer to the ratio between the moving distance of the controlled virtual object and the user's operating range or operating distance, or the ratio between the rotation speed of the viewing angle and the user's operating range or operating distance. The sensitivity is positively correlated with the preset rotation speed corresponding to the viewing angle adjustment operation; that is, when the sensitivity is high, the preset rotation speed corresponding to the user's viewing angle adjustment operation is high, and vice versa.
  • When the sensitivities are different, the preset rotation speed obtained by the terminal based on the viewing angle adjustment operation is different. For example, when the aiming point is located in the first adsorption area, the viewing angle adjustment operation corresponds to the first sensitivity; when the aiming point is located in the second adsorption area, the viewing angle adjustment operation corresponds to the second sensitivity, and the second sensitivity is less than the first sensitivity.
  • the first sensitivity and the second sensitivity can be preset by relevant technicians according to requirements, or can be adjusted by users according to their own usage habits, which are not limited in the embodiment of the present application.
  • The second sensitivity may be obtained based on the first sensitivity, for example by taking the product of the first sensitivity and a target coefficient, or by taking the difference between the first sensitivity and a target value; the embodiment of the present application does not limit this.
  • the second preset rotation speed may be obtained based on the first preset rotation speed.
  • the terminal may obtain the difference between the first preset rotation speed and a first value, and use the difference as the second preset rotation speed, where the first value is a positive number.
  • the terminal may obtain the product of the first preset rotation speed and the first coefficient, and use the product as the second preset rotation speed, and the first coefficient is a positive number less than 1.
  • the terminal may also obtain the second preset rotation speed in other ways.
  • In one possible implementation, multiple value groups may be set for the viewing angle adjustment operation, and each value group includes multiple different values.
  • The value group corresponding to the viewing angle adjustment operation is different in different cases, and the embodiment of the present application does not limit the specific implementation method.
  • Case 3 When the aiming point is located in the third adsorption area of the target virtual object, the terminal obtains the third preset rotation speed of the viewing angle of the virtual scene according to the viewing angle adjustment operation; the third preset rotation speed is less than the first preset rotation speed and different from the second preset rotation speed.
  • Case 3 is similar to case 2 above, and the process of obtaining the third preset rotation speed by the terminal is the same as the process of obtaining the second preset rotation speed in case 2. The difference is that in case 3, the third preset rotation speed acquired by the terminal is different from the second preset rotation speed; that is, the third sensitivity corresponding to the viewing angle adjustment operation when the aiming point is located in the third adsorption area is different from the second sensitivity corresponding to the viewing angle adjustment operation when the aiming point is located in the second adsorption area.
  • the third sensitivity is less than the first sensitivity.
  • a damping coefficient is added to the original sensitivity, and the damping coefficient refers to a coefficient that reduces the sensitivity.
  • the sensitivity corresponding to the viewing angle adjustment operation becomes smaller.
  • the preset rotation speed corresponding to the viewing angle adjustment operation also becomes smaller, so it is not easy for the user to move the aiming point out of the third adsorption area.
  • the specific values of sensitivity and damping coefficient are not specifically limited.
  • In this way, the user can adjust the area targeted by the aiming point more accurately through the above viewing angle adjustment operation; that is, a large control operation produces only a small adjustment of the viewing angle, and thus a small adjustment of the aiming point, for example, fine adjustment of which body part of the target virtual object is aimed at. This also prevents the user from moving the aiming point away from the target virtual object too quickly because the operation range is too large.
  • the second adsorption area surrounds the third adsorption area. That is, the size of the third adsorption area is smaller than the size of the second adsorption area.
  • the third sensitivity may be less than the second sensitivity, that is, the third preset rotation speed is less than the second preset rotation speed. In this way, the closer the aiming point is to the target virtual object, the smaller the sensitivity corresponding to the viewing angle adjustment operation, and the smaller the preset rotation speed corresponding to the viewing angle adjustment operation.
  • the third adsorption area may be the area where the target virtual object is located.
  • Since the third preset rotation speed may be lower than the second preset rotation speed, this can, to a certain extent, prevent an excessively large user operation range from causing the aiming point to deviate from the target virtual object.
  • the third sensitivity may also be greater than the second sensitivity, that is, the third preset rotation speed may be greater than the second preset rotation speed.
  • the third sensitivity may also be acquired based on the second sensitivity, and the acquisition process is the same as the above-mentioned process of acquiring the second sensitivity based on the first sensitivity, and will not be repeated here in the embodiment of the present application.
  • the third preset rotation speed may also be acquired based on the first preset rotation speed.
  • the terminal may acquire the difference between the first preset rotation speed and a second value, and use the difference as the third preset rotation speed, where the second value is a positive number.
  • the terminal may obtain the product of the first preset rotation speed and the second coefficient, and use the product as the third preset rotation speed, where the second coefficient is a positive number less than 1.
  • the third preset rotation speed may also be obtained based on the second preset rotation speed, in the same manner as in the foregoing implementation manner, and details are not repeated in the embodiment of the present application.
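Cases 1 to 3 above amount to selecting a preset rotation speed according to which adsorption area the aiming point falls in. A minimal sketch, assuming the second and third preset speeds are derived from the first by damping coefficients; the coefficient values are invented for the example, since the source only requires that the second be less than the first and the third differ from the second.

```python
# Assumed example values: the source does not fix these numbers.
FIRST_PRESET = 30.0    # normal rotation speed, degrees per second
SECOND_DAMPING = 0.5   # first coefficient (assumed, 0 < c < 1)
THIRD_DAMPING = 0.25   # second coefficient (assumed, 0 < c < 1)

def preset_rotation_speed(zone):
    """Preset rotation speed for the viewing angle adjustment operation,
    selected by the adsorption area (zone) containing the aiming point."""
    if zone == 1:      # first (outermost) adsorption area: normal speed
        return FIRST_PRESET
    if zone == 2:      # second adsorption area: damped speed
        return FIRST_PRESET * SECOND_DAMPING
    if zone == 3:      # third adsorption area (around the target itself)
        return FIRST_PRESET * THIRD_DAMPING
    raise ValueError("aiming point is outside every adsorption area")
```

With these assumed coefficients the speed drops as the aiming point moves inward, matching the variant in which the third sensitivity is less than the second.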
  • the target virtual object includes three adsorption regions, and the three adsorption regions are only an exemplary description, and the embodiment of the present application does not limit the shape and size of the three adsorption regions.
  • The above description takes the case where the adsorption area of the target virtual object includes three areas as an example; the adsorption area of the target virtual object may also include more areas, or fewer than three.
  • the setting of the adsorption area can be adjusted by relevant technical personnel according to requirements, which is not limited in the embodiment of the present application.
  • In one possible implementation, when determining whether the aiming point is located in the adsorption area of the target virtual object, the terminal may be provided with a collider for each virtual object, where the collider is used to detect the adsorption area of the virtual object. The terminal may perform ray detection based on the direction of the aiming point; if the ray collides with the collider of a virtual object, it can be determined that the aiming point is located in the adsorption area of that virtual object.
  • the collider and ray detection method is only one possible implementation method, and the terminal can also judge according to the position coordinates of the aiming point and the coordinate range of the adsorption area. The embodiment of the present application does not limit which method is used.
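The coordinate-based alternative to collider/ray detection mentioned above can be as simple as a containment test. The sketch below assumes, purely for illustration, a circular adsorption area projected on the screen; the function name and the circular shape are not from the source.

```python
import math

def aiming_point_in_area(aim_x, aim_y, target_x, target_y, radius):
    """Judge, from position coordinates alone, whether the aiming point
    lies within the (assumed circular) adsorption area of a target
    whose projected center is (target_x, target_y)."""
    return math.hypot(aim_x - target_x, aim_y - target_y) <= radius
```

The same test generalizes to other shapes (quadrilateral, sphere, cylinder) by swapping the containment predicate.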
  • Step 2 The terminal obtains a second rotation speed of the view angle of the virtual scene according to the position of the aiming point and the position of the target virtual object, and the direction of the second rotation speed is from the aiming point toward the target virtual object.
  • the terminal may also consider the positional relationship between the aiming point and the target virtual object to determine how to provide further assistance based on the rotation speed of the corresponding viewing angle of the user operation.
  • the terminal may obtain the distance between the aiming point and the target virtual object according to the position of the aiming point and the position of the target virtual object.
  • In the embodiment of the present application, the distance between the aiming point and the target virtual object may be expressed in various ways. The following explains step 2 using three different ways of expressing this distance:
  • Manner 1 The terminal obtains the second rotation speed of the viewing angle of the virtual scene according to the distance between the target virtual object and the aiming point projected on the terminal screen.
  • The distance between the aiming point and the target virtual object can be represented by the distance between their projections on the terminal screen. There is a conversion relationship between this distance and the second rotation speed, and the terminal can calculate the second rotation speed according to the distance and the conversion relationship.
  • the second rotation speed and the distance between the target virtual object and the aiming point projected on the terminal screen are negatively correlated. That is, the smaller the distance, the greater the second rotation speed; the greater the distance, the lower the second rotation speed.
  • In this way, when the aiming point approaches the target virtual object, the strength of the auxiliary aiming can be continuously increased, helping the user quickly aim at the target virtual object while fully respecting the user's operation, since the auxiliary aiming is performed on the basis of the user's operation.
  • the second rotation speed may also be positively related to the projected distance of the target virtual object and the aiming point on the terminal screen, which is not limited in this embodiment of the application.
  • Manner 2 The terminal obtains the second rotation speed of the viewing angle of the virtual scene according to the distance between the target virtual object and the aiming point in the virtual scene.
  • The distance between the aiming point and the target virtual object can be represented by the distance between the two in the virtual scene. There is a conversion relationship between this distance and the second rotation speed, and the terminal can calculate the second rotation speed according to the distance and the conversion relationship.
  • the second rotation speed and the distance between the target virtual object and the aiming point in the virtual scene are negatively correlated. That is, the smaller the distance, the greater the second rotation speed; the greater the distance, the lower the second rotation speed. In this way, when the aiming point is close to the target virtual object, the strength of the auxiliary aiming can be continuously improved, and the user can quickly aim at the target virtual object.
  • the second rotation speed may also be positively correlated with the distance between the target virtual object and the aiming point in the virtual scene, which is not limited in this embodiment of the application.
  • Manner 3 The terminal obtains the second rotation speed of the viewing angle of the virtual scene according to the angle between the connection direction of the virtual object controlled by the current terminal and the target virtual object and the direction of the aiming point.
  • The distance between the aiming point and the target virtual object can be represented by the angle between the direction of the line connecting the virtual object controlled by the current terminal to the target virtual object and the direction of the aiming point. There is a conversion relationship between the included angle and the second rotation speed, and the terminal can calculate the second rotation speed according to the included angle and the conversion relationship.
  • the second rotation speed is negatively related to the included angle. That is, the larger the included angle, the lower the second rotation speed; the smaller the included angle, the greater the second rotation speed. In this way, when the aiming point is close to the target virtual object, the strength of auxiliary aiming can be continuously increased, and the user can quickly aim at the target virtual object.
  • the second rotation speed may also be positively correlated with the included angle, which is not limited in the embodiment of the present application.
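Manners 1 to 3 all reduce to a negative correlation between the second rotation speed and some expression of the aiming-point-to-target distance. A sketch with an assumed reciprocal conversion relationship; the constant K, the epsilon guard, and the reciprocal form are illustrative choices, not the source's conversion relationship.

```python
K = 120.0    # conversion constant (assumed)
EPS = 1e-6   # guard against division by zero when perfectly aligned

def second_speed_screen(dist_px):
    """Manner 1: negatively correlated with the projected distance
    between target and aiming point on the terminal screen (pixels)."""
    return K / (dist_px + EPS)

def second_speed_scene(dist_m):
    """Manner 2: negatively correlated with the distance between
    target and aiming point in the virtual scene (meters)."""
    return K / (dist_m + EPS)

def second_speed_angle(angle_deg):
    """Manner 3: negatively correlated with the angle between the
    player-to-target direction and the aiming direction (degrees)."""
    return K / (angle_deg + EPS)
```

All three sketches share the property the text requires: the closer the aiming point is to the target, the larger the second rotation speed.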
  • The distance may also be expressed in other ways. For example, it may be the horizontal distance between the projections of the aiming point and the target virtual object on the terminal screen, that is, the horizontal component of the projected distance; or it may be the horizontal distance between the aiming point and the target virtual object in the virtual scene, that is, the horizontal component of the distance between the two in the virtual scene.
  • the embodiments of this application do not limit the specific expression method used.
  • In one possible implementation, the process of acquiring the second rotation speed in step 2 may also consider the operating direction of the viewing angle adjustment operation; that is, when the operating direction of the viewing angle adjustment operation is different, the second rotation speed obtained by the terminal according to the position of the aiming point and the position of the target virtual object may also be different.
  • the second step may be: the terminal obtains the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the operating direction of the viewing angle adjustment operation.
  • Specifically, the viewing angle adjustment operation can be divided into two operating directions: one controls the aiming point to move toward the target virtual object, and the other controls the aiming point to move away from the target virtual object. The following describes the process of obtaining the second rotation speed by the terminal in these two cases.
  • Case 1 When the operating direction of the viewing angle adjustment operation indicates that the aiming point moves toward the target virtual object, the terminal obtains a third rotation speed as the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and a first parameter. Case 2 When the operating direction of the viewing angle adjustment operation indicates that the aiming point moves away from the target virtual object, the terminal obtains a fourth rotation speed as the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and a second parameter.
  • The first parameter and the second parameter can be set by relevant technicians according to requirements. The conversion relationship among the distance between the aiming point and the target virtual object, the first parameter, and the third rotation speed can also be set by relevant technicians, as can the conversion relationship among that distance, the second parameter, and the fourth rotation speed; these are not limited in the embodiment of the present application. If the viewing angle adjustment operation moves the aiming point closer to the target virtual object, a larger second rotation speed, that is, the third rotation speed, can be provided, and the auxiliary aiming is stronger; if the viewing angle adjustment operation moves the aiming point away from the target virtual object, a lower second rotation speed, that is, the fourth rotation speed, can be provided, and the auxiliary aiming is weaker.
  • The directions of the third rotation speed and the fourth rotation speed are the same: both point from the aiming point toward the target virtual object; only their magnitudes differ.
  • For example, the target virtual object is directly to the left of the aiming point. If the viewing angle adjustment operation instructs the aiming point to move directly to the left, the aiming point approaches the target virtual object and the distance between the two decreases.
  • Assuming the first rotation speed corresponding to the viewing angle adjustment operation is 30 degrees per second, a third rotation speed can be added to the first rotation speed; the third rotation speed can be 10 degrees per second, and its direction is the same as that of the first rotation speed.
  • When the viewing angle adjustment process is embodied as a change in the position of the aiming point in the virtual scene displayed in adjacent frames, it can be: the viewing angle adjustment operation can control the aiming point to move 90 meters to the left in the next frame compared with the previous frame, and the third rotation speed can make the aiming point move an additional 30 meters to the left.
  • If the viewing angle adjustment operation instructs the aiming point to move to the right, the aiming point moves away from the target virtual object and the distance between the two increases. Assuming the first rotation speed corresponding to the viewing angle adjustment operation is 30 degrees per second, a fourth rotation speed may be added to the first rotation speed, and the fourth rotation speed may be 3 degrees per second, in the direction opposite to the first rotation speed. When the viewing angle adjustment process is embodied as a change in the position of the aiming point in the virtual scene displayed in adjacent frames, it can be: the viewing angle adjustment operation can control the aiming point to move 90 meters to the right in the next frame compared with the previous frame, while the fourth rotation speed can make the aiming point move 9 meters to the left, so the aiming point actually moves 81 meters to the right.
  • the fourth rotation speed is lower than the third rotation speed, and the strength of the auxiliary aiming becomes smaller.
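The direction-dependent assistance can be restated with the example's own numbers: 30 degrees per second corresponds to 90 meters of aiming-point movement per frame, the third rotation speed is 10 degrees per second when approaching, and the fourth is 3 degrees per second when moving away. The proportionality between degrees per second and meters per frame is taken from the example itself; the function and constant names are assumptions.

```python
FIRST = 30.0     # first rotation speed, degrees per second
THIRD = 10.0     # assist speed toward the target (same direction)
FOURTH = 3.0     # assist speed when moving away (opposes the operation)
M_PER_DEG = 3.0  # 30 deg/s corresponds to 90 m per frame in the example

def frame_displacement(toward_target):
    """Per-frame aiming-point displacement in meters, positive along
    the operating direction of the viewing angle adjustment operation."""
    if toward_target:
        # Assist adds to the operation: 90 m + 30 m toward the target.
        return (FIRST + THIRD) * M_PER_DEG
    # Assist opposes the operation: 90 m - 9 m away from the target.
    return (FIRST - FOURTH) * M_PER_DEG
```

Moving toward the target thus covers 120 m per frame, while moving away covers only 81 m, which is the asymmetry the text describes.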
  • In one possible implementation, when the terminal adopts the expression in manner 1 or manner 2 above, considering that when the distance between the aiming point and the target virtual object is the same but the distance between the virtual object currently controlled by the terminal and the target virtual object is different, the angle the viewing angle needs to rotate to move the aiming point onto the target virtual object is different, the terminal can also obtain the distance between the virtual object controlled by the current terminal and the target virtual object, and take this distance into consideration when obtaining the second rotation speed in step 2 above.
  • Specifically, there may be a conversion relationship among the distance between the virtual object controlled by the current terminal and the target virtual object, the distance between the aiming point and the target virtual object, and the second rotation speed, thereby ensuring that when the distance between the aiming point and the target virtual object is the same but the distance between the virtual object controlled by the current terminal and the target virtual object is different, a different second rotation speed is acquired.
  • When the distance between the aiming point and the target virtual object is the same: if the distance between the virtual object controlled by the current terminal and the target virtual object is large, the angle the viewing angle needs to rotate is small, so a smaller second rotation speed is acquired; if the distance between the virtual object controlled by the current terminal and the target virtual object is small, the angle the viewing angle needs to rotate is large, so a larger second rotation speed is acquired. In this way, the same auxiliary effect can be provided when the distance between the virtual object controlled by the current terminal and the target virtual object is different.
  • Accordingly, manner 1 above may be: the terminal obtains the second rotation speed of the viewing angle of the virtual scene according to the distance between the virtual object controlled by the current terminal and the target virtual object and the distance between the projections of the target virtual object and the aiming point on the terminal screen, where the second rotation speed is negatively correlated with the distance between the virtual object controlled by the current terminal and the target virtual object.
  • Manner 2 above may be: the terminal obtains the second rotation speed of the viewing angle of the virtual scene according to the distance between the virtual object controlled by the current terminal and the target virtual object and the distance between the target virtual object and the aiming point in the virtual scene, where the second rotation speed is negatively correlated with the distance between the virtual object controlled by the current terminal and the target virtual object.
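The compensation by the distance between the controlled virtual object and the target can be sketched by making the second rotation speed negatively correlated with both distances at once. The product form, the constant, and the epsilon guard are assumptions, not the source's conversion relationship.

```python
def second_speed_compensated(aim_target_dist, object_dist, k=500.0):
    """Second rotation speed negatively correlated with both the
    aiming-point-to-target distance and the distance between the
    controlled virtual object and the target, so that far and near
    targets receive a comparable auxiliary effect (constants assumed).
    """
    return k / ((aim_target_dist + 1e-6) * (object_dist + 1e-6))
```

For a fixed aiming-point-to-target distance, a farther target (which needs a smaller rotation angle) gets a smaller second rotation speed, as the text requires.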
  • the foregoing provides multiple implementation manners for the terminal to obtain the second rotation speed.
  • the terminal may adopt any implementation manner to obtain the second rotation speed, or any combination of the foregoing multiple implementation manners may be used to obtain the second rotation speed.
  • Both the operating direction of the viewing angle adjustment operation and the distance between the virtual object controlled by the current terminal and the target virtual object can be considered, so that the terminal obtains the second rotation speed based on the operating direction of the viewing angle adjustment operation, the distance between the virtual object controlled by the current terminal and the target virtual object, the position of the aiming point, and the position of the target virtual object; this is not limited in the embodiment of the present application and will not be repeated here.
  • It should be noted that the terminal may perform step 1 first and then step 2, perform step 2 first and then step 1, or perform step 1 and step 2 at the same time; the embodiment of the present application does not limit the execution timing of the two steps.
  • In a possible scenario, the terminal does not execute the step of obtaining the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point and the position of the target virtual object; that is, step 2 is not executed, or zero is used as the second rotation speed. Accordingly, the following step 3 may be: the terminal uses the first rotation speed as the target rotation speed of the viewing angle of the virtual scene.
  • When the aiming point is located in the first adsorption area, the first rotation speed of the viewing angle corresponding to the viewing angle adjustment operation may be the normal rotation speed corresponding to the user operation; on the basis of the first rotation speed, the second rotation speed may be obtained based on the position of the aiming point and the position of the target virtual object, and the first rotation speed and the second rotation speed are synthesized to obtain the target rotation speed. When the aiming point is located in the second adsorption area, the first rotation speed of the viewing angle corresponding to the viewing angle adjustment operation is lower than the rotation speed corresponding to normal user operation; on the basis of the first rotation speed, the terminal may likewise obtain the second rotation speed based on the position of the aiming point and the position of the target virtual object, and synthesize the two to obtain the target rotation speed. When the aiming point is located in the third adsorption area, the terminal may only perform step 1: the first rotation speed of the viewing angle corresponding to the viewing angle adjustment operation is lower than the rotation speed corresponding to normal user operation, and the first rotation speed is used as the target rotation speed.
  • Step 3 The terminal obtains the target rotation speed of the viewing angle of the virtual scene based on the first rotation speed and the second rotation speed.
  • the two rotation speeds can be synthesized to obtain the target rotation speed of the viewing angle of the virtual scene.
  • the target rotation speed is the rotation speed required to adjust the viewing angle.
  • In one possible implementation, weights may be set for the first rotation speed and the second rotation speed, and the terminal may perform a weighted summation of the first rotation speed and the second rotation speed to obtain the target rotation speed of the viewing angle of the virtual scene. The weights of the first rotation speed and the second rotation speed can be preset by relevant technicians according to requirements, can be obtained based on the position of the aiming point and the position of the target virtual object, or can be obtained based on the distance between the virtual object controlled by the current terminal and the target virtual object; this is not limited in the embodiment of the present application.
  • In one possible implementation, step 3 may be: the terminal sums the first rotation speed and the second rotation speed to obtain the target rotation speed of the viewing angle of the virtual scene.
  • both the first rotation speed and the second rotation speed may be vectors, and the directions of the first rotation speed and the second rotation speed may be the same or different.
  • Generally, the direction of the first rotation speed is the direction corresponding to the viewing angle adjustment operation, and the direction of the second rotation speed is the direction from the aiming point toward the target virtual object. Therefore, step 3 may be: the terminal performs a vector summation of the first rotation speed and the second rotation speed to obtain the target rotation speed.
  • the following two extreme cases are taken as examples for description.
  • In the first case, the directions of the first rotation speed and the second rotation speed are the same. For example, the aiming point is directly to the left of the target virtual object, and the operating direction of the viewing angle adjustment operation is directly to the right, that is, the aiming point is controlled to move to the right; then the direction of the first rotation speed is directly to the right, and the direction of the second rotation speed, which points from the aiming point toward the target virtual object, is also directly to the right. In this case, the value of the target rotation speed may be the sum of the value of the first rotation speed and the value of the second rotation speed, and the direction of the target rotation speed is directly to the right.
  • In the second case, the directions of the first rotation speed and the second rotation speed are opposite: the aiming point is directly to the left of the target virtual object, and the operating direction of the viewing angle adjustment operation is to the left, that is, the aiming point is controlled to move in the left direction.
  • The direction of the first rotation speed is then positive left, while the direction of the second rotation speed, which points from the aiming point toward the target virtual object, is positive right.
  • In this case, the value of the target rotation speed may be the difference between the values of the first rotation speed and the second rotation speed, and the direction of the target rotation speed depends on the magnitude relationship between the two values.
  • In some embodiments, the first rotation speed may be greater than the second rotation speed, which ensures respect for user operations, preserves the fairness and impartiality of the electronic game, and provides a better gaming experience.
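The vector summation in the two extreme cases above can be sketched as follows; representing the speeds as 2-D screen-space vectors with +x pointing right is an assumed convention.

```python
def vector_sum(v1, v2):
    # Component-wise sum of the first and second rotation speed vectors.
    return (v1[0] + v2[0], v1[1] + v2[1])

# Case 1: both speeds point right -> magnitudes add, direction stays right.
same_dir = vector_sum((3.0, 0.0), (1.0, 0.0))

# Case 2: the user turns left while the assist pulls right -> magnitudes
# subtract; because the user-driven first speed is larger, the result
# still points left, respecting the user's operation.
opposite_dir = vector_sum((-3.0, 0.0), (1.0, 0.0))
```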
  • The above step 303 is the process of providing the auxiliary aiming service when the aiming point is located in the adsorption area of the target virtual object; in this case the terminal executes the following step 304 to adjust the viewing angle.
  • When the aiming point is not located in the adsorption area of the target virtual object, the terminal may perform step 305 without providing the auxiliary aiming service, performing normal viewing angle adjustment directly based on the viewing angle adjustment operation.
  • the terminal displays the target virtual scene based on the target rotation speed of the viewing angle.
  • the terminal can adjust the viewing angle based on the target rotation speed, and display the adjusted target virtual scene.
  • The specific process of the terminal adjusting the viewing angle of the virtual scene may be as follows: the terminal calculates the rotation angle of the viewing angle within a preset time interval according to the target rotation speed of the viewing angle, and then controls the viewing angle to rotate by that rotation angle.
  • the preset time interval refers to the time interval between adjacent frames.
  • the preset time interval can be preset by a technician, or can be set and adjusted by the user according to the operating conditions of the device.
  • step 301 to step 304 is a dynamic viewing angle adjustment process.
  • The terminal can execute the above step 301 to step 304 in each frame. After calculating the target rotation speed of the viewing angle in a frame, the terminal can calculate, based on this target rotation speed, the rotation angle of the viewing angle from that frame to the next frame, determine the viewing angle direction in the next frame, and thus render and display the target virtual scene of the next frame.
  • the terminal repeats the above detection, acquisition, adjustment and display process in the next frame.
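The per-frame update described above can be sketched as follows; the function names and the 60 FPS time interval are assumptions, and real camera math would be 3-D.

```python
FRAME_DT = 1.0 / 60.0  # assumed preset time interval between adjacent frames

def rotation_angle(target_speed_deg_per_s: float, dt: float = FRAME_DT) -> float:
    # Angle rotated within one preset time interval = speed * interval.
    return target_speed_deg_per_s * dt

def next_view_direction(yaw_deg: float, target_speed_deg_per_s: float,
                        dt: float = FRAME_DT) -> float:
    # Viewing-angle direction in the next frame, wrapped to [0, 360).
    return (yaw_deg + rotation_angle(target_speed_deg_per_s, dt)) % 360.0
```

Because the speed is recomputed every frame, the assist naturally weakens or strengthens as the aiming point moves relative to the adsorption area.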
  • In addition to the virtual object controlled by the current terminal, the field of view may also include multiple other virtual objects, and these other virtual objects are all candidate virtual objects, that is, any candidate virtual object may be selected as the target virtual object.
  • the terminal can select one of multiple candidate virtual objects as the target virtual object, so as to provide auxiliary aiming services for the process of aiming at the target virtual object.
  • In this case, the terminal selects one candidate virtual object from the multiple candidate virtual objects as the target virtual object, and performs step 303 based on the selected target virtual object, that is, performs the step of obtaining the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing angle adjustment operation.
  • process of the terminal selecting the target virtual object from the multiple candidate virtual objects can be implemented in any of the following ways:
  • Manner 1: The terminal randomly selects one candidate virtual object from the multiple candidate virtual objects as the target virtual object.
  • Manner 2: The terminal selects the candidate virtual object with the smallest distance from the aiming point as the target virtual object according to the distances between the multiple candidate virtual objects and the aiming point in the virtual scene.
  • Manner 3: The terminal selects the candidate virtual object with the smallest distance from the aiming point as the target virtual object according to the distances between the projections of the multiple candidate virtual objects on the terminal screen and the projected position of the aiming point.
  • Manner 4: The terminal selects the candidate virtual object with the smallest included angle as the target virtual object according to the angle between the direction of the line connecting each candidate virtual object with the virtual object controlled by the current terminal and the direction of the aiming point.
  • the terminal can obtain the included angle, and then select the candidate virtual object with the smallest included angle as the target virtual object.
  • the process for the terminal to obtain the included angle can be implemented in multiple ways.
  • In one manner, the terminal may obtain the included angles corresponding to the multiple candidate virtual objects according to the distances between the candidate virtual objects and the aiming point in the virtual scene and the distances between the candidate virtual objects and the virtual object controlled by the current terminal.
  • In another manner, the terminal may obtain the included angles corresponding to the multiple candidate virtual objects according to the distances between the projections of the candidate virtual objects and of the aiming point on the terminal screen and the distances between the candidate virtual objects and the virtual object controlled by the current terminal.
  • The terminal can also adopt other methods; for example, according to the positions of the multiple candidate virtual objects, the position of the virtual object controlled by the current terminal, and the direction of the aiming point, it can obtain the angle by which the viewing angle needs to rotate for the aiming point to move to the area where a candidate virtual object is located, that is, the included angle.
  • the embodiment of the present application does not limit how to obtain the included angle.
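A hedged 2-D sketch of Manner 4 follows; the coordinate convention and names are assumptions (a real implementation would work with 3-D camera directions), but the selection rule is the one described above: compute the included angle for each candidate and keep the smallest.

```python
import math

def included_angle_deg(player_pos, aim_dir, candidate_pos):
    # Angle between the aiming direction and the line connecting the
    # player-controlled virtual object with a candidate virtual object.
    cx = candidate_pos[0] - player_pos[0]
    cy = candidate_pos[1] - player_pos[1]
    dot = cx * aim_dir[0] + cy * aim_dir[1]
    cos_t = dot / (math.hypot(cx, cy) * math.hypot(aim_dir[0], aim_dir[1]))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def pick_target(player_pos, aim_dir, candidates):
    # Manner 4: choose the candidate with the smallest included angle.
    return min(candidates, key=lambda c: included_angle_deg(player_pos, aim_dir, c))
```

For a player at the origin aiming along +x, a candidate almost straight ahead beats candidates at 45 or 90 degrees off-axis.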
  • the terminal can perform step 305 to perform a normal viewing angle adjustment process based on the viewing angle adjustment operation.
  • The terminal may obtain a first preset rotation speed, where the first preset rotation speed is the normal rotation speed corresponding to the viewing angle adjustment operation when the auxiliary aiming service is not provided. The terminal then adjusts the viewing angle according to the first preset rotation speed and displays the target virtual scene after the adjustment, without providing assistance for this viewing angle adjustment operation.
  • Before the above step 302, when the viewing angle adjustment operation is detected, the terminal can acquire the display mode of the virtual scene, so as to determine whether the display mode of the virtual scene is the sight-based display mode.
  • If the display mode is the sight-based display mode, step 302 is performed, that is, the step of acquiring the adsorption area of the target virtual object is executed;
  • otherwise, step 305 is performed, that is, the first preset rotation speed of the viewing angle of the virtual scene is obtained according to the viewing angle adjustment operation.
  • the terminal can provide assistance for the virtual object when aiming at the target virtual object.
  • When the user wants to accurately aim at or shoot the target virtual object, the user will generally observe the virtual scene and the target virtual object in it through the sight of the virtual prop, so it is sufficient to provide the auxiliary aiming service in this display mode.
  • In other display modes, the virtual object may be moving or observing the virtual scene without intending to aim at the target virtual object, eliminating the need to provide auxiliary aiming.
  • an auxiliary aiming function may also be provided: movement following.
  • A fourth adsorption area may be provided for the target virtual object. The fourth adsorption area may be the same as any one of the first adsorption area, the second adsorption area, and the third adsorption area, or may be different from all three, and can be specifically set by relevant technicians according to requirements, which is not limited in the embodiment of the present application.
  • the terminal when the aiming point is located in the fourth adsorption area of the target virtual object and the target virtual object moves, the terminal can control the aiming point to move following the target virtual object.
  • the terminal may obtain the moving speed and moving direction of the target virtual object, and obtain the target following speed and target following direction of the aiming point according to the moving speed and moving direction of the target virtual object.
  • the target following speed may be less than the moving speed of the target virtual object, and the target following direction may be the same as the moving direction of the target virtual object.
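A minimal sketch of the movement-following behavior above; the 0.6 following factor and the function name are assumed tuning details, the constraint from the source being only that the following speed is lower than, and the direction the same as, the target's movement.

```python
def aiming_point_follow_velocity(target_velocity, follow_factor=0.6):
    # Same direction as the target's movement, lower speed (factor < 1).
    vx, vy = target_velocity
    return (vx * follow_factor, vy * follow_factor)
```

Keeping the factor below 1 means the aiming point trails the target rather than locking onto it, so the user still has to contribute aim.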
  • the above-mentioned viewing angle adjustment process can be embodied as a process of updating the lens orientation in each frame (tick).
  • In each tick, the terminal can determine whether there is a lens-turning input, that is, whether there is a viewing angle adjustment operation; if so, it can continue to determine whether the virtual object has opened the scope.
  • Whether the scope is opened corresponds to whether the virtual scene is in the sight-based display mode.
  • If the virtual object has opened the scope, it can be judged whether the aiming point is located in the enemy's adsorption area; if so, auxiliary aiming can be provided, and different auxiliary aiming functions can be provided according to the position of the aiming point.
  • If the aiming point is near the enemy, a magnetic force can be generated and a magnetic force calculation performed; the magnetic force calculation is used to control the aiming point to follow the target virtual object. If the aiming point is on the enemy, damping can be generated and a damping calculation performed, for example by increasing the damping coefficient and reducing the sensitivity to reduce the rotation speed of the viewing angle.
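The per-tick decision flow described above might be sketched as follows; the region labels, the damping factor, and the magnet strength are illustrative assumptions rather than values from the source.

```python
def tick_assist(has_turn_input: bool, scope_opened: bool, region: str,
                base_sensitivity: float,
                damping: float = 0.5, magnet: float = 0.3):
    """region: 'outside', 'near' (in the adsorption area but off the body),
    or 'on' (aiming point on the enemy) -- assumed labels."""
    if not has_turn_input or not scope_opened or region == 'outside':
        # No assist: normal sensitivity, no magnetic pull.
        return {'sensitivity': base_sensitivity, 'magnet': 0.0}
    if region == 'near':
        # Magnetic force calculation: pull the aiming point toward the target.
        return {'sensitivity': base_sensitivity, 'magnet': magnet}
    # region == 'on': damping calculation lowers sensitivity,
    # reducing the rotation speed of the viewing angle over the body.
    return {'sensitivity': base_sensitivity * damping, 'magnet': 0.0}
```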
  • In the method provided by the embodiment of the present application, when the viewing angle adjustment operation is detected and the aiming point is in the adsorption area of the target virtual object, auxiliary aiming can be provided: the viewing angle adjustment operation and the positions of the aiming point and the target virtual object are integrated to obtain the rotation speed of the viewing angle of the virtual scene.
  • Since the viewing angle adjustment operation is taken into account, the auxiliary aiming is provided on the basis of respecting the user's operation, which avoids the situation where the virtual scene display becomes divorced from the user's operation.
  • This both respects the user's operation and provides an auxiliary effect, satisfying the user's demand and achieving a better display effect.
  • FIG. 2 described the specific process of providing auxiliary aiming services when the viewing angle adjustment operation is detected.
  • an auxiliary aiming function may also be provided: the terminal can switch the display mode of the virtual scene
  • the specific method can refer to the embodiment shown in FIG. 10 below.
  • FIG. 10 is a flowchart of a method for displaying a virtual scene provided by an embodiment of the present application. Referring to FIG. 10, the method may include the following steps:
  • When detecting that the virtual scene is switched from the first display mode to the second display mode, the terminal acquires the target area corresponding to the aiming point.
  • the second display mode is a display mode based on sights
  • Figure 2 shows a virtual scene in the second display mode.
  • the first display mode is a display mode other than the second display mode.
  • Fig. 1 shows a virtual scene in the first display mode.
  • When the display mode of the virtual scene is switched from the first display mode to the second display mode, the virtual object may be aiming at or shooting other virtual objects in the virtual scene.
  • Therefore, the terminal can determine whether there are other virtual objects in the vicinity of the aiming point, and if so, an auxiliary aiming service can be provided to move the aiming point to the area where the other virtual object is located.
  • a target area corresponding to the aiming point may be set, and the target area is an area near the aiming point, that is, the target area is an area whose distance from the aiming point meets certain conditions.
  • the target area can be acquired, so that whether the target area includes other virtual objects is used as a condition for providing auxiliary aiming services.
  • the target area may be an area whose distance from the aiming point is less than a distance threshold.
  • the distance threshold can be set by relevant technical personnel according to requirements, which is not limited in the embodiment of the present application.
  • In one possible implementation, the process for the terminal to acquire the target area may be: the terminal acquires a target area that is centered on the aiming point and has a preset size.
  • the preset size can be set by relevant technical personnel, which is not limited in the embodiment of the present application.
  • the target area may be a circular area with the aiming point as the center and the radius as the target radius.
  • the process of the terminal acquiring the target area may be: the terminal acquiring a circular area with the aiming point as the center and the radius as the target radius.
  • the shape of the target area may also be other shapes, for example, a polygonal area.
  • In the embodiment of the present application, a circular target area is taken as an example for description, and the embodiment of the present application does not limit the shape of the target area.
  • the target area may be a circular area centered on the aiming point.
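Checking whether other virtual objects fall inside the circular target area might look like the sketch below; working in screen-space coordinates and the function names are assumptions.

```python
import math

def in_target_area(aim_point, obj_pos, target_radius):
    # Circular target area centered on the aiming point.
    return math.hypot(obj_pos[0] - aim_point[0],
                      obj_pos[1] - aim_point[1]) <= target_radius

def candidates_in_area(aim_point, objects, target_radius):
    # All other virtual objects inside the target area are candidates.
    return [o for o in objects if in_target_area(aim_point, o, target_radius)]
```

If the returned candidate list is empty, the terminal would skip the auxiliary aiming service and simply switch display modes.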
  • the terminal detects whether the target virtual object is included in the target area, if yes, execute step 1003, and if not, execute step 1007.
  • the terminal After acquiring the target area corresponding to the aiming point, the terminal can determine whether the target virtual object is included in the target area, and thereby determine whether an auxiliary aiming service needs to be provided according to the judgment result. Understandably, if the target area does not include the target virtual object, that is, there are no other virtual objects near the aiming point, there is no need to provide auxiliary aiming services, and the following step 1007 may be executed to switch the display mode directly. For example, as shown in Fig. 11, if the target virtual object is included in the target area, the assisted aiming service can be provided.
  • the virtual object may want to aim at the other virtual object when the display mode is switched this time, so auxiliary aiming services can be provided.
  • the following steps 1003 to 1006 can be performed.
  • the target virtual object may be any virtual object other than the virtual object controlled by the current terminal.
  • The virtual object controlled by the current terminal may also be teamed up with other virtual objects as virtual objects in the same team, and virtual objects in the same team do not need to be aimed at or shot.
  • Therefore, the target virtual object may also be any virtual object that does not belong to the team of the virtual object controlled by the current terminal.
  • the embodiment of the present application does not limit the specific judgment method of the target virtual object.
  • the target area may also include multiple other virtual objects, and the multiple other virtual objects are all candidate virtual objects, that is, any candidate virtual object may be selected as the target virtual object.
  • the terminal can select one of multiple candidate virtual objects as the target virtual object, so as to provide auxiliary aiming services for the process of aiming at the target virtual object.
  • Steps 1003 to 1005 are the steps of acquiring the target rotation direction and target rotation angle of the view angle of the virtual scene according to the position of the target virtual object and the position of the aiming point.
  • process of the terminal selecting the target virtual object from the multiple candidate virtual objects can be implemented in any of the following ways:
  • Manner 1: The terminal randomly selects one candidate virtual object from the multiple candidate virtual objects as the target virtual object.
  • Manner 2: The terminal selects the candidate virtual object with the smallest distance from the aiming point as the target virtual object according to the distances between the multiple candidate virtual objects and the aiming point in the virtual scene.
  • Manner 3: The terminal selects the candidate virtual object with the smallest distance from the aiming point as the target virtual object according to the distances between the projections of the multiple candidate virtual objects on the terminal screen and the projected position of the aiming point.
  • Manner 4: The terminal selects the candidate virtual object with the smallest included angle as the target virtual object according to the angle between the direction of the line connecting each candidate virtual object with the virtual object controlled by the current terminal and the direction of the aiming point.
  • the terminal may obtain the included angle, and then select the candidate virtual object with the smallest included angle as the target virtual object.
  • the process for the terminal to obtain the included angle can be implemented in multiple ways.
  • In one manner, the terminal may obtain the included angles corresponding to the multiple candidate virtual objects according to the distances between the candidate virtual objects and the aiming point in the virtual scene and the distances between the candidate virtual objects and the virtual object controlled by the current terminal.
  • In another manner, the terminal may obtain the included angles corresponding to the multiple candidate virtual objects according to the distances between the projections of the candidate virtual objects and of the aiming point on the terminal screen and the distances between the candidate virtual objects and the virtual object controlled by the current terminal.
  • The terminal can also adopt other methods; for example, according to the positions of the multiple candidate virtual objects, the position of the virtual object controlled by the current terminal, and the direction of the aiming point, it can obtain the angle by which the viewing angle needs to rotate for the aiming point to move to the area where a candidate virtual object is located, that is, the included angle.
  • the embodiment of the present application does not limit how to obtain the included angle.
  • In addition, the selection process of the target virtual object can also be implemented in other ways. For example, the selection can be made according to the horizontal distance between the projections of the aiming point and each candidate virtual object on the terminal screen, where the horizontal distance is the horizontal component of the distance between the two projections. Similarly, the selection can also be made according to the horizontal distance between the aiming point and each candidate virtual object in the virtual scene, where the horizontal distance is the horizontal component of the distance between the two in the virtual scene.
  • the embodiment of the present application does not limit the specific representation mode.
  • the terminal obtains the target position of the target virtual object.
  • After the terminal determines to provide the auxiliary aiming service, it can first obtain the target position of the target virtual object.
  • The target position is the position to which the aiming point will move; based on the target position and the current position of the aiming point, the terminal can determine how to rotate the viewing angle so as to display the rotated target virtual scene.
  • the process of acquiring the target position may include multiple methods.
  • the target position may be acquired based on the positional relationship between the aiming point and the target virtual object.
  • The target position may also be a fixed position on the target virtual object. The following three manners are used to describe the process of obtaining the target position, and the terminal may use any one of them.
  • Manner 1: The terminal obtains the target position of the target virtual object according to the positional relationship in the vertical direction between the projections of the aiming point and the target virtual object on the terminal screen.
  • In the first manner, the terminal can obtain the projected positions of the aiming point and the target virtual object on the terminal screen according to their positions, and then obtain the target position of the target virtual object according to the relationship between the two projected positions in the vertical direction.
  • The relationship between the two projected positions in the vertical direction may include two cases, and accordingly the process of obtaining the target position by the terminal differs, specifically as follows:
  • Case 1: When the vertical position of the projection of the aiming point is within the vertical position range of the projection of the target virtual object, the terminal uses the position on the target virtual object that has the same vertical position as the aiming point as the target position of the target virtual object.
  • That is, the position corresponding to the vertical position of the projected aiming point can be directly used as the target position.
  • After the target position is obtained in this case, the aiming point can be controlled to move only in the horizontal direction.
  • Case 2: When the vertical position of the projection of the aiming point is outside the vertical position range of the projection of the target virtual object, the terminal uses the position of the target part of the target virtual object as the target position of the target virtual object.
  • Different vertical positions of the aiming point can correspond to different target parts:
  • when the vertical position is above the vertical position range, the position of the first part of the target virtual object may be used as the target position; when the vertical position is below the vertical position range, the position of the second part of the target virtual object may be used as the target position.
  • the first part and the second part can be set by relevant technicians.
  • the first part can be the head and the second part can be the feet.
  • The first part and the second part can also be other parts; for example, the first part may be the chest, and the second part may be the legs, which is not limited in the embodiment of the present application.
  • the terminal may also acquire a fixed position (where the target part is located) on the target virtual object as the target position.
  • the location of the target part may be the location of the head or the neck, or the location of other parts, such as the center location, which is not limited in the embodiment of the present application.
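Under the reading above, Manner 1 might be sketched as follows. The coordinate convention (screen y growing upward), the box representation, and the choice of body-part positions are all assumptions made for illustration.

```python
def manner1_target_position(aim, target_box):
    """aim: (x, y) projected aiming point. target_box: dict with the
    target projection's x-center, vertical position range, and the
    positions of its first part (head) and second part (feet).
    Screen y grows upward in this sketch (an assumption)."""
    ax, ay = aim
    bottom, top = target_box['y_range']     # vertical position range
    if bottom <= ay <= top:
        # Case 1: target position shares the aiming point's vertical
        # position, so the aiming point moves only horizontally.
        return (target_box['x_center'], ay)
    if ay > top:
        return target_box['head']           # above the range -> first part
    return target_box['feet']               # below the range -> second part
```

A usage example: for a target whose projection spans y in [0, 18], an aiming point at y = 10 snaps horizontally, one at y = 25 snaps to the head, and one at y = -5 snaps to the feet.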
  • Manner 2: The terminal obtains the target position of the target virtual object according to the positional relationship in the vertical direction between the aiming point and the target virtual object in the virtual scene. In the second manner, the terminal can obtain the relationship in the vertical direction between the position of the aiming point and the position of the target virtual object in the virtual scene, and thereby obtain the target position of the target virtual object according to this positional relationship.
  • The relationship between the positions of the two in the virtual scene in the vertical direction may include two cases.
  • Accordingly, the process for the terminal to obtain the target position differs, specifically as follows:
  • Case 1: When the vertical position of the aiming point in the virtual scene is within the vertical position range of the target virtual object in the virtual scene, the terminal uses the position on the target virtual object that has the same vertical position as the aiming point as the target position of the target virtual object.
  • That is, the position corresponding to the vertical position of the aiming point can be directly used as the target position.
  • After the target position is obtained in this case, the aiming point can be controlled to move only in the horizontal direction.
  • Case 2: When the vertical position of the aiming point in the virtual scene is outside the vertical position range of the target virtual object in the virtual scene, the terminal uses the position of the target part of the target virtual object as the target position of the target virtual object.
  • Different vertical positions of the aiming point can correspond to different target parts:
  • when the vertical position is above the vertical position range, the position of the first part of the target virtual object may be used as the target position; when the vertical position is below the vertical position range, the position of the second part of the target virtual object may be used as the target position.
  • the first part and the second part can be set by relevant technicians.
  • the first part can be the head and the second part can be the feet.
  • The first part and the second part can also be other parts; for example, the first part may be the chest, and the second part may be the legs, which is not limited in the embodiment of the present application.
  • the terminal may also acquire a fixed position (where the target part is located) on the target virtual object as the target position.
  • the location of the target part may be the location of the head or the neck, or the location of other parts, such as the center location, which is not limited in the embodiment of the present application.
  • Manner 3 The terminal obtains the location of the target part of the target virtual object as the target location.
  • the terminal does not need to determine the positional relationship between the aiming point and the target virtual object, and directly uses the fixed position on the target virtual object (the position of the target part) as the target position.
  • the target part may be preset by relevant technical personnel, which is not limited in the embodiment of the present application.
  • The above provides only three manners; the process of obtaining the target position of the target virtual object can also be implemented in other ways, for example, based on the positions of the aiming point and the target virtual object in the virtual scene or their projected positions on the terminal screen.
  • the embodiment of the present application does not limit the specific method.
  • The terminal uses the direction starting from the aiming point and facing the target position as the target rotation direction.
  • That is, the terminal may use the direction from the aiming point to the target position as the target rotation direction of the viewing angle of the virtual scene.
  • The terminal uses the angle between the direction of the aiming point and the direction of the line connecting the target position with the virtual object controlled by the current terminal as the target rotation angle.
  • After obtaining the target position of the target virtual object in the above step 1003, since the target position is the position to which the aiming point will be moved, the terminal can directly obtain the target rotation angle of the viewing angle based on the position of the aiming point and the target position; rotating the viewing angle by the target rotation angle moves the aiming point to the target position.
  • The direction of the aiming point is the direction of the viewing angle. After the viewing angle is adjusted, the aiming point needs to have moved to the target position, so the adjusted target direction of the viewing angle is the direction of the line connecting the target position with the virtual object controlled by the current terminal; the angle between the current direction of the aiming point and this connection direction is the target rotation angle.
  • step 1004 and step 1005 are the process of obtaining the target rotation direction and target rotation angle of the view angle of the virtual scene according to the position of the aiming point and the target position.
  • The terminal may perform step 1004 first and then step 1005, may perform step 1005 first and then step 1004, or may perform step 1004 and step 1005 at the same time.
  • The embodiment of the present application does not limit the execution order of step 1004 and step 1005.
  • Steps 1003 to 1005 are the process of obtaining the target rotation direction and target rotation angle of the perspective of the virtual scene according to the position of the target virtual object and the position of the aiming point.
  • That is, the terminal first obtains the target position of the target virtual object, and then obtains the target rotation direction and target rotation angle based on the target position and the position of the aiming point.
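Steps 1003 to 1005 might be sketched in 2-D as follows; a real implementation would use 3-D camera math, and all names here are assumptions.

```python
import math

def target_rotation(player, aim_point, target_pos):
    """Returns (target rotation direction, target rotation angle in degrees).
    Direction: from the aiming point toward the target position.
    Angle: between the aiming-point direction and the line connecting
    the target position with the player-controlled virtual object."""
    direction = (target_pos[0] - aim_point[0], target_pos[1] - aim_point[1])
    aim_bearing = math.atan2(aim_point[1] - player[1], aim_point[0] - player[0])
    tgt_bearing = math.atan2(target_pos[1] - player[1], target_pos[0] - player[0])
    angle = abs(math.degrees(tgt_bearing - aim_bearing)) % 360.0
    return direction, min(angle, 360.0 - angle)
```

Rotating the viewing angle by the returned angle in the returned direction places the aiming point on the target position.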
  • the terminal displays the target virtual scene based on the target rotation direction and the target rotation angle of the viewing angle, where the aiming point is located in the area where the target virtual object is located.
  • the terminal can obtain the target virtual scene according to the target rotation direction and the target rotation angle of the viewing angle, and thereby display the target virtual scene. This process is that the terminal adjusts the viewing angle based on the target rotation direction and the target rotation angle, and displays the adjusted target virtual scene.
  • In one possible implementation, the terminal may also obtain the target virtual scene according to the target rotation direction of the viewing angle, the target rotation angle, and the zoom ratio corresponding to the sight, so as to display the target virtual scene, where the target virtual scene is a virtual scene zoomed according to the zoom ratio, and the aiming point in the target virtual scene is located in the area where the target virtual object is located.
  • In this way, the terminal can control the viewing angle to rotate, as shown in FIG. ; the aiming point has moved to the target virtual object in the target virtual scene.
  • the terminal displays the virtual scene based on the zoom ratio corresponding to the sight.
  • In step 1002, when the terminal detects that the target area does not include the target virtual object, there is no other virtual object near the aiming point and no assisted aiming service is needed. The terminal can therefore directly zoom the current virtual scene according to the zoom ratio corresponding to the sight and display it.
  • With the above method, when the display mode of the virtual scene is switched to the sight-based display mode, if the target virtual object is included in the area corresponding to the aiming point, the viewing angle can be controlled to rotate and the aiming point can be moved to the area where the target virtual object is located, which helps the user aim at the target virtual object near the aiming point during the display mode switch.
  • The assisted aiming service is provided based on the user's operation, rather than ignoring the user's operation and directly dragging the aiming point. The above virtual scene display process is therefore closely related to the user's operation, which can meet the user's needs and yields a better display effect.
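The geometry behind steps 1003 to 1005 can be sketched as follows. This is a minimal illustration, not part of the claimed embodiment; the function name, the tuple-based vectors, and the assumption that the viewing angle originates at the controlled virtual object are all hypothetical. The target rotation direction runs toward the target position, and the target rotation angle is the included angle between the current aiming-point direction and the direction from the controlled virtual object to the target position.

```python
import math

def rotation_to_target(camera_pos, aim_dir, target_pos):
    """Return (target_rotation_direction, target_rotation_angle).

    camera_pos: position of the virtual object controlled by the current terminal
    aim_dir:    unit vector of the current aiming-point direction
    target_pos: target position on the target virtual object
    """
    # Direction from the controlled object to the target position (normalized).
    to_target = tuple(t - c for t, c in zip(target_pos, camera_pos))
    norm = math.sqrt(sum(v * v for v in to_target)) or 1.0
    to_target = tuple(v / norm for v in to_target)
    # Included angle between the current aim direction and the
    # object-to-target direction (dot product clamped for float safety).
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(aim_dir, to_target))))
    return to_target, math.acos(dot)
```

With the controlled object at the origin aiming along the x-axis and a target five units along the y-axis, the sketch yields the direction (0, 1, 0) and a rotation angle of 90 degrees.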
  • Step 302 is executed only when the virtual scene is in the sight-based display mode, so as to obtain the adsorption area of the target virtual object and provide the assisted aiming service; otherwise, no assisted aiming service is provided.
  • A possible scenario is as follows: when the virtual scene is not in the sight-based display mode, the user performs a viewing-angle adjustment operation. After detecting the operation, the terminal can obtain the rotation speed of the viewing angle of the virtual scene according to the operation, adjust the viewing angle, and display the adjusted virtual scene.
  • The user then continues to operate the terminal and performs a display mode switching operation, switching the display mode from the first display mode to the second display mode, where the second display mode is the above sight-based display mode.
  • The terminal may obtain the corresponding target area based on the position of the aiming point. If the target virtual object is included in the target area, the terminal may obtain the target rotation direction and target rotation angle of the viewing angle of the virtual scene, so that the aiming point in the adjusted virtual scene points at the target virtual object. This realizes the effect of moving the aiming point onto the target virtual object near the aiming point during the scope-raising operation.
  • Afterwards, assisted aiming can still be provided: the positions of the aiming point and the target virtual object and the viewing-angle adjustment operation are considered comprehensively to obtain the target rotation speed of the viewing angle of the virtual scene, the viewing angle is adjusted based on the target rotation speed, and the adjusted virtual scene is displayed.
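The idea of obtaining the target rotation speed from the user's operation together with the two positions can be sketched as below. The combination rule (a plain vector sum) and all names are hypothetical; the embodiment only requires that the assist component point from the aiming point toward the target virtual object and that the user's operation remain part of the result.

```python
def target_rotation_speed(first_speed, second_speed, snap_active):
    """Combine the operation-derived speed with the assist speed.

    first_speed:  (yaw, pitch) speed derived from the viewing-angle
                  adjustment operation
    second_speed: (yaw, pitch) assist component directed from the aiming
                  point toward the target virtual object
    snap_active:  False when the aiming point is outside every adsorption
                  area; the first speed is then used unchanged
    """
    if not snap_active:
        return first_speed
    # One plausible reading: vector sum of the two components, so the
    # user's gesture is respected rather than overridden.
    return (first_speed[0] + second_speed[0], first_speed[1] + second_speed[1])
```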
  • FIG. 13 is a schematic structural diagram of a virtual scene display device provided by an embodiment of the present application.
  • The device can include:
  • The obtaining module 1301 is used to obtain the adsorption area of the target virtual object when the viewing-angle adjustment operation is detected; the obtaining module 1301 is also used to obtain the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing-angle adjustment operation when the aiming point is located in the adsorption area of the target virtual object; the display module 1302 is configured to display the target virtual scene based on the target rotation speed of the viewing angle.
  • the device further includes:
  • The detection module is used to detect whether the target virtual object is included in the virtual scene;
  • the acquiring module 1301 is further configured to acquire the first preset rotation speed of the viewing angle of the virtual scene according to the viewing angle adjustment operation when the target virtual object is not included in the virtual scene.
  • In a possible implementation, the obtaining module 1301 is used to: acquire the first preset rotation speed of the viewing angle of the virtual scene when the aiming point is located in the first adsorption area of the target virtual object; acquire the second preset rotation speed when the aiming point is located in the second adsorption area; and acquire the third preset rotation speed when the aiming point is located in the third adsorption area; where the first adsorption area surrounds the second adsorption area, and the second adsorption area surrounds the third adsorption area.
  • In a possible implementation, the obtaining module 1301 is used to acquire the second rotation speed of the viewing angle of the virtual scene, where the second rotation speed is negatively correlated with the distance between the target virtual object and the aiming point as projected on the terminal screen; or the second rotation speed is negatively correlated with the distance between the target virtual object and the aiming point in the virtual scene; or the second rotation speed is negatively correlated with the included angle.
  • The obtaining module 1301 is also used to obtain the distance between the virtual object controlled by the current terminal and the target virtual object;
  • the obtaining module 1301 is also configured to acquire the second rotation speed of the viewing angle of the virtual scene according to the distance between the virtual object controlled by the current terminal and the target virtual object and the distance between the target virtual object and the aiming point as projected on the terminal screen, where the second rotation speed is negatively correlated with the distance between the virtual object controlled by the current terminal and the target virtual object; or,
  • the obtaining module 1301 is further configured to acquire the second rotation speed of the viewing angle of the virtual scene according to the distance between the virtual object controlled by the current terminal and the target virtual object and the distance between the target virtual object and the aiming point in the virtual scene, where the second rotation speed is negatively correlated with the distance between the virtual object controlled by the current terminal and the target virtual object.
  • The obtaining module 1301 is also used to: when the virtual scene is in the sight-based display mode, perform the step of obtaining the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point and the position of the target virtual object; when the virtual scene is not in the sight-based display mode, take the first rotation speed as the target rotation speed of the viewing angle of the virtual scene.
  • the acquisition module 1301 is further configured to acquire the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the operating direction of the viewing angle adjustment operation.
  • In a possible implementation, the obtaining module 1301 is also used to: when the operating direction of the viewing-angle adjustment operation indicates that the aiming point moves toward the target virtual object, obtain the third rotation speed as the second rotation speed of the viewing angle of the virtual scene; when the operating direction indicates that the aiming point moves away from the target virtual object, obtain the fourth rotation speed as the second rotation speed of the viewing angle of the virtual scene, the fourth rotation speed being less than the third rotation speed.
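The asymmetric third and fourth speeds can be sketched as follows; the decay curve and the constants standing in for the first and second parameters are hypothetical. Moving the aiming point toward the target yields the larger third speed, moving it away yields the smaller fourth speed, and both fall as the aim-to-target distance grows.

```python
def second_rotation_speed(distance, moving_toward_target,
                          k_toward=2.0, k_away=0.5):
    """Assist speed, negatively correlated with the aim-to-target distance.

    k_toward / k_away stand in for the first and second parameters of the
    embodiment; k_away < k_toward keeps it easy to pull the aim away.
    """
    k = k_toward if moving_toward_target else k_away
    return k / (distance + 1.0)  # +1 avoids a blow-up at zero distance
```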
  • In a possible implementation, the device further includes a control module, configured to control the aiming point to move following the target virtual object when the aiming point is located in the fourth adsorption area of the target virtual object and the target virtual object moves.
  • In a possible implementation, the acquiring module 1301 is further configured to acquire the adsorption area of the target virtual object according to the distance between the virtual object controlled by the current terminal and the target virtual object, with the size of the adsorption area proportional to the distance.
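The distance-dependent adsorption area can be sketched as a radius that grows with the distance between the controlled virtual object and the target; the constants are hypothetical. A farther target projects smaller on screen, so a larger adsorption area keeps the snap region usable.

```python
def adsorption_radius(player_target_distance, base_radius=1.0, growth=0.05):
    """Adsorption-area size positively correlated with the distance
    between the virtual object controlled by the current terminal and
    the target virtual object; base_radius and growth are hypothetical."""
    return base_radius + growth * player_target_distance
```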
  • the device further includes:
  • The selection module is used to select one candidate virtual object from the multiple candidate virtual objects as the target virtual object when the aiming point is located in the adsorption areas of multiple candidate virtual objects;
  • the obtaining module 1301 is configured to perform, based on the selected target virtual object, the step of obtaining the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing-angle adjustment operation.
  • In a possible implementation, the selection module is used to: select the candidate virtual object with the smallest distance from the aiming point as the target virtual object; or select the candidate virtual object with the smallest included angle as the target virtual object.
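One of the selection rules above, choosing the candidate closest to the aiming point on the terminal screen, can be sketched as follows; the dictionary layout of a candidate is hypothetical, and the embodiment equally allows in-scene distance or smallest included angle as the criterion.

```python
import math

def pick_target(aim_point, candidates):
    """Select the candidate virtual object whose screen-projected position
    is closest to the aiming point.

    aim_point:  (x, y) position of the aiming point on the terminal screen
    candidates: list of dicts, each with a screen-projected "pos" entry
    """
    return min(candidates, key=lambda c: math.dist(aim_point, c["pos"]))
```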
  • With the device provided by this embodiment of the present application, when a viewing-angle adjustment operation is detected and the aiming point is in the adsorption area of the target virtual object, assisted aiming can be provided: the viewing-angle adjustment operation and the positions of the aiming point and the target virtual object are considered together to obtain the target rotation speed of the viewing angle of the virtual scene.
  • The above process takes the viewing-angle adjustment operation into account and provides assisted aiming on the basis of respecting the user's operation, which can avoid the display of the virtual scene becoming divorced from the user's operation. It meets user needs, respects user operations, provides an assisting effect, and yields a better display effect.
  • When the virtual scene display device provided in the above embodiment displays a virtual scene, the division into the functional modules described above is only an example. In practical applications, the above functions can be allocated to different functional modules as required; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above.
  • the virtual scene display device provided in the foregoing embodiment and the virtual scene display method embodiment belong to the same concept, and the specific implementation process is detailed in the method embodiment, which will not be repeated here.
  • FIG. 14 is a schematic structural diagram of a virtual scene display device provided by an embodiment of the present application.
  • the device may include:
  • The acquiring module 1401 is configured to acquire the target area corresponding to the aiming point when detecting that the virtual scene is switched from the first display mode to the second display mode, where the second display mode is a sight-based display mode and the first display mode is a display mode other than the second display mode;
  • the obtaining module 1401 is further configured to obtain the target rotation direction and the target rotation angle of the view angle of the virtual scene according to the position of the target virtual object and the position of the aiming point when the target virtual object is included in the target area;
  • the display module 1402 is configured to display a target virtual scene based on the target rotation direction and the target rotation angle of the perspective, where the aiming point is located in the area where the target virtual object is located.
  • In a possible implementation, the obtaining module 1401 is used to: acquire the target position of the target virtual object, and acquire the target rotation direction and target rotation angle of the viewing angle of the virtual scene according to the position of the aiming point and the target position.
  • In a possible implementation, the obtaining module 1401 is used to: acquire the target position of the target virtual object according to the horizontal relationship between the aiming point and the position of the target virtual object as projected on the terminal screen; or acquire the target position of the target virtual object according to the relationship between the aiming point and the horizontal position of the target virtual object in the virtual scene; or acquire the position of the target part of the target virtual object as the target position.
  • In a possible implementation, the obtaining module 1401 is used to: when the projection position of the aiming point in the horizontal direction is within the horizontal position range of the target virtual object in the horizontal direction, take the position in that horizontal position range that is the same as the horizontal position of the aiming point as the target position of the target virtual object; or, when the projection position of the aiming point in the horizontal direction is outside that horizontal position range, take the position corresponding to the horizontal position of the aiming point or the position of the target part of the target virtual object as the target position of the target virtual object.
  • In a possible implementation, the obtaining module 1401 is used to: take the direction starting from the aiming point and pointing toward the target position as the target rotation direction; and take the angle between the direction of the aiming point and the direction of the line connecting the target position with the virtual object controlled by the current terminal as the target rotation angle.
  • the acquiring module 1401 is configured to acquire a target area centered at the aiming point and whose size is a preset size.
  • the device further includes:
  • The selection module is used to select one candidate virtual object from the multiple candidate virtual objects as the target virtual object when multiple candidate virtual objects are included in the target area;
  • the acquiring module 1401 is further configured to perform the step of acquiring the target rotation direction and the target rotation angle of the viewing angle of the virtual scene according to the position of the target virtual object and the position of the aiming point based on the selected target virtual object.
  • In a possible implementation, the selection module is used to: randomly select a candidate virtual object from the multiple candidate virtual objects as the target virtual object; or, according to the distances between the multiple candidate virtual objects and the aiming point in the virtual scene, select the candidate virtual object with the smallest distance from the aiming point as the target virtual object; or, according to the distances between the multiple candidate virtual objects and the position where the aiming point is projected on the terminal screen, select the candidate virtual object with the smallest distance from the aiming point as the target virtual object; or, according to the angle between the direction of the line connecting each candidate virtual object with the virtual object controlled by the current terminal and the direction in which the aiming point lies, select the candidate virtual object with the smallest angle as the target virtual object.
  • With the device, when the target virtual object is in the area corresponding to the aiming point, the viewing angle can be controlled to rotate and the aiming point can be moved to the area where the target virtual object is located, helping the user aim at the target virtual object near the aiming point during the display mode switch.
  • The assisted aiming service is provided based on the user's operation instead of ignoring the user's operation and dragging the aiming point. The above virtual scene display process is therefore closely related to the user's operation, which can meet the user's needs and yields a better display effect.
  • When the virtual scene display device provided in the above embodiment displays a virtual scene, the division into the above functional modules is only an example. In practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above.
  • the virtual scene display device provided in the foregoing embodiment and the virtual scene display method embodiment belong to the same concept, and the specific implementation process is detailed in the method embodiment, which will not be repeated here.
  • The electronic device 1500 may vary considerably in configuration or performance, and may include one or more processors (central processing units, CPU) 1501 and one or more memories 1502, where at least one instruction is stored in the memory 1502.
  • The at least one instruction is loaded and executed by the processor 1501 to implement the following method: when a viewing-angle adjustment operation is detected, acquiring the adsorption area of the target virtual object; when the aiming point is located in the adsorption area of the target virtual object, obtaining the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing-angle adjustment operation; and displaying the target virtual scene based on the target rotation speed of the viewing angle.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to achieve: when the aiming point is located in the adsorption area of the target virtual object, acquiring the virtual object according to the viewing angle adjustment operation The first rotation speed of the viewing angle of the scene; acquiring the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point and the position of the target virtual object, and the direction of the second rotation speed is from the The aiming point starts and faces the target virtual object; and based on the first rotation speed and the second rotation speed, obtain the target rotation speed of the view angle of the virtual scene.
  • In a possible implementation, the at least one instruction is loaded and executed by the one or more processors 1501 to achieve: when the aiming point is located in the first adsorption area of the target virtual object, acquiring the first preset rotation speed of the viewing angle of the virtual scene according to the viewing-angle adjustment operation; when the aiming point is located in the second adsorption area of the target virtual object, acquiring the second preset rotation speed of the viewing angle of the virtual scene according to the viewing-angle adjustment operation, the second preset rotation speed being less than the first preset rotation speed; when the aiming point is located in the third adsorption area of the target virtual object, acquiring the third preset rotation speed of the viewing angle of the virtual scene according to the viewing-angle adjustment operation, the third preset rotation speed being less than the first preset rotation speed and different from the second preset rotation speed; where the first adsorption area encloses the second adsorption area, and the second adsorption area encloses the third adsorption area.
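The nesting of the three adsorption areas and their preset speeds can be sketched as concentric zones around the target; the radii and speed values below are hypothetical, with only the ordering (each outer area enclosing the next, and the inner preset speeds smaller than the first) taken from the embodiment.

```python
def preset_speed_for_zone(aim_offset, r3, r2, r1,
                          v1=1.0, v2=0.6, v3=0.3):
    """Map the aiming point's offset from the target to a preset speed.

    r3 < r2 < r1 are the radii of the third, second, and first adsorption
    areas; v2 and v3 are both smaller than v1, damping the viewing-angle
    speed as the aim nears the target.
    """
    if aim_offset <= r3:
        return v3
    if aim_offset <= r2:
        return v2
    if aim_offset <= r1:
        return v1
    return None  # outside every adsorption area: no assist applies
```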
  • In a possible implementation, the at least one instruction is loaded and executed by the one or more processors 1501 to achieve: acquiring the second rotation speed of the viewing angle of the virtual scene according to the distance between the target virtual object and the aiming point as projected on the terminal screen; or acquiring the second rotation speed of the viewing angle of the virtual scene according to the distance between the target virtual object and the aiming point in the virtual scene; or acquiring the second rotation speed of the viewing angle of the virtual scene according to the angle between the direction of the line connecting the virtual object controlled by the current terminal with the target virtual object and the direction in which the aiming point lies.
  • the second rotation speed and the distance between the target virtual object and the aiming point projected on the terminal screen are negatively correlated; or, the second rotation speed and the target virtual object and the aiming point The distance in the virtual scene is negatively related; or, the second rotation speed is negatively related to the included angle.
  • the at least one instruction is also loaded and executed by the one or more processors 1501 to achieve: obtain the distance between the virtual object controlled by the current terminal and the target virtual object;
  • Correspondingly, the at least one instruction is loaded and executed by the one or more processors 1501 to realize: acquiring the second rotation speed of the viewing angle of the virtual scene according to the distance between the virtual object controlled by the current terminal and the target virtual object and the distance between the target virtual object and the aiming point as projected on the terminal screen, the second rotation speed being negatively correlated with the distance between the virtual object controlled by the current terminal and the target virtual object; or, correspondingly, the at least one instruction is loaded and executed by the one or more processors 1501 to realize: acquiring the second rotation speed of the viewing angle of the virtual scene according to the distance between the virtual object controlled by the current terminal and the target virtual object and the distance between the target virtual object and the aiming point in the virtual scene, the second rotation speed being negatively correlated with the distance between the virtual object controlled by the current terminal and the target virtual object.
  • In a possible implementation, the at least one instruction is loaded and executed by the one or more processors 1501 to achieve: when the operating direction of the viewing-angle adjustment operation indicates that the aiming point moves toward the target virtual object, obtaining the third rotation speed as the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the first parameter; when the operating direction of the viewing-angle adjustment operation indicates that the aiming point moves away from the target virtual object, obtaining the fourth rotation speed as the second rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the second parameter, the fourth rotation speed being less than the third rotation speed.
  • In a possible implementation, the at least one instruction is also loaded and executed by the one or more processors 1501 to achieve: when a viewing-angle adjustment operation is detected, obtaining the display mode of the virtual scene; when the virtual scene is in the sight-based display mode, performing the step of acquiring the adsorption area of the target virtual object; when the virtual scene is not in the sight-based display mode, acquiring the first rotation speed of the viewing angle of the virtual scene according to the viewing-angle adjustment operation.
  • the at least one instruction is also loaded and executed by the one or more processors 1501 to implement: when the aiming point is located in the adsorption area of a plurality of candidate virtual objects, from the plurality of candidate virtual objects Selecting a candidate virtual object as the target virtual object; based on the selected target virtual object, performing the adjustment operation according to the position of the aiming point, the position of the target virtual object, and the viewing angle to obtain the virtual scene The step of the target rotation speed of the perspective.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to implement: randomly selecting a candidate virtual object from the multiple candidate virtual objects as the target virtual object; or Selecting the candidate virtual object with the smallest distance from the aiming point as the target virtual object according to the distance between the multiple candidate virtual objects and the aiming point in the virtual scene; or, according to the distance between the multiple candidate virtual objects and the aiming point The distance between the positions where the aiming point is projected on the terminal screen is selected, and the candidate virtual object with the smallest distance from the aiming point is selected as the target virtual object; or, according to the connection between the multiple candidate virtual objects and the virtual object controlled by the current terminal The angle between the line direction and the direction where the aiming point is located, and the candidate virtual object with the smallest angle is selected as the target virtual object.
  • The at least one instruction is loaded and executed by the processor 1501 to implement the following method: when it is detected that the virtual scene is switched from the first display mode to the second display mode, obtaining the target area corresponding to the aiming point, the second display mode being a sight-based display mode and the first display mode being a display mode other than the second display mode; when a target virtual object is included in the target area, acquiring the target rotation direction and target rotation angle of the viewing angle of the virtual scene according to the position of the target virtual object and the position of the aiming point; displaying the target virtual scene based on the target rotation direction and target rotation angle of the viewing angle, the aiming point in the target virtual scene being located in the area where the target virtual object is located.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to implement: Acquiring the target position of the target virtual object; acquiring the target rotation direction and the target rotation angle of the view angle of the virtual scene according to the position of the aiming point and the target position.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to realize: according to the horizontal relationship between the aiming point and the position of the target virtual object projected on the terminal screen, Acquiring the target position of the target virtual object; or, acquiring the target position of the target virtual object according to the relationship between the aiming point and the horizontal position of the target virtual object in the virtual scene; or, acquiring the The position of the target part of the target virtual object is used as the target position.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to realize: when the projection position of the aiming point in the horizontal direction is located at the projection position of the target virtual object When within the horizontal position range in the horizontal direction, the position in the horizontal position range that is the same as the horizontal position of the aiming point is taken as the target position of the target virtual object; or, when the projection position of the aiming point is horizontal When the horizontal position of the target virtual object is outside the horizontal position range in the horizontal direction, the position corresponding to the horizontal position of the aiming point or the position of the target part of the target virtual object is taken as the The target position of the target virtual object.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to realize: use a direction starting from the aiming point and toward the target position as the target rotation direction; The angle between the direction of the aiming point and the connecting direction of the target position and the virtual object controlled by the current terminal is used as the target rotation angle.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to achieve: obtain a target area centered on the aiming point and whose size is a preset size.
  • the at least one instruction is also loaded and executed by the one or more processors 1501 to implement: when a plurality of candidate virtual objects are included in the target area, from the plurality of candidate virtual objects, Select a candidate virtual object as the target virtual object; based on the selected target virtual object, execute the target rotation according to the position of the target virtual object and the position of the aiming point to obtain the perspective of the virtual scene Steps for direction and target rotation angle.
  • the at least one instruction is loaded and executed by the one or more processors 1501 to implement: randomly selecting a candidate virtual object from the multiple candidate virtual objects as the target virtual object; or Selecting the candidate virtual object with the smallest distance from the aiming point as the target virtual object according to the distance between the multiple candidate virtual objects and the aiming point in the virtual scene; or, according to the distance between the multiple candidate virtual objects and the aiming point The distance between the positions where the aiming point is projected on the terminal screen is selected, and the candidate virtual object with the smallest distance from the aiming point is selected as the target virtual object; or, according to the connection between the multiple candidate virtual objects and the virtual object controlled by the current terminal The angle between the line direction and the direction where the aiming point is located, and the candidate virtual object with the smallest angle is selected as the target virtual object.
  • the electronic device may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface to perform input and output.
  • the electronic device may also include other components for implementing device functions, which will not be repeated here.
  • A person of ordinary skill in the art can understand that all or part of the steps of the above embodiments can be implemented by a program instructing relevant hardware; the program can be stored in a computer-readable storage medium.
  • The storage medium can be a read-only memory, a magnetic disk, an optical disk, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A virtual scene display method, an electronic device, and a storage medium. The method includes: when a viewing-angle adjustment operation is detected, if the aiming point is within the adsorption area of a target virtual object, assisted aiming can be provided, and the rotation speed of the viewing angle of the virtual scene is obtained by combining the viewing-angle adjustment operation with the positions of the aiming point and the target virtual object. Because the viewing-angle adjustment operation is taken into account, assisted aiming is provided on the basis of respecting the user's operation, which can prevent the displayed virtual scene from becoming divorced from the user's operation, meets the user's needs, respects the user's operation while also providing an assisting effect, and yields a better display effect.

Description

Virtual scene display method, electronic device, and storage medium. This application claims priority to Chinese Patent Application No. 2019101432804, entitled "Virtual scene display method, apparatus, electronic device, and storage medium", filed on February 26, 2019, the entire contents of which are incorporated herein by reference. Technical Field
This application relates to the field of computer technology, and in particular to a virtual scene display method, an electronic device, and a storage medium. Background
With the development of computer technology and the diversification of terminal functions, more and more types of games can be played on terminals. Shooting games are a popular genre; such games usually display an aiming point at the center of the terminal screen, and the user can adjust the area targeted by the aiming point by adjusting the viewing angle of the virtual scene, thereby changing the currently displayed virtual scene.
At present, virtual scene display methods usually take a preset range around a target virtual object as an adsorption area as soon as the target virtual object is detected, and move the aiming point directly onto the target virtual object when the aiming point is within the adsorption area.
The above method does not consider the user's operational intent and drags the aiming point directly onto the target virtual object, so the aim assist is very strong. If the user does not want to aim at that target virtual object, or wants to move the aiming point away from it, the excessive dragging makes it difficult for the user to move the aiming point away from the target virtual object. The displayed virtual scene is therefore divorced from the user's operation, fails to meet the user's needs, and the display effect is poor.
Summary
Embodiments of this application provide a virtual scene display method, an electronic device, and a storage medium, which can solve the problems in the related art of being divorced from user operations, failing to meet user needs, and poor display effects. The technical solutions are as follows:
In one aspect, a virtual scene display method is provided, applied to an electronic device, the method including:
when a viewing-angle adjustment operation is detected, acquiring the adsorption area of a target virtual object;
when the aiming point is located in the adsorption area of the target virtual object, acquiring the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing-angle adjustment operation;
displaying the target virtual scene based on the target rotation speed of the viewing angle.
In one aspect, a virtual scene display method is provided, applied to an electronic device, the method including:
when it is detected that the virtual scene is switched from a first display mode to a second display mode, acquiring the target area corresponding to the aiming point, the second display mode being a sight-based display mode and the first display mode being a display mode other than the second display mode;
when a target virtual object is included in the target area, acquiring the target rotation direction and target rotation angle of the viewing angle of the virtual scene according to the position of the target virtual object and the position of the aiming point;
displaying the target virtual scene based on the target rotation direction and target rotation angle of the viewing angle, the aiming point in the target virtual scene being located in the area where the target virtual object is located.
In one aspect, a virtual scene display apparatus is provided, the apparatus including:
an acquiring module, configured to acquire the adsorption area of a target virtual object when a viewing-angle adjustment operation is detected;
the acquiring module being further configured to acquire the target rotation speed of the viewing angle of the virtual scene according to the position of the aiming point, the position of the target virtual object, and the viewing-angle adjustment operation when the aiming point is located in the adsorption area of the target virtual object;
a display module, configured to display the target virtual scene based on the target rotation speed of the viewing angle.
In one aspect, a virtual scene display apparatus is provided, the apparatus being configured to:
when it is detected that the virtual scene is switched from a first display mode to a second display mode, acquire the target area corresponding to the aiming point, the second display mode being a sight-based display mode and the first display mode being a display mode other than the second display mode;
when a target virtual object is included in the target area, acquire the target rotation direction and target rotation angle of the viewing angle of the virtual scene according to the position of the target virtual object and the position of the aiming point;
display the target virtual scene based on the target rotation direction and target rotation angle of the viewing angle, the aiming point in the target virtual scene being located in the area where the target virtual object is located.
In one aspect, an electronic device is provided, the electronic device including one or more processors and one or more memories, the one or more memories storing at least one instruction, the instruction being loaded and executed by the one or more processors to implement the operations performed by the virtual scene display method.
In one aspect, a computer-readable storage medium is provided, the computer-readable storage medium storing at least one instruction, the instruction being loaded and executed by a processor to implement the operations performed by the virtual scene display method.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Clearly, the accompanying drawings described below are only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of a terminal interface in which a virtual scene is not in a sight-based display mode according to an embodiment of this application;
FIG. 2 is a schematic diagram of a terminal interface in which a virtual scene is in a sight-based display mode according to an embodiment of this application; FIG. 3 is a flowchart of a virtual scene display method according to an embodiment of this application;
FIG. 4 is a schematic diagram of the relationship between the size of an adsorption area and the distance between virtual objects according to an embodiment of this application;
FIG. 5 is a schematic diagram of a terminal interface in which the aiming point is close to a target virtual object according to an embodiment of this application;
FIG. 6 is a schematic diagram of three adsorption areas according to an embodiment of this application;
FIG. 7 is a schematic diagram of a terminal interface according to an embodiment of this application;
FIG. 8 is a schematic diagram of a terminal interface in which a target virtual object is moving according to an embodiment of this application;
FIG. 9 is a flowchart of viewing-angle adjustment according to an embodiment of this application;
FIG. 10 is a flowchart of a virtual scene display method according to an embodiment of this application;
FIG. 11 is a schematic diagram of a target area corresponding to an aiming point according to an embodiment of this application;
FIG. 12 is a schematic diagram of a target virtual scene according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of a virtual scene display apparatus according to an embodiment of this application;
FIG. 14 is a schematic structural diagram of a virtual scene display apparatus according to an embodiment of this application;
FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of this application.
具体实施方式
为使本申请的目的、 技术方案和优点更加清楚, 下面将结合附图对本申请实施方式作进 一步地详细描述。
本申请实施例主要涉及电子游戏或者模拟训练场景, 以电子游戏场景为例, 用户可以提 前在该终端上进行操作, 该终端检测到用户的操作后, 可以下载电子游戏的游戏配置文件, 该游戏配置文件可以包括该电子游戏的应用程序、 界面显示数据或虚拟场景数据等, 以使得 该用户在该终端上登录电子游戏时可以调用该游戏配置文件,对电子游戏界面进行渲染显示。 用户可以在终端上进行触控操作, 该终端检测到触控操作后, 可以获取该触控操作所对应的 游戏数据, 并对该游戏数据进行渲染显示, 该游戏数据可以包括虚拟场景数据、 该虚拟场景 中虚拟对象的行为数据等。
本申请涉及到的虚拟场景可以用于模拟一个三维虚拟空间, 也可以用于模拟一个二维虚拟空间, 该三维虚拟空间或二维虚拟空间可以是一个开放空间。 该虚拟场景可以用于模拟现实中的真实环境, 例如, 该虚拟场景中可以包括天空、 陆地、 海洋等, 该陆地可以包括沙漠、 城市等环境元素, 用户可以控制虚拟对象在该虚拟场景中进行移动。 其中, 虚拟对象可以是一个虚拟的用于代表用户的虚拟形象, 该虚拟形象可以是任一种形态, 例如, 人、 动物等, 本申请对此不限定。
该虚拟场景中还可以包括其他虚拟对象, 也即是该虚拟场景中可以包括多个虚拟对象, 每个虚拟对象在虚拟场景中具有自身的形状和体积, 占据虚拟场景中的一部分空间。 以射击 类游戏为例, 用户可以控制虚拟对象在该虚拟场景的天空中自由下落、 滑翔或者打开降落伞 进行下落等, 在陆地上中跑动、 跳动、 爬行、 弯腰前行等, 也可以控制虚拟对象在海洋中游 泳、 漂浮或者下潜等, 当然, 用户也可以控制虚拟对象乘坐载具在该虚拟场景中进行移动, 在此仅以上述场景进行举例说明, 本申请实施例对此不作具体限定。 用户也可以控制虚拟对 象使用虚拟道具与其他虚拟对象进行战斗, 该虚拟道具可以是冷兵器, 也可以是热兵器, 本 申请实施例对此不作具体限定。
终端屏幕显示的可以是当前终端控制的虚拟对象的视角画面, 该终端屏幕上也可以显示 该当前终端控制的虚拟对象的瞄准点, 该瞄准点可以用于标注当前终端控制的虚拟对象的视 角画面中的瞄准目标, 则该瞄准点在该虚拟场景中的位置即可作为当前终端控制的虚拟对象 的攻击落点。
具体地, 该瞄准点可以在该终端屏幕的中心位置显示, 当然, 该瞄准点也可以在其他位 置显示, 本申请实施例对此不作具体限定。 该瞄准点的显示样式可以包括多种, 则该瞄准点 显示时可以采用系统默认的显示样式, 也可以根据用户的设置进行调整。 用户看到终端上显 示的瞄准点, 可以判断当前瞄准点对应的虚拟场景的位置是否为自己想要瞄准的区域, 如果 不是, 用户可以通过视角调整操作调整虚拟场景的视角来调整该瞄准点瞄准的区域。 当然, 用户通常是希望能快速且精准地将瞄准点调整至该虚拟场景中的其他虚拟对象的身上, 从而 可以对该其他虚拟对象进行射击、 拍击或者拳击等。
对于视角调整操作, 该视角调整操作可以为多种类型的操作, 例如, 该视角调整操作可 以为对虚拟对象的位置进行改变的操作, 也即是控制虚拟对象移动, 从而使得视角发生改变。 又例如, 用户直接进行视角调整操作, 以改变视角。 本申请实施例对此不作限定。
该视角调整操作也可以包括多种操作方式, 在一种可能实现方式中, 该视角调整操作可 以是滑动操作, 终端检测到该滑动操作, 可以基于该滑动操作的滑动方向、 滑动距离以及滑 动速度, 获取该滑动操作对应的视角的转动方向、 转动角度以及转动速度。 例如, 该滑动操 作的滑动方向可以对应于视角的转动方向, 该滑动操作的滑动距离大小可以与视角的转动角 度正相关, 当然, 该滑动操作的滑动速度也可以与该视角的转动速度正相关。
在另一种可能实现方式中, 该视角调整操作也可以是按压操作, 具体地, 该终端上可以 预设有视角调整区域, 用户可以通过在该视角调整区域内进行按压操作, 终端检测到该视角 调整区域内的按压操作时, 可以基于该按压操作相对于该视角调整区域的具体位置、 该按压 操作的按压力度以及按压时间, 获取该按压操作对应的视角的转动方向、 转动速度以及转动 角度。 例如, 该按压操作相对于该视角调整区域的中心的方向可以对应于视角的转动方向, 该按压操作的按压力度可以与视角的转动速度正相关, 该按压操作的按压时间可以与视角的 转动角度正相关。
在再一种可能实现方式中, 该视角调整操作还可以是对终端的转动操作, 该终端中的角 速度传感器 (例如陀螺仪) 检测到该转动操作时, 可以根据该转动操作的转动方向、 转动角 度以及转动速度, 获取视角的转动方向、 转动角度以及转动速度。 例如, 该转动操作的转动 方向可以为视角的转动方向, 该转动操作的转动角度可以与该视角的转动角度正相关, 该转 动操作的转动速度可以与该视角的转动速度正相关。 当然, 该视角调整操作也可以是按键操 作、 对虚拟摇杆区域的拖拽操作或者对真实摇杆设备的拨动操作等, 本申请对此不作具体限 定。 当然, 在用户对虚拟对象进行控制时, 也可以通过上述几种视角调整操作的结合实现不 同的控制效果, 例如, 该用户对视角的视角调整操作为滑动操作, 而该在滑动操作时, 终端 检测到该滑动操作过程中操作的按压力度, 从而基于该按压力度是否大于预设的按压力度, 从而决定是否进行射击等。 上述仅为一种示例性说明, 具体实施中如何对上述几种视角调整 操作结合, 可以实现哪种控制效果, 本申请在此不做具体限定。
在上述电子游戏场景中, 虚拟对象通常可以控制虚拟道具与其他虚拟对象进行战斗。 在 一些枪械道具上还可以装备有瞄具, 从而基于瞄具来观察虚拟场景, 该瞄具可以为机械瞄具, 机械瞄具是指枪械道具上原本即装备有的观测设备。 该瞄具也可以为该枪械道具上后续配备 的瞄具, 例如, 瞄准镜。 其中, 该瞄准镜可以具有倍率, 该倍率可以是 1, 也可以是大于 1 的数值。 例如, 该瞄准镜可以是红点瞄准镜、 全息瞄准镜、 二倍瞄准镜、 四倍瞄准镜、 八倍 瞄准镜等, 其中, 红点瞄准镜和全息瞄准镜的倍率为 1, 而二倍瞄准镜、 四倍瞄准镜和八倍 瞄准镜的倍率则大于 1, 当然, 该瞄准镜的倍率还可以是其它数值, 例如, 该瞄准镜还可以 是三倍镜、 六倍镜、 十五倍镜等, 本申请实施例对该瞄准镜的倍率不作具体限定。
该瞄具即用于辅助虚拟对象进行瞄准和射击, 因而, 当虚拟对象控制虚拟道具进行瞄准 或射击时, 可以通过将虚拟场景的显示模式切换至基于瞄具的显示模式, 这样方便更准确地 对敌方虚拟对象进行瞄准和射击。 例如, 如图 1所示, 虚拟场景未处于基于瞄具的显示模式。 如图 2所示, 用户想要控制虚拟对象准确射击出现在虚拟场景中的其它虚拟对象, 将虚拟场 景的显示模式切换至基于瞄具的显示模式, 可以通过虚拟对象控制的虚拟道具上的瞄具来观 察虚拟场景。
图 3是本申请实施例提供的一种虚拟场景显示方法的流程图, 参见图 3, 该方法可以包 括以下步骤:
301、 当检测到视角调整操作时, 终端检测虚拟场景中是否包括目标虚拟对象, 当该虚拟 场景中包括目标虚拟对象时, 执行步骤 302; 当该虚拟场景中不包括目标虚拟对象时, 执行 步骤 305。
用户可以在终端上进行视角调整操作, 以对虚拟场景的视角进行调整, 从而使得瞄准点对应的虚拟场景中的位置发生改变, 这样可以通过该视角调整操作, 改变当前终端控制的虚拟对象的瞄准位置和攻击落点。 上述内容已经对该视角调整操作的操作方式和类型进行了说明, 本申请实施例在此不多做赘述, 也对该视角调整操作的操作方式和类型不作限定。
在本申请实施例中, 在该用户进行上述视角调整操作时, 可提供辅助瞄准服务, 辅助用 户快速将瞄准点移动至想要瞄准的虚拟对象身上, 以降低用户的操作难度。 因而, 在检测到 视角调整操作时, 终端可以检测虚拟场景中是否包括目标虚拟对象, 从而初步判断是否需要 提供辅助瞄准服务。
可以理解地, 如果虚拟场景中不包括目标虚拟对象, 也即是, 如果当前终端控制的虚拟 对象的视野范围内没有其他虚拟对象存在, 则该虚拟对象并没有瞄准或射击的目标, 该视角 调整操作可能只是用户调整视角的操作, 并未瞄准, 从而可以无需提供辅助瞄准服务, 可以 执行下述步骤 305, 直接基于视角调整操作, 进行视角调整。 如果虚拟场景中包括目标虚拟 对象, 也即是, 如果当前终端控制的虚拟对象的视野范围内存在其他虚拟对象, 则该虚拟对 象可能想要瞄准该其他虚拟对象, 也可能不想要瞄准该其他虚拟对象, 则需要进一步判断是 否需要提供辅助瞄准服务, 可以执行下述步骤 302进一步进行判断。
在一种可能实现方式中, 该目标虚拟对象可以为当前终端控制的虚拟对象之外的任一虚 拟对象。 在另一种可能实现方式中, 当前终端控制的虚拟对象还可能与其他虚拟对象组队, 作为同一个队伍中的虚拟对象, 一般地, 当前终端控制的虚拟对象不需要对处于同一个队伍 的虚拟对象进行瞄准或射击, 因而, 该目标虚拟对象还可以为与当前终端控制的虚拟对象所 属的队伍不同的任一虚拟对象。 本申请实施例对该目标虚拟对象的具体判断方式不作限定。
302、 终端获取目标虚拟对象的吸附区域。
在判断当前终端控制的虚拟对象的视野范围内有目标虚拟对象后, 终端可进一步判断该目标虚拟对象是否具备辅助瞄准的条件。 在判断是否提供辅助瞄准服务时, 可以考虑瞄准点与目标虚拟对象之间的距离, 如果瞄准点距离目标虚拟对象较近, 可以为本次视角调整操作提供辅助瞄准服务; 如果瞄准点距离目标虚拟对象较远, 则可以无需提供辅助瞄准服务, 从而可以在提供辅助瞄准服务, 减少用户操作的复杂度的同时, 保证电子游戏的公平性。
具体地, 可以为目标虚拟对象设置吸附区域, 该吸附区域是对目标虚拟对象进行瞄准时可以提供辅助瞄准服务的瞄准点所在位置范围。 也即是, 在瞄准点位于该吸附区域时, 可以辅助用户进行视角调整, 将瞄准点的位置移动至目标虚拟对象身上, 以实现快速瞄准。 该吸附区域可以为该目标虚拟对象周围的一个区域。
在一种可能实现方式中, 该吸附区域可以为以该目标虚拟对象为中心、 尺寸为目标尺寸 的区域。 其中, 该目标尺寸可以由相关技术人员预先设置, 在一种可能实现方式中, 该目标 尺寸也可以基于当前终端控制的虚拟对象和目标虚拟对象之间的距离获取。 相应地, 该步骤 302 可以为: 终端根据该当前终端控制的虚拟对象和该目标虚拟对象之间的距离, 获取该目 标虚拟对象的吸附区域, 该吸附区域的尺寸与该距离正相关。 距离越大, 该吸附区域的尺寸 越大, 距离越小, 该吸附区域的尺寸越小。 这样该第一虚拟对象与该第二虚拟对象之间距离 很远时, 该第一虚拟对象的显示尺寸很小, 而该第一虚拟对象的吸附区域的显示尺寸不会太 小, 从而用户也是可以比较容易地通过控制操作进行视角调整, 从而将瞄准点的位置移动至 该目标虚拟对象的吸附区域内, 从而得到辅助瞄准的辅助作用。
例如, 如图 4所示, 距离较远的目标虚拟对象的吸附区域的尺寸较大, 只是由于距离较 远, 该吸附区域的显示尺寸较小; 距离较近的目标虚拟对象的吸附区域的尺寸较小, 只是由 于距离较近, 该吸附区域的显示尺寸较大。 可以看出, 距离较远的目标虚拟对象的吸附区域 的边缘与目标虚拟对象之间相距较远, 距离较近的目标虚拟对象的吸附区域的边缘与目标虚 拟对象之间相距较近, 因而, 二者的吸附区域的真实尺寸其实与当前显示效果相反。
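上述"吸附区域的尺寸与距离正相关"的取值方式可以用一段示意性代码概括, 其中的线性关系、 基础半径与比例系数均为示例性假设, 并非本申请限定的具体实现:

```python
def adsorption_radius(distance, base_radius=1.0, scale=0.05):
    # 吸附区域的尺寸与当前终端控制的虚拟对象和目标虚拟对象之间的距离正相关:
    # 距离越大, 吸附区域的真实尺寸越大; 此处假设为线性关系, 系数仅为示例
    return base_radius + scale * distance
```

这样远处目标虚拟对象的吸附区域真实尺寸更大, 恰好抵消了其投影显示尺寸变小带来的瞄准难度, 与图 4所示效果一致。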
在一种可能实现方式中, 该吸附区域的形状也可以由相关技术人员预先设置, 例如, 该 吸附区域可以为以目标虚拟对象为中心的圆形区域, 也可以为以目标虚拟对象为中心的多边 形区域, 比如四边形区域, 当然, 该吸附区域也可以为以目标虚拟对象为中心的球形区域, 也可以为以目标虚拟对象为中心的圆柱区域或多边体区域, 本申请实施例对此不作限定。
303、 当瞄准点位于该目标虚拟对象的吸附区域时, 终端根据该瞄准点的位置、 该目标虚 拟对象的位置和该视角调整操作, 获取该虚拟场景的视角的目标转动速度。
终端获取到目标虚拟对象的吸附区域后, 可以判断瞄准点和吸附区域的位置关系, 从而 根据该位置关系, 判断是否需要提供辅助瞄准服务。 具体地, 终端可以判断瞄准点是否位于 该目标虚拟对象的吸附区域, 如果是, 则终端可以执行该步骤 303, 以提供辅助瞄准服务; 如果否, 则终端可以执行下述步骤 305, 直接基于视角调整操作, 进行视角调整。
终端可以综合考虑瞄准点的位置、 目标虚拟对象的位置和视角调整操作, 获取虚拟场景 的视角的目标转动速度。 具体地, 终端可以基于该视角调整操作对应的第一转动速度, 以及 该瞄准点和该目标虚拟对象的位置所对应的第二转动速度, 获取该虚拟场景的视角的目标转 动速度。
在一种可能实现方式中, 该步骤 303 中终端获取目标转动速度的过程可以通过下述步骤 一至步骤三实现:
步骤一、 终端根据该视角调整操作, 获取该虚拟场景的视角的第一转动速度。
在该步骤一中, 终端可以获取视角调整操作对应的第一转动速度, 视角调整操作的操作 方式不同或视角调整操作的类型不同时, 终端获取该第一转动速度的方式也可以不同。
例如, 如果视角调整操作为滑动操作或拖拽操作, 终端可以根据该视角调整操作的操作 方向和操作距离, 获取该虚拟场景的视角的第一转动速度; 如果视角调整操作为按压操作, 终端可以根据该视角调整操作的按压位置和按压力度或按压时长, 获取该虚拟场景的视角的 第一转动速度; 如果视角调整操作为对终端的转动操作, 终端可以根据终端的转动速度, 获 取该虚拟场景的视角的第一转动速度, 或根据终端的转动角度和转动方向, 获取该第一转动 速度。 当然, 该视角调整操作还可以为其他类型的操作, 本申请实施例对此不作限定。
在一种可能实现方式中, 瞄准点在该吸附区域中的位置不同时, 终端获取的第一转动速 度可以不同。 例如, 第一转动速度可以与瞄准点与吸附区域的中心的距离正相关。 瞄准点与 吸附区域的中心的距离越大, 第一转动速度越大; 瞄准点与吸附区域的中心的距离越小, 第 一转动速度越小。
在一个具体的可能实施例中, 该目标虚拟对象可以包括多个吸附区域, 也可以理解为该 吸附区域包括多个子区域, 不同的吸附区域, 该视角调整操作对应的第一转动速度不同。 也 即是, 当该瞄准点位于不同的吸附区域时, 终端根据该视角调整操作, 可以获取到不同的第 一转动速度。 具体地, 该目标虚拟对象可以包括第一吸附区域、 第二吸附区域和第三吸附区 域, 相应地, 该终端获取第一转动速度的过程可以包括如下三种情况:
情况一、 当该瞄准点位于该目标虚拟对象的第一吸附区域时, 终端根据该视角调整操作, 获取该虚拟场景的视角的第一预设转动速度。
其中,该第一预设转动速度为不提供辅助瞄准服务时视角调整操作对应的正常转动速度, 在瞄准点位于第一吸附区域时, 终端可以不针对视角调整操作对应的转动速度进行调整, 因 而, 终端可以获取第一预设转动速度。
情况二、 当该瞄准点位于该目标虚拟对象的第二吸附区域时, 终端根据该视角调整操作, 获取该虚拟场景的视角的第二预设转动速度, 该第二预设转动速度小于第一预设转动速度。
在情况二中, 该第一吸附区域包围于该第二吸附区域之外。 也即是, 该第二吸附区域的 尺寸小于该第一吸附区域, 该第二吸附区域相较于第一吸附区域距离目标虚拟对象更近, 在 瞄准点位于第二吸附区域时终端可以根据视角调整操作, 获取到小于第一预设转动速度的转 动速度, 这样在靠近目标虚拟对象时可以减小视角的转动速度, 从而可以辅助用户将瞄准点 更轻易地移动至目标虚拟对象身上, 并容易停留在目标虚拟对象身上, 而不容易将瞄准点移 动至越过目标虚拟对象的位置。 例如, 如图 5所示, 瞄准点距离目标虚拟对象很近, 需要减 小视角的转动速度, 便于更轻易地将瞄准点停留在目标虚拟对象身上。
在一种可能实现方式中, 可以为视角调整操作设置不同的灵敏度, 该灵敏度可以是指控 制虚拟对象的移动距离与用户的操作幅度或操作距离等之间的比例, 该灵敏度还可以是指视 角的转动速度与用户的操作幅度或操作距离等之间的比例, 该灵敏度与视角调整操作对应的 预设转动速度正相关。 也即是, 当灵敏度大时, 用户的视角调整操作所对应的预设转动速度 就大, 相反亦然。 在上述情况一和情况二中, 瞄准点位于的吸附区域不同时, 该视角调整操 作对应的灵敏度则可以不同, 因而, 终端基于该视角调整操作获取到的预设转动速度则不同。
例如, 当瞄准点位于第一吸附区域时, 视角调整操作对应于第一灵敏度, 当瞄准点位于 第二吸附区域时, 视角调整操作对应于第二灵敏度, 第二灵敏度小于第一灵敏度。 该第一灵 敏度和第二灵敏度可以由相关技术人员根据需求预先设置, 也可以由用户根据自身使用习惯 进行调整, 本申请实施例对此不作限定。 在一个具体的可能实施例中, 该第二灵敏度可以基 于第一灵敏度获取, 比如获取第一灵敏度与目标系数的乘积, 又比如, 获取第一灵敏度与目 标数值的差值, 本申请实施例对此不作限定。
在另一种可能实现方式中, 该第二预设转动速度可以基于第一预设转动速度获取, 例如, 终端可以获取第一预设转动速度与第一数值的差值, 将该差值作为第二预设转动速度, 该第 一数值为正数。 或者终端可以获取第一预设转动速度与第一系数的乘积, 将该乘积作为该第 二预设转动速度, 该第一系数为小于 1的正数。
上述仅提供了两种可能实现方式, 当然, 该情况二中, 终端还可以通过其他方式获取第 二预设转动速度, 例如, 可以为视角调整操作设置有多个数值组, 每个数值组中包括多个不 同的数值。 瞄准点位于的吸附区域不同时, 视角调整操作对应于的数值组不同, 本申请实施 例对具体采用哪种实现方式不作限定。
情况三、 当所述瞄准点位于该目标虚拟对象的第三吸附区域时, 终端根据该视角调整操 作, 获取该虚拟场景的视角的第三预设转动速度, 该第三预设转动速度小于第一预设转动速 度, 该第三预设转动速度与该第二预设转动速度不同。
该情况三与上述情况二同理, 终端获取第三预设转动速度的过程与上述情况二中获取第二预设转动速度的过程同理。 不同的是, 在该情况三中终端获取的第三预设转动速度与第二预设转动速度不同, 也即是, 瞄准点位于第三吸附区域时视角调整操作对应的第三灵敏度与瞄准点位于第二吸附区域时视角调整操作对应的第二灵敏度不同。 第三灵敏度小于第一灵敏度。 例如, 会在原本的灵敏度的基础上增加一个阻尼系数, 该阻尼系数是指将灵敏度变小的一个系数。 这样视角调整操作所对应的灵敏度变小, 相应地, 该视角调整操作所对应的预设转动速度也变小了, 用户则不容易将瞄准点移动出第三吸附区域, 本申请实施例对该灵敏度以及阻尼系数的具体取值不作具体限定。 通过上述灵敏度的改变, 用户可以通过上述视角调整操作更精准地调整瞄准点瞄准的区域, 即用户可以通过大幅度地进行控制操作实现视角的小幅度调整, 从而实现瞄准点的小幅度调整, 例如小幅度调整瞄准的目标虚拟对象的身体部位, 也可以避免用户由于操作幅度过大导致瞄准点过快远离目标虚拟对象。
该第二吸附区域包围于该第三吸附区域之外。 也即是, 该第三吸附区域的尺寸小于第二 吸附区域的尺寸, 在一种可能实现方式中, 该第三灵敏度可以小于第二灵敏度, 也即是, 第 三预设转动速度小于第二预设转动速度。 这样瞄准点越靠近目标虚拟对象, 视角调整操作对 应的灵敏度越小, 视角调整操作对应的预设转动速度越小。 在一个具体的可能实施例中, 该 第三吸附区域可以为该目标虚拟对象所在区域, 这样通过第三预设转动速度小于第二预设转 动速度的设置, 可以在一定程度上避免用户操作幅度过大导致瞄准点脱离目标虚拟对象的情 况。 当然, 该第三灵敏度也可以大于第二灵敏度, 也即是第三预设转动速度可以大于第二预 设转动速度, 上述设置均可以由相关技术人员根据需求进行设置, 本申请实施例对此不作限 定。
进一步地, 该第三灵敏度也可以基于第二灵敏度获取, 该获取过程与上述基于第一灵敏度获取第二灵敏度的过程同理, 本申请实施例在此不多做赘述。
在另一种可能实现方式中, 该第三预设转动速度也可以基于第一预设转动速度获取, 例 如, 终端可以获取第一预设转动速度与第二数值的差值, 将该差值作为第三预设转动速度, 该第二数值为正数。 或者终端可以获取第一预设转动速度与第二系数的乘积, 将该乘积作为 该第三预设转动速度, 该第二系数为小于 1的正数。
在再一种可能实现方式中, 该第三预设转动速度还可以基于第二预设转动速度获取, 方式与上述实现方式中同理, 本申请实施例在此不多做赘述。
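结合上述三种情况, 瞄准点所处吸附区域与视角调整操作对应的预设转动速度的关系可以示意如下, 其中各系数仅为示例性假设, 且按"第三预设转动速度小于第二预设转动速度"的一种实现方式取值:

```python
def preset_rotation_speed(region, normal_speed):
    # region 取 1/2/3, 分别对应第一/第二/第三吸附区域;
    # normal_speed 为不提供辅助瞄准服务时视角调整操作对应的正常转动速度
    # (即第一预设转动速度); 系数小于 1 相当于在灵敏度上乘以一个阻尼系数
    coeff = {1: 1.0, 2: 0.6, 3: 0.3}
    return normal_speed * coeff[region]
```

瞄准点越靠近目标虚拟对象, 对应的预设转动速度越小, 用户越不容易将瞄准点移出吸附区域。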
例如, 如图 6所示, 目标虚拟对象包括三个吸附区域, 该三个吸附区域仅为一种示例性 说明, 本申请实施例对该三个吸附区域的形状、 尺寸不作限定。
需要说明的是, 上述仅以目标虚拟对象的吸附区域包括三种为例进行说明, 该目标虚拟 对象的吸附区域还可以包括更多种, 或可包括小于三种的情况, 该目标虚拟对象的吸附区域 设置均可以由相关技术人员根据需求进行调整, 本申请实施例对此不作限定。
在一个具体的可能实施例中, 在上述判断瞄准点是否位于目标虚拟对象的吸附区域时, 终端可以为每个虚拟对象设置有碰撞器, 该碰撞器用于检测该虚拟对象的吸附区域, 终端可 以基于该瞄准点所在方向进行射线检测, 如果射线通过虚拟对象的碰撞器发生碰撞, 可以获 取瞄准点位于虚拟对象的吸附区域。 当然, 该碰撞器和射线检测的方式仅为一种可能实现方 式, 终端也可以根据瞄准点的位置坐标和吸附区域的坐标范围进行判断, 本申请实施例对具 体采用哪种方式不作限定。
步骤二、 终端根据该瞄准点的位置和该目标虚拟对象的位置, 获取该虚拟场景的视角的第二转动速度, 该第二转动速度的方向为从该瞄准点出发朝向该目标虚拟对象。
除了视角调整操作, 终端还可以考虑瞄准点和目标虚拟对象之间的位置关系, 来判断如何在用户操作对应的视角的转动速度的基础上进一步提供辅助。 具体地, 终端可以根据瞄准点的位置和目标虚拟对象的位置, 获取瞄准点与目标虚拟对象之间的距离, 该瞄准点与目标虚拟对象之间的距离可以通过多种方式表示, 下述通过该距离的三种不同表示方式对该步骤二进行说明:
方式一、 终端根据该目标虚拟对象与该瞄准点投影在终端屏幕上的距离, 获取该虚拟场 景的视角的第二转动速度。
在该方式一中, 该瞄准点与目标虚拟对象之间的距离可以通过二者投影在终端屏幕上的 距离表示, 该距离和第二转动速度之间具有转换关系, 终端可以根据该距离和转换关系, 计 算第二转动速度。
在一种可能实现方式中, 该第二转动速度和该目标虚拟对象与该瞄准点投影在终端屏幕 上的距离负相关。 也即是, 距离越小, 第二转动速度越大; 距离越大, 第二转动速度越小。 这样在瞄准点靠近目标虚拟对象的过程中可以持续提高辅助瞄准的强度, 辅助用户快速对目 标虚拟对象进行瞄准, 也充分尊重用户操作, 在用户操作的基础上进行辅助瞄准。
在另一种可能实现方式中, 该第二转动速度也可以和目标虚拟对象与该瞄准点投影在终 端屏幕上的距离正相关, 本申请实施例对此不作限定。
方式二、 终端根据该目标虚拟对象与该瞄准点在虚拟场景中的距离, 获取该虚拟场景的 视角的第二转动速度。
在该方式二中, 该瞄准点与目标虚拟对象之间的距离可以通过二者在虚拟场景中的距离 表示, 该距离和第二转动速度之间具有转换关系, 终端可以根据该距离和转换关系, 计算第 二转动速度。
在一种可能实现方式中, 该第二转动速度和该目标虚拟对象与该瞄准点在虚拟场景中的 距离负相关。 也即是, 距离越小, 第二转动速度越大; 距离越大, 第二转动速度越小。 这样 在瞄准点靠近目标虚拟对象的过程中可以持续提高辅助瞄准的强度, 辅助用户快速对目标虚 拟对象进行瞄准。
在另一种可能实现方式中, 该第二转动速度也可以和该目标虚拟对象与该瞄准点在虚拟 场景中的距离正相关, 本申请实施例对此不作限定。
方式三、 终端根据该当前终端控制的虚拟对象与该目标虚拟对象的连线方向和该瞄准点 所在方向之间的夹角, 获取该虚拟场景的视角的第二转动速度。
在该方式三中, 该瞄准点与目标虚拟对象之间的距离可以通过该连线方向和该瞄准点所在方向之间的夹角表示, 该夹角和第二转动速度之间具有转换关系, 终端可以根据该夹角和转换关系, 计算第二转动速度。
在一种可能实现方式中, 该第二转动速度与该夹角负相关。 也即是, 夹角越大, 第二转 动速度越小; 夹角越小, 第二转动速度越大。 这样在瞄准点靠近目标虚拟对象的过程中可以 持续提高辅助瞄准的强度, 辅助用户快速对目标虚拟对象进行瞄准。
在另一种可能实现方式中, 该第二转动速度也可以和该夹角正相关, 本申请实施例对此 不作限定。
上述仅提供了三种瞄准点与目标虚拟对象的距离的表示方式, 该距离还可以包括其他表 示方式, 例如, 可以为该瞄准点与目标虚拟对象投影在终端屏幕上的水平距离, 该水平距离 可以为该瞄准点与目标虚拟对象投影在终端屏幕上的距离在水平方向上的距离分量。同理地, 可以为该瞄准点与目标虚拟对象在虚拟场景中的水平距离, 该水平距离可以为该瞄准点与目 标虚拟对象在虚拟场景中的距离在水平方向上的距离分量。 本申请实施例对具体采用哪种表 示方式不作限定。
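以方式一中"第二转动速度与屏幕投影距离负相关"为例, 其转换关系可以示意如下, 反比形式及其中的系数均为示例性假设, 并非本申请限定的具体转换关系:

```python
def second_rotation_speed(screen_distance, k=50.0, min_distance=1.0):
    # 距离越小, 第二转动速度越大; 距离越大, 第二转动速度越小(负相关),
    # 从而在瞄准点靠近目标虚拟对象的过程中持续提高辅助瞄准的强度;
    # min_distance 用于避免距离过小时速度发散, 取值仅为示例
    return k / max(screen_distance, min_distance)
```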
在一种可能实现方式中, 该步骤二中终端获取第二转动速度的过程还可以考虑视角调整 操作的操作方向, 也即是, 该视角调整操作的操作方向不同时, 该终端根据瞄准点的位置和 该目标虚拟对象的位置获取到的第二转动速度可以不同。
在这种实现方式中, 该步骤二可以为: 终端根据所述瞄准点的位置、 该目标虚拟对象的 位置以及该视角调整操作的操作方向, 获取该虚拟场景的视角的第二转动速度。
具体地, 该视角调整操作的操作方向可以分为两种, 一种为控制瞄准点向该目标虚拟对 象移动, 另一种为控制瞄准点向目标虚拟对象的反方向移动, 下面针对这两种情况中终端获 取第二转动速度的过程进行说明。
情况一: 当该视角调整操作的操作方向指示该瞄准点向该目标虚拟对象移动时, 根据该 瞄准点的位置、 该目标虚拟对象的位置和第一参数, 获取得到第三转动速度作为该虚拟场景 的视角的第二转动速度。
情况二:当该视角调整操作的操作方向指示该瞄准点向该目标虚拟对象的反方向移动时, 根据该瞄准点的位置、 该目标虚拟对象的位置和第二参数, 获取得到第四转动速度作为该虚 拟场景的视角的第二转动速度, 该第四转动速度小于该第三转动速度。
在上述两种情况中, 第一参数和第二参数可以由相关技术人员根据需求进行设置, 上述 瞄准点和目标虚拟对象之间的距离、 第一参数以及第三转动速度的转换关系也可以由相关技 术人员进行设置, 上述瞄准点和目标虚拟对象之间的距离、 第二参数以及第四转动速度的转 换关系也可以由相关技术人员进行设置, 本申请实施例对此不作限定。 如果本次视角调整操 作可以使得瞄准点靠近目标虚拟对象, 则可以提供一个较大的第二转动速度, 也即是第三转 动速度, 提供的辅助瞄准强度较强。 而如果本次视角调整操作使得瞄准点远离目标虚拟对象, 则可以提供一个较小的第二转动速度, 也即是第四转动速度, 提供的辅助瞄准强度较弱。 该 第三转动速度和第四转动速度的方向相同, 均为从瞄准点出发朝向目标虚拟对象的方向, 只 是速度大小不同。
例如, 如图 7所示, 瞄准点的左方为目标虚拟对象, 如果视角调整操作指示瞄准点向正 左方移动, 该瞄准点靠近目标虚拟对象, 二者之间的距离减小, 如果该视角调整操作对应的 第一转动速度为 30度每秒, 则可以在该第一转动速度的基础上增加一个第三转动速度, 第三 转动速度可以为 10度每秒, 第三转动速度和第一转动速度方向相同。该视角调整过程体现为 相邻帧显示的虚拟场景中瞄准点的位置变化时, 可以为: 视角调整操作可以控制瞄准点在下 一帧画面中相较于上一帧画面向左移动 90米,则该第三转动速度可以使得瞄准点向左边多移 动 30米。
如果视角调整操作指示瞄准点向正右方移动, 该瞄准点远离目标虚拟对象, 二者之间的距离增大, 如果该视角调整操作对应的第一转动速度为 30度每秒, 则可以在该第一转动速度的基础上增加一个第四转动速度, 第四转动速度可以为 3度每秒。 该第四转动速度与第一转动速度方向相反。 该视角调整过程体现为相邻帧显示的虚拟场景中瞄准点的位置变化时, 可以为: 视角调整操作可以控制瞄准点在下一帧画面中相较于上一帧画面向右移动 90米, 则该第四转动速度可以使得瞄准点向左边移动 9米, 也即是该瞄准点向右少移动了 9米。 第四转动速度小于第三转动速度, 辅助瞄准的强度变小。
在一个具体的可能实施例中, 如果终端采用上述方式一或方式二的表示方式, 考虑到瞄 准点与目标虚拟对象之间的距离相同时, 而当前终端控制的虚拟对象和该目标虚拟对象的距 离不同时, 想要将瞄准点移动至目标虚拟对象身上视角所需转动的角度不同的情况, 终端还 可以获取该当前终端控制的虚拟对象和该目标虚拟对象的距离, 在上述步骤二中考虑到该距 离进行第二转动速度的获取。 在该实施例中, 当前终端控制的虚拟对象和该目标虚拟对象的 距离、 瞄准点与目标虚拟对象之间的距离以及第二转动速度之间可以具有转换关系, 从而保 证在瞄准点与目标虚拟对象之间的距离相同时, 而当前终端控制的虚拟对象和该目标虚拟对 象的距离不同时, 获取不同的第二转动速度。
例如, 瞄准点与目标虚拟对象之间的距离相同时, 如果当前终端控制的虚拟对象和该目 标虚拟对象的距离大, 视角所需转动的角度较小, 则获取较小的第二转动速度; 如果当前终 端控制的虚拟对象和该目标虚拟对象的距离小, 视角所需转动的角度较大, 则获取较大的第 二转动速度。 这样当前终端控制的虚拟对象和该目标虚拟对象的距离不同时可以提供相同的 辅助效果。
相应地, 上述方式一可以为: 终端根据该当前终端控制的虚拟对象和该目标虚拟对象之 间的距离, 以及该目标虚拟对象与该瞄准点投影在终端屏幕上的距离, 获取该虚拟场景的视 角的第二转动速度, 该第二转动速度与该当前终端控制的虚拟对象和该目标虚拟对象的距离 负相关。
上述方式二可以为:终端根据该当前终端控制的虚拟对象和该目标虚拟对象之间的距离, 以及该目标虚拟对象与该瞄准点在虚拟场景中的距离, 获取该虚拟场景的视角的第二转动速 度, 该第二转动速度与该当前终端控制的虚拟对象和该目标虚拟对象的距离负相关。
上述提供了多种终端获取第二转动速度的实现方式, 终端可以采取任一种实现方式获取第二转动速度, 也可以任意结合上述多种实现方式获取第二转动速度, 例如, 在获取第二转动速度的过程中可以既考虑视角调整操作的操作方向, 也考虑当前终端控制的虚拟对象与目标虚拟对象之间的距离, 从而终端可以基于视角调整操作的操作方向、 当前终端控制的虚拟对象与目标虚拟对象之间的距离、 该瞄准点的位置以及该目标虚拟对象的位置, 获取第二转动速度, 本申请实施例对此不作限定, 在此也不多做赘述。
需要说明的是, 终端可以先执行步骤一, 再执行步骤二, 也可以先执行步骤二, 再执行 步骤一, 还可以同时执行步骤一和步骤二, 本申请实施例对该步骤一和步骤二的时序不作限 定。
在上述吸附区域包括三种吸附区域的情况下还有一种实现方式, 当该瞄准点位于该目标 虚拟对象的第一吸附区域或第二吸附区域时, 终端执行该根据该瞄准点的位置和该目标虚拟 对象的位置, 获取该虚拟场景的视角的第二转动速度的步骤。 当该瞄准点位于该目标虚拟对 象的第三吸附区域时, 不执行该步骤二, 或将零作为第二转动速度, 相应地, 下述步骤三可 以为: 终端将该第一转动速度作为该虚拟场景的视角的目标转动速度。
也即是, 在第一吸附区域内, 视角调整操作对应的视角的第一转动速度可以为用户操作 对应的正常转动速度, 而在该第一转动速度的基础上, 可以基于瞄准点的位置和目标虚拟对 象的位置, 再获取第二转动速度, 从而综合第一转动速度和第二转动速度, 获取目标转动速 度。 在第二吸附区域内, 视角调整操作对应的视角的第一转动速度比正常用户操作对应的转 动速度小, 在该第一转动速度的基础上, 可以基于瞄准点的位置和目标虚拟对象的位置, 再 获取第二转动速度, 从而综合第一转动速度和第二转动速度, 获取目标转动速度。 在第三吸 附区域内, 终端可以仅执行步骤一, 视角调整操作对应的视角的第一转动速度比正常用户操 作对应的转动速度小, 并将该第一转动速度作为目标转动速度。
步骤三、 终端基于该第一转动速度和该第二转动速度, 获取该虚拟场景的视角的目标转 动速度。
终端获取第一转动速度和第二转动速度后, 可以综合两个转动速度, 得到该虚拟场景的 视角的目标转动速度, 该目标转动速度即为该视角进行调整时所需依据的转动速度。
在一种可能实现方式中, 可以为该第一转动速度和第二转动速度设置权重, 终端可以对 该第一转动速度和第二转动速度进行加权求和, 得到虚拟场景的视角的目标转动速度。其中, 该第一转动速度和第二转动速度的权重可以由相关技术人员根据需求进行设置, 也可以基于 上述瞄准点的位置和目标虚拟对象的位置获取, 也可以基于当前终端控制的虚拟对象和目标 虚拟对象之间的距离获取, 本申请实施例对此不作限定。
当然, 如果第一转动速度和第二转动速度的权重均设置为 1, 该步骤三即为: 终端可以 对第一转动速度和第二转动速度进行求和, 得到虚拟场景的视角的目标转动速度。 其中, 上 述第一转动速度和第二转动速度均可以为矢量, 该第一转动速度和第二转动速度的方向可能 相同, 也可以不同。 该第一转动速度的方向为该视角调整操作对应的方向, 第二转动速度的 方向为从瞄准点出发朝向目标虚拟对象的方向, 因而, 该步骤三可以为: 终端可以对第一转 动速度和第二转动速度进行矢量求和, 得到目标转动速度。
例如, 下面以两种极限情况为例进行说明, 在情况一中, 第一转动速度和第二转动速度 的方向相同, 瞄准点在目标虚拟对象的正左方, 视角调整操作的操作方向为正右方, 也即是 控制瞄准点沿着向正右方的方向移动, 则第一转动速度的方向为正右方, 第二转动速度为从 瞄准点出发朝向目标虚拟对象的方向, 也即是正右方。 则目标转动速度的值可以为第一转动 速度的值与第二转动速度的值的和值, 且目标转动速度的方向为正右方。
在情况二中, 第一转动速度和第二转动速度的方向相反, 瞄准点在目标虚拟对象的正左 方, 视角调整操作的操作方向为正左方, 也即是控制瞄准点沿着向正左方的方向移动, 则第 一转动速度的方向为正左方, 第二转动速度为从瞄准点出发朝向目标虚拟对象的方向, 也即 是正右方。 则目标转动速度的值可以为第一转动速度的值与第二转动速度的值的差值, 且目 标转动速度的方向取决于第一转动速度的值与第二转动速度的值的大小关系。 如果第一转动 速度的值大于第二转动速度的值, 则目标转动速度的方向为正左方; 如果第一转动速度的值 小于第二转动速度的值, 则目标转动速度的方向为正右方; 如果第一转动速度的值等于第二 转动速度的值, 则目标转动速度为零。 在一种可能实现方式中, 第一转动速度可以大于第二 转动速度, 这样可以保证尊重用户操作, 保证电子游戏的公平性和公正性, 保证较好的游戏 体验。
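步骤三中第一转动速度与第二转动速度的矢量求和可以示意如下, 此处以带符号的一维水平分量表示方向(向右为正), 权重默认取 1, 均为示例性假设:

```python
def target_rotation_speed(v1, v2, w1=1.0, w2=1.0):
    # v1: 视角调整操作对应的第一转动速度(带符号);
    # v2: 从瞄准点出发朝向目标虚拟对象方向的第二转动速度(带符号);
    # 加权求和得到目标转动速度; 权重可按需求或距离设置, 文中默认均为 1
    return w1 * v1 + w2 * v2
```

方向相同时两个速度相加, 方向相反时相互抵消, 与上述两种极限情况的分析一致。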
上述步骤 303为当瞄准点位于该目标虚拟对象的吸附区域时提供辅助瞄准服务的过程, 在计算得到虚拟场景的视角的目标转动速度后, 终端可以执行下述步骤 304, 对视角进行调 整。 还有一种可能情况, 当瞄准点位于该目标虚拟对象的吸附区域外时, 终端可以执行步骤 305, 不提供辅助瞄准服务, 直接基于视角调整操作, 进行正常的视角调整。
304、 终端基于该视角的目标转动速度, 显示目标虚拟场景。
终端获取到视角的目标转动速度后, 可以基于该目标转动速度, 对视角进行调整, 显示 调整后的目标虚拟场景。 终端对虚拟场景的视角进行调整的具体过程可以为: 终端可以根据 该视角的目标转动速度, 计算虚拟场景的视角在预设时间间隔内的转动角度; 终端控制该视 角转动该转动角度。 该预设时间间隔是指相邻帧之间的时间间隔, 该预设时间间隔可以由技 术人员预先设置, 也可以由用户根据自身设备的运行情况进行设置调整。
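按目标转动速度计算预设时间间隔(相邻帧之间的时间间隔)内视角转动角度的过程可以示意如下:

```python
def frame_rotation_angle(target_speed, frame_interval):
    # target_speed: 目标转动速度(度/秒);
    # frame_interval: 相邻帧之间的预设时间间隔(秒)
    # 该时间间隔内视角的转动角度 = 目标转动速度 × 预设时间间隔
    return target_speed * frame_interval
```

例如目标转动速度为 30度每秒、 帧间隔为 1/30 秒时, 每帧视角转动约 1度。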
需要说明的是, 上述步骤 301至步骤 304是一种动态的视角调整过程, 终端可以在每一 帧执行上述步骤 301 至步骤 304, 在每一帧计算得到视角的目标转动速度后, 可以基于该视 角的目标转动速度计算这一帧到下一帧时, 视角的转动角度, 计算得到下一帧时的视角方向, 从而对下一帧的目标虚拟场景进行渲染显示。 然后终端再在下一帧时重复上述检测、 获取以 及调整和显示过程。
需要说明的是, 上述步骤 301和步骤 302中, 提供了当瞄准点位于目标虚拟对象的吸附 区域时, 可以提供辅助瞄准服务的具体流程, 在一种可能场景中, 当前终端控制的虚拟对象 的视野范围内还可能包括多个其他虚拟对象, 该多个其他虚拟对象均为候选虚拟对象, 也即 是任一个候选虚拟对象均可能被选择作为目标虚拟对象。 在这种场景中, 终端可以从多个候 选虚拟对象中选择一个作为目标虚拟对象, 从而为瞄准该目标虚拟对象的过程提供辅助瞄准 服务。
具体地, 当该瞄准点位于多个候选虚拟对象的吸附区域时, 终端从该多个候选虚拟对象 中, 选择一个候选虚拟对象作为目标虚拟对象; 终端基于选中的该目标虚拟对象, 执行步骤 303, 也即是执行该根据该瞄准点的位置、 该目标虚拟对象的位置和该视角调整操作, 获取该 虚拟场景的视角的目标转动速度的步骤。
其中, 终端从多个候选虚拟对象中选择目标虚拟对象的过程可以通过下述任一种方式实 现:
方式一、 终端从该多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象。
方式二、 终端根据该多个候选虚拟对象与该瞄准点在虚拟场景中的距离, 选择与该瞄准点距离最小的候选虚拟对象作为该目标虚拟对象。
方式三、终端根据该多个候选虚拟对象与该瞄准点投影在终端屏幕上的位置之间的距离, 选择与该瞄准点距离最小的候选虚拟对象作为目标虚拟对象。
方式四、 终端根据该多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和该瞄准 点所在方向之间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
在该方式四中, 终端可以获取该夹角, 再选择夹角最小的候选虚拟对象作为目标虚拟对 象。 其中, 终端获取该夹角的过程可以通过多种方式实现。 在一种可能实现方式中, 终端可 以根据该多个候选虚拟对象与该瞄准点在虚拟场景中的距离, 以及该多个候选虚拟对象与当 前终端控制的虚拟对象之间的距离, 获取该多个候选虚拟对象对应的夹角。 在另一种可能实 现方式中, 终端可以根据该多个候选虚拟对象与该瞄准点投影在终端屏幕中的距离, 以及该 多个候选虚拟对象与当前终端控制的虚拟对象之间的距离, 获取该多个候选虚拟对象对应的 夹角。 当然, 上述仅以两种获取夹角的方式为例进行说明, 终端还可以采取其他方式, 例如, 可以根据多个候选虚拟对象的位置、 当前终端控制的虚拟对象的位置和瞄准点所在方向进行 模拟计算, 得到瞄准点移动至候选虚拟对象所在区域时该视角所需转动的角度, 也即是该夹 角。 本申请实施例对具体如何获取该夹角不作限定。
上述仅提供了四种可能实现方式,该目标虚拟对象的选择过程还可以采用其他方式实现, 例如, 可以根据该瞄准点与目标虚拟对象投影在终端屏幕上的水平距离进行选择, 该水平距 离可以为该瞄准点与目标虚拟对象投影在终端屏幕上的距离在水平方向上的距离分量。 同理 地, 也可以根据该瞄准点与目标虚拟对象在虚拟场景中的水平距离进行选择, 该水平距离可 以为该瞄准点与目标虚拟对象在虚拟场景中的距离在水平方向上的距离分量。 本申请实施例 对具体采用哪种表示方式不作限定。
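其中方式四"选择夹角最小的候选虚拟对象"可以用三维矢量夹角示意实现, 矢量表示与具体计算方式均为示例性假设:

```python
import math

def pick_target_by_angle(self_pos, aim_dir, candidates):
    # self_pos: 当前终端控制的虚拟对象的位置; aim_dir: 瞄准点所在方向矢量;
    # candidates: 各候选虚拟对象的位置列表;
    # 返回"候选与当前虚拟对象的连线方向"和"瞄准方向"夹角最小的候选
    def angle_to(c):
        v = tuple(c[i] - self_pos[i] for i in range(3))
        dot = sum(a * b for a, b in zip(v, aim_dir))
        norm = math.sqrt(sum(a * a for a in v)) * math.sqrt(sum(a * a for a in aim_dir))
        # 钳制余弦值到 [-1, 1], 避免浮点误差使 acos 越界
        return math.acos(max(-1.0, min(1.0, dot / norm)))
    return min(candidates, key=angle_to)
```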
305、 终端根据该视角调整操作, 获取该虚拟场景的视角的第一预设转动速度, 基于该第 一预设转动速度, 显示目标虚拟场景。
在上述虚拟场景中不包括目标虚拟对象的情况中, 或者瞄准点位于目标虚拟对象的吸附 区域外的情况中, 没有需要辅助瞄准的目标虚拟对象, 或者视野范围内的目标虚拟对象均距 离瞄准点较远从而不符合提供辅助瞄准的条件, 终端可以执行该步骤 305, 基于视角调整操 作, 进行正常的视角调整流程。
在这种情况下, 终端可以获取第一预设转动速度, 该第一预设转动速度为不提供辅助瞄 准服务时视角调整操作对应的正常转动速度,终端按照该第一预设转动速度对视角进行调整, 从而显示视角调整后的目标虚拟场景, 并未为本次的视角调整操作提供辅助。
在一种可能实现方式中, 还可以设置有: 当该虚拟场景处于基于瞄具的显示模式时, 提 供辅助瞄准服务; 当该虚拟场景未处于基于瞄具的显示模式时, 不提供辅助瞄准服务。 在这 种方式中, 上述步骤 302之前, 当检测到视角调整操作时, 终端可以获取虚拟场景的显示模 式, 这样终端可以判断该虚拟场景的显示模式是否为基于瞄具的显示模式。 当该虚拟场景处 于基于瞄具的显示模式时, 终端执行步骤 302, 也即是执行该获取目标虚拟对象的吸附区域 的步骤; 当该虚拟场景未处于基于瞄具的显示模式时, 终端执行该步骤 305, 也即是根据该 视角调整操作, 获取该虚拟场景的视角的第一预设转动速度。
在上述方式中, 可以理解地, 虚拟对象在瞄准目标虚拟对象时终端可以为其提供辅助, 想要精确地对目标虚拟对象进行瞄准或射击时, 一般会基于虚拟道具的瞄具来观察虚拟场景 以及虚拟场景中的目标虚拟对象, 因而在这种显示模式下提供辅助瞄准服务即可, 在其他的 显示模式下, 该虚拟对象可能在移动或观察虚拟场景而并未想要对目标虚拟对象进行瞄准, 从而无需提供辅助瞄准。
在一种可能实现方式中, 还可以提供一种辅助瞄准功能: 移动跟随。 具体地, 可以为目 标虚拟对象设置有第四吸附区域, 该第四吸附区域可以与上述第一吸附区域、 第二吸附区域 或第三吸附区域中任一个吸附区域相同, 也可以与上述三个吸附区域中均不同, 具体均可以 由相关技术人员根据需求进行设置, 本申请实施例对此不作限定。 在这种实现方式中, 当该 瞄准点位于该目标虚拟对象的第四吸附区域且该目标虚拟对象发生移动时, 终端可以控制该 瞄准点跟随该目标虚拟对象进行移动。
具体地, 终端可以获取目标虚拟对象的移动速度和移动方向, 根据该目标虚拟对象的移 动速度和移动方向, 获取该瞄准点的目标跟随速度和目标跟随方向, 在一个具体的可能实施 例中, 该目标跟随速度可以小于该目标虚拟对象的移动速度, 该目标跟随方向可以与目标虚 拟对象的移动方向相同。
例如, 如图 8所示, 目标虚拟对象正在向左移动, 终端可以控制瞄准点同步向左移动, 跟随该目标虚拟对象。
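瞄准点跟随目标虚拟对象移动时的速度关系可以示意如下, 其中"目标跟随速度小于移动速度"的比例系数为示例性假设:

```python
def follow_velocity(move_velocity, follow_ratio=0.8):
    # move_velocity: 目标虚拟对象的移动速度矢量 (x, y);
    # 目标跟随方向与目标虚拟对象的移动方向相同,
    # 目标跟随速度按比例缩小(follow_ratio < 1, 取值仅为示例)
    return tuple(v * follow_ratio for v in move_velocity)
```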
如图 9所示, 在一个具体示例中, 上述视角调整过程可以体现为对每一帧 (tick) 中镜头 朝向进行更新的过程, 终端可以判断是否有镜头转向的输入, 也即是检测是否有视角调整操 作, 如果是, 则可以继续判断虚拟对象是否举镜, 该虚拟对象举镜时虚拟场景处于基于瞄具 的显示模式, 在虚拟对象有举镜时, 可以判断瞄准点是否位于敌人的吸附区域内, 如果是, 则可以提供辅助瞄准, 具体地, 根据瞄准点的位置不同, 可以提供不同的辅助瞄准功能, 瞄 准点在敌人较远处时, 可以提供吸附速度, 在瞄准点在敌人较近处, 可以产生磁力, 并进行 磁力计算, 该磁力计算即用于控制瞄准点跟随目标虚拟对象。 瞄准点如果在敌人身上, 则可 以产生阻尼并进行阻尼计算, 比如在灵敏度上增加阻尼系数, 减小灵敏度, 以减小视角的转 动速度。
本申请实施例中在检测到视角调整操作时, 如果瞄准点在目标虚拟对象的吸附区域, 可 以提供辅助瞄准, 综合视角调整操作以及瞄准点和目标虚拟对象的位置获取虚拟场景的视角 的转动速度, 从而进行视角调整, 对虚拟场景进行显示, 上述过程中考虑到了视角调整操作, 在尊重用户操作的基础上提供了辅助瞄准, 可以避免出现虚拟场景显示与用户操作脱离的情 况, 这样可以满足用户需求, 既尊重了用户操作, 也提供了辅助效果, 显示效果较好。
上述图 3所示实施例中对检测到视角调整操作时提供辅助瞄准服务的具体流程进行了说明, 在上述方法中还可以提供一种辅助瞄准功能: 终端可以在虚拟场景的显示模式进行切换时提供辅助瞄准, 具体方法可以参见下述图 10所示实施例。
图 10是本申请实施例提供的一种虚拟场景显示方法的流程图, 参见图 10, 该方法可以 包括以下步骤:
1001、 当检测到虚拟场景从第一显示模式切换至第二显示模式时, 终端获取瞄准点对应 的目标区域。
其中, 该第二显示模式为基于瞄具的显示模式, 图 2即示出了一种第二显示模式下的虚 拟场景。 该第一显示模式为除该第二显示模式之外的显示模式, 图 1即示出了一种第一显示 模式下的虚拟场景。
在虚拟场景的显示模式从第一显示模式切换至第二显示模式时, 该虚拟对象可能是要对 虚拟场景中的其它虚拟对象进行瞄准或射击, 在这种情况下, 终端可以判断瞄准点附近是否 有其他虚拟对象, 如果是, 可以提供辅助瞄准服务, 将瞄准点移动至该其他虚拟对象所在区 域。
具体地, 可以设置有瞄准点对应的目标区域, 该目标区域为该瞄准点附近的区域, 也即 是, 该目标区域为与该瞄准点的距离符合一定条件的区域。 在虚拟场景的显示模式从第一显 示模式切换至第二显示模式时, 可以获取该目标区域, 从而将根据该目标区域中是否包括其 他虚拟对象作为是否提供辅助瞄准服务的条件。 例如, 该目标区域可以为与该瞄准点的距离 小于距离阈值的区域。 其中, 该距离阈值可以由相关技术人员根据需求进行设置, 本申请实 施例对此不作限定。
其中, 该终端获取目标区域的过程可以为: 终端获取以该瞄准点为中心、 尺寸为预设尺 寸的目标区域。 该预设尺寸可以由相关技术人员进行设置, 本申请实施例对此不作限定。 在一个具体的可能实施例中, 该目标区域可以为以该瞄准点为圆心、 半径为目标半径的 圆形区域。 相应地, 该终端获取目标区域的过程可以为: 终端获取以该瞄准点为圆心、 半径 为目标半径的圆形区域。 当然, 该目标区域的形状还可以为其他形状, 例如, 多边形区域, 在此仅以该目标区域可以为圆形区域为例进行说明, 本申请实施例对该目标区域的形状不作 限定。 例如, 如图 11所示, 目标区域可以为以瞄准点为中心的一个圆形区域。
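以圆形目标区域为例, 判断目标虚拟对象是否落入瞄准点对应目标区域的过程可以示意如下, 此处以屏幕投影坐标计算, 为示例性假设:

```python
def in_target_region(aim_pos, obj_pos, target_radius):
    # 目标区域为以瞄准点为圆心、半径为目标半径的圆形区域;
    # 比较平方距离, 避免开方运算
    dx = obj_pos[0] - aim_pos[0]
    dy = obj_pos[1] - aim_pos[1]
    return dx * dx + dy * dy <= target_radius * target_radius
```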
1002、 终端检测该目标区域内是否包括目标虚拟对象, 如果是, 则执行步骤 1003, 如果 否, 则执行步骤 1007。
终端在获取到瞄准点对应的目标区域后,可以判断该目标区域内是否包括目标虚拟对象, 从而根据判断结果, 判断是否需要提供辅助瞄准服务。 可以理解地, 如果该目标区域内不包 括目标虚拟对象, 也即是, 瞄准点附近没有其他虚拟对象, 可以无需提供辅助瞄准服务, 可 以执行下述步骤 1007, 直接进行显示模式的切换。 例如, 如图 11所示, 目标区域中包括目 标虚拟对象, 则可以提供辅助瞄准服务。 而如果目标区域内包括目标虚拟对象, 也即是, 瞄 准点附近有其他虚拟对象, 则该虚拟对象本次进行显示模式切换即可能是想要瞄准该其他虚 拟对象, 因而可以提供辅助瞄准服务, 可以执行下述步骤 1003至步骤 1006。
在一种可能实现方式中, 该目标虚拟对象可以为当前终端控制的虚拟对象之外的任一虚 拟对象。 在另一种可能实现方式中, 当前终端控制的虚拟对象还可能与其他虚拟对象组队, 作为同一个队伍中的虚拟对象, 一般地, 当前终端控制的虚拟对象不需要对处于同一个队伍 的虚拟对象进行瞄准或射击, 因而, 该目标虚拟对象还可以为与当前终端控制的虚拟对象所 属的队伍不同的任一虚拟对象。 本申请实施例对该目标虚拟对象的具体判断方式不作限定。
在一种可能场景中, 该目标区域内还可能包括多个其他虚拟对象, 该多个其他虚拟对象 均为候选虚拟对象, 也即是任一个候选虚拟对象均可能被选择作为目标虚拟对象。 在这种场 景中, 终端可以从多个候选虚拟对象中选择一个作为目标虚拟对象, 从而为瞄准该目标虚拟 对象的过程提供辅助瞄准服务。
具体地, 当该目标区域内包括多个候选虚拟对象时, 终端从该多个候选虚拟对象中, 选 择一个候选虚拟对象作为该目标虚拟对象; 终端基于选中的该目标虚拟对象, 执行下述步骤 1003至步骤 1005, 也即是该根据该目标虚拟对象的位置和该瞄准点的位置, 获取该虚拟场景 的视角的目标转动方向和目标转动角度的步骤。
其中, 终端从多个候选虚拟对象中选择目标虚拟对象的过程可以通过下述任一种方式实 现:
方式一、 终端从该多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象。
方式二、 终端根据该多个候选虚拟对象与该瞄准点在虚拟场景中的距离, 选择与该瞄准点距离最小的候选虚拟对象作为该目标虚拟对象。
方式三、终端根据该多个候选虚拟对象与该瞄准点投影在终端屏幕上的位置之间的距离, 选择与该瞄准点距离最小的候选虚拟对象作为目标虚拟对象。
方式四、 终端根据该多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和该瞄准 点所在方向之间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
在该方式四中, 终端可以获取该夹角, 再选择夹角最小的候选虚拟对象作为目标虚拟对 象。 其中, 终端获取该夹角的过程可以通过多种方式实现。 在一种可能实现方式中, 终端可 以根据该多个候选虚拟对象与该瞄准点在虚拟场景中的距离, 以及该多个候选虚拟对象与当 前终端控制的虚拟对象之间的距离, 获取该多个候选虚拟对象对应的夹角。 在另一种可能实 现方式中, 终端可以根据该多个候选虚拟对象与该瞄准点投影在终端屏幕中的距离, 以及该 多个候选虚拟对象与当前终端控制的虚拟对象之间的距离, 获取该多个候选虚拟对象对应的 夹角。 当然, 上述仅以两种获取夹角的方式为例进行说明, 终端还可以采取其他方式, 例如, 可以根据多个候选虚拟对象的位置、 当前终端控制的虚拟对象的位置和瞄准点所在方向进行 模拟计算, 得到瞄准点移动至候选虚拟对象所在区域时该视角所需转动的角度, 也即是该夹 角。 本申请实施例对具体如何获取该夹角不作限定。
上述仅提供了四种可能实现方式,该目标虚拟对象的选择过程还可以采用其他方式实现, 例如, 可以根据该瞄准点与目标虚拟对象投影在终端屏幕上的水平距离进行选择, 该水平距 离可以为该瞄准点与目标虚拟对象投影在终端屏幕上的距离在水平方向上的距离分量。 同理 地, 也可以根据该瞄准点与目标虚拟对象在虚拟场景中的水平距离进行选择, 该水平距离可 以为该瞄准点与目标虚拟对象在虚拟场景中的距离在水平方向上的距离分量。 本申请实施例 对具体采用哪种表示方式不作限定。
1003、 终端获取该目标虚拟对象的目标位置。
终端经过判断, 确定提供辅助瞄准服务后, 可以先获取目标虚拟对象的目标位置, 该目 标位置即为该瞄准点将要移动至的位置, 从而基于该目标位置以及该瞄准点当前的位置, 判 断如何进行视角转动, 显示转动后的目标虚拟场景。
具体地, 该目标位置的获取过程可以包括多种方式, 该目标位置可以基于该瞄准点和目 标虚拟对象的位置关系获取, 该目标位置也可以一个该目标虚拟对象身上的一个固定位置, 下面通过三种方式对该目标位置的获取过程进行说明, 终端可以采用任一种方式获取目标位 置。
方式一、 终端根据该瞄准点和该目标虚拟对象投影在终端屏幕上的位置在水平方向上的 关系, 获取该目标虚拟对象的目标位置。 在该方式一中, 终端可以根据瞄准点的位置和目标 虚拟对象的位置, 获取该瞄准点和目标虚拟对象投影在终端屏幕上的位置, 从而根据二者的 投影位置在水平方向上的关系, 获取该目标虚拟对象的目标位置。
具体地, 二者的投影位置在水平方向上的关系可能包括两种情况, 相应地, 该方式一中, 终端获取目标位置的过程可以不同, 具体如下:
情况一: 当该瞄准点的投影位置在水平方向上的水平位置位于该目标虚拟对象的投影位 置在水平方向上的水平位置范围内时, 终端将该水平位置范围中与该瞄准点的水平位置相同 的位置作为该目标虚拟对象的目标位置。
如果根据二者的投影位置确定瞄准点沿着水平方向即可移动至目标虚拟对象身上, 则可 以直接将该瞄准点的投影位置的水平位置对应的位置作为该目标位置。 在该情况一中, 获取 该目标位置后, 即可控制瞄准点仅在水平方向上移动即可。
情况二: 当该瞄准点的投影位置在水平方向上的水平位置位于该目标虚拟对象的投影位 置在水平方向上的水平位置范围外时, 终端将该瞄准点的水平位置对应的位置或该目标虚拟 对象的目标部位所在位置作为该目标虚拟对象的目标位置。
在该情况二中, 如果根据二者的投影位置确定瞄准点沿着水平方向无法移动至目标虚拟 对象身上, 在一种可能实现方式中, 该瞄准点的水平位置与该水平位置范围的关系不同时, 该水平位置还可以对应于不同的位置。 例如, 当该水平位置在该水平位置范围的上方时, 可 以将该目标虚拟对象的第一部位所在位置作为目标位置; 当该水平位置在该水平位置范围的 下方时, 可以将该目标虚拟对象的第二部位所在位置作为目标位置。 其中, 该第一部位和第 二部位可以由相关技术人员进行设置, 例如, 该第一部位可以为头部, 第二部位可以为脚部, 当然, 该第一部位和第二部位还可以为其他部位, 比如第一部位还可以为胸部, 第二部位还 可以为腿部, 本申请实施例对此不作限定。
在另一种可能实现方式, 终端也可以获取目标虚拟对象身上的一个固定位置 (目标部位 所在位置) 作为目标位置。 例如, 目标部位所在位置可以为头部或颈部所在位置, 也可以为 其他部位所在位置, 比如中心位置, 本申请实施例对此不作限定。
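方式一中按水平位置关系确定目标位置的两种情况可以示意如下, 范围外退化为目标部位所在位置, 此处以传入的部位坐标代表, 为示例性假设:

```python
def target_position_x(aim_x, obj_x_min, obj_x_max, part_x):
    # 情况一: 瞄准点的水平位置落在目标虚拟对象投影的水平位置范围内,
    #         取范围中与瞄准点水平位置相同的位置, 瞄准点只需沿水平方向移动;
    # 情况二: 落在范围外, 取目标虚拟对象的目标部位所在位置(part_x)
    if obj_x_min <= aim_x <= obj_x_max:
        return aim_x
    return part_x
```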
方式二、 终端根据该瞄准点和该目标虚拟对象在该虚拟场景中的水平位置的关系, 获取 该目标虚拟对象的目标位置。 在该方式二中, 终端可以根据该瞄准点和目标虚拟对象在虚拟 场景中的位置, 获取二者的位置在水平方向上的关系, 从而根据该位置关系, 获取该目标虚 拟对象的目标位置。
与上述方式一中同理地, 二者在虚拟场景中的位置在水平方向上的关系可能包括两种情 况, 相应地, 该方式二中, 终端获取目标位置的过程可以不同, 具体如下:
情况一: 当该瞄准点在虚拟场景中的位置在水平方向上的水平位置位于该目标虚拟对象 在虚拟场景中的位置在水平方向上的水平位置范围内时, 终端将该水平位置范围中与该瞄准 点的水平位置相同的位置作为该目标虚拟对象的目标位置。
如果根据二者的位置确定瞄准点沿着水平方向即可移动至目标虚拟对象身上, 则可以直 接将该瞄准点的水平位置对应的位置作为该目标位置。 在该情况一中, 获取该目标位置后, 即可控制瞄准点仅在水平方向上移动即可。
情况二: 当该瞄准点在虚拟场景中的位置在水平方向上的水平位置位于该目标虚拟对象 在虚拟场景中的位置在水平方向上的水平位置范围外时, 终端将该瞄准点的水平位置对应的 位置或该目标虚拟对象的目标部位所在位置作为该目标虚拟对象的目标位置。
在该情况二中, 如果根据二者在虚拟场景中的位置确定瞄准点沿着水平方向无法移动至 目标虚拟对象身上, 在一种可能实现方式中, 该瞄准点的水平位置与该水平位置范围的关系 不同时, 该水平位置还可以对应于不同的位置。 例如, 当该水平位置在该水平位置范围的上 方时, 可以将该目标虚拟对象的第一部位所在位置作为目标位置; 当该水平位置在该水平位 置范围的下方时, 可以将该目标虚拟对象的第二部位所在位置作为目标位置。 其中, 该第一 部位和第二部位可以由相关技术人员进行设置, 例如, 该第一部位可以为头部, 第二部位可 以为脚部, 当然, 该第一部位和第二部位还可以为其他部位, 比如第一部位还可以为胸部, 第二部位还可以为腿部, 本申请实施例对此不作限定。
在另一种可能实现方式, 终端也可以获取目标虚拟对象身上的一个固定位置 (目标部位 所在位置) 作为目标位置。 例如, 目标部位所在位置可以为头部或颈部所在位置, 也可以为 其他部位所在位置, 比如中心位置, 本申请实施例对此不作限定。
方式三、 终端获取该目标虚拟对象的目标部位所在位置作为目标位置。
在该方式三中, 终端可无需判断瞄准点和目标虚拟对象的位置关系, 直接将目标虚拟对 象身上的固定位置 (目标部位所在位置) 作为目标位置。 其中, 该目标部位可以有相关技术 人员预先设置, 本申请实施例对此不作限定。
上述仅提供了三种方式, 该获取目标虚拟对象的目标位置的过程还可以通过其他方式实 现, 例如, 可以根据该瞄准点和目标虚拟对象在虚拟场景中的位置或投影在终端屏幕上的位 置在竖直方向上的位置关系, 本申请实施例对具体采用哪种方式不作限定。
1004、 终端将从该瞄准点出发、 朝向该目标位置的方向作为该目标转动方向。
上述步骤 1003中在获取该目标虚拟对象的目标位置后, 由于该目标位置为该瞄准点将要 移动至的位置, 因而, 终端可以将该瞄准点到该目标位置的方向作为虚拟场景的视角的目标 转动方向。
1005、 终端将该瞄准点所在方向和该目标位置与当前终端控制的虚拟对象的连线方向之间的夹角作为该目标转动角度。
上述步骤 1003中在获取该目标虚拟对象的目标位置后, 由于该目标位置为该瞄准点将要 移动至的位置, 因而, 终端可以直接基于该瞄准点的位置和该目标位置, 获取视角的目标转 动角度, 该视角转动该目标转动角度则可以使得瞄准点移动至该目标位置。
该瞄准点所在方向即为视角的方向, 视角调整后需要将瞄准点移动至目标位置上, 则视 角调整后视角的目标方向为目标位置与当前终端控制的虚拟对象的连线方向, 因而该瞄准点 所在方向和该连线方向的夹角即为目标转动角度。
需要说明的是, 该步骤 1004和步骤 1005为根据该瞄准点的位置和该目标位置, 获取该 虚拟场景的视角的目标转动方向和目标转动角度的过程, 终端可以先执行步骤 1004, 再执行 步骤 1005, 也可以先执行步骤 1005, 再执行步骤 1004, 终端还可以同时执行该步骤 1004和 步骤 1005, 本申请实施例对该步骤 1004和步骤 1005的执行顺序不作限定。
该步骤 1003至步骤 1005为根据该目标虚拟对象的位置和该瞄准点的位置, 获取该虚拟 场景的视角的目标转动方向和目标转动角度的过程, 该过程中, 终端先获取目标虚拟对象的 目标位置, 再基于该目标位置和瞄准点的位置获取目标转动方向和目标转动角度。
1006、 终端基于该视角的目标转动方向和目标转动角度, 显示目标虚拟场景, 该目标虚 拟场景中该瞄准点位于该目标虚拟对象所在区域。
终端可以根据视角的目标转动方向和目标转动角度, 获取目标虚拟场景, 从而显示该目 标虚拟场景。 该过程也即是终端基于该目标转动方向和目标转动角度, 对视角进行调整, 显 示调整后的目标虚拟场景。
在一种可能实现方式中, 终端还可以根据视角的目标转动方向、 目标转动角度以及瞄具 对应的缩放比例, 获取目标虚拟场景, 从而显示该目标虚拟场景, 该目标虚拟场景为按照该 缩放比例进行缩放后的虚拟场景, 且该目标虚拟场景中瞄准点位于该目标对象所在区域。
例如, 在用户将虚拟场景的显示模式切换至基于瞄具的显示模式时, 在图 11所示的瞄准 点对应的目标区域中包括目标虚拟对象, 因而, 终端可以控制视角转动, 显示如图 12所示的 目标虚拟场景, 在目标虚拟场景中, 瞄准点已经移动至目标虚拟对象身上。
1007、 终端基于瞄具对应的缩放比例, 对虚拟场景进行显示。
在上述步骤 1002中, 终端检测到目标区域内不包括目标虚拟对象时, 瞄准点附近没有其 他虚拟对象, 可以无需提供辅助瞄准服务, 因而, 终端可以直接按照瞄具对应的缩放比例, 对当前的虚拟场景进行缩放显示。
本申请实施例通过在将虚拟场景的显示模式切换至基于瞄具的显示模式时, 如果瞄准点 对应的区域中包括目标虚拟对象时, 可以控制视角进行转动, 将瞄准点移动至该目标虚拟对 象所在区域, 从而在显示模式切换过程中帮助用户对瞄准点附近的目标虚拟对象进行瞄准, 考虑到了显示模式切换的用户操作, 基于用户操作提供辅助瞄准服务, 而不是无视用户操作, 直接对瞄准点进行拖拽, 因而, 上述虚拟场景显示过程与用户操作密切联系, 可以满足用户 需求, 显示效果较好。
在一种可能实现方式中, 上述图 3所示实施例中, 当虚拟场景处于基于瞄具的显示模式 时才执行步骤 302, 获取目标虚拟对象的吸附区域, 提供辅助瞄准服务, 而当该虚拟场景未 处于基于瞄具的显示模式时, 不提供辅助瞄准服务。 则结合上述图 3所示实施例和图 10所示 实施例, 可以包括一种可能场景: 当该虚拟场景未处于基于瞄具的显示模式时, 用户进行视 角调整操作, 终端检测到该视角调整操作后, 可以根据该视角调整操作, 获取虚拟场景的视 角的转动速度, 从而进行视角调整, 并显示调整后的虚拟场景。 用户在终端上继续进行操作, 进行了显示模式切换操作, 将显示模式从第一显示模式切换为第二显示模式, 该第二显示模 式即为上述基于瞄具的显示模式, 在这个切换过程中, 终端可以基于瞄准点的位置, 获取对 应的目标区域, 如果该目标区域内包括目标虚拟对象, 则可以获取该虚拟场景的视角的目标 转动方向和目标转动角度, 从而调整后的虚拟场景中瞄准点对准该目标虚拟对象, 实现了举 镜操作过程中将瞄准点移动至瞄准点附近的目标虚拟对象身上的效果。 在该虚拟场景处于基 于瞄具的显示模式且检测到视角调整操作时, 如果当前终端控制的虚拟对象的瞄准点位于目 标虚拟对象的吸附区域, 则可以提供辅助瞄准, 可以综合考虑瞄准点和目标虚拟对象的位置 以及视角调整操作, 获取虚拟场景的视角的目标转动速度, 从而基于该目标转动速度进行视 角调整, 以对调整的虚拟场景进行显示。
上述所有可选技术方案, 可以采用任意结合形成本申请的可选实施例, 在此不再一一赘述。
图 13 是本申请实施例提供的一种虚拟场景显示装置的结构示意图, 参见图 13, 该装置 可以包括:
获取模块 1301, 用于当检测到视角调整操作时, 获取目标虚拟对象的吸附区域;
该获取模块 1301, 还用于当瞄准点位于该目标虚拟对象的吸附区域时, 根据该瞄准点的位置、 该目标虚拟对象的位置和该视角调整操作, 获取虚拟场景的视角的目标转动速度;
显示模块 1302, 用于基于该视角的目标转动速度, 显示目标虚拟场景。
在一种可能实现方式中, 该装置还包括:
检测模块, 用于检测虚拟场景中是否包括目标虚拟对象;
该获取模块 1301, 还用于当该虚拟场景中包括目标虚拟对象时, 执行该获取目标虚拟对 象的吸附区域的步骤;
该获取模块 1301,还用于当该虚拟场景中不包括目标虚拟对象时,根据该视角调整操作, 获取该虚拟场景的视角的第一预设转动速度。
在一种可能实现方式中, 该获取模块 1301用于:
当瞄准点位于该目标虚拟对象的吸附区域时, 根据该视角调整操作, 获取该虚拟场景的 视角的第一转动速度;
根据该瞄准点的位置和该目标虚拟对象的位置,获取该虚拟场景的视角的第二转动速度, 该第二转动速度的方向为从该瞄准点出发朝向该目标虚拟对象;
基于该第一转动速度和该第二转动速度, 获取该虚拟场景的视角的目标转动速度。
在一种可能实现方式中, 该获取模块 1301用于:
当该瞄准点位于该目标虚拟对象的第一吸附区域时, 根据该视角调整操作, 获取该虚拟 场景的视角的第一预设转动速度;
当该瞄准点位于该目标虚拟对象的第二吸附区域时, 根据该视角调整操作, 获取该虚拟 场景的视角的第二预设转动速度, 该第二预设转动速度小于第一预设转动速度;
当所述瞄准点位于该目标虚拟对象的第三吸附区域时, 根据该视角调整操作, 获取该虚 拟场景的视角的第三预设转动速度, 该第三预设转动速度小于第一预设转动速度, 该第三预 设转动速度与该第二预设转动速度不同;
其中, 该第一吸附区域包围于该第二吸附区域之外, 该第二吸附区域包围于该第三吸附 区域之外。
在一种可能实现方式中, 该获取模块 1301用于:
根据该目标虚拟对象与该瞄准点投影在终端屏幕上的距离, 获取该虚拟场景的视角的第 二转动速度; 或,
根据该目标虚拟对象与该瞄准点在虚拟场景中的距离, 获取该虚拟场景的视角的第二转 动速度; 或,
根据当前终端控制的虚拟对象与该目标虚拟对象的连线方向和该瞄准点所在方向之间的 夹角, 获取该虚拟场景的视角的第二转动速度。
在一种可能实现方式中, 该第二转动速度和该目标虚拟对象与该瞄准点投影在终端屏幕 上的距离负相关; 或, 该第二转动速度和该目标虚拟对象与该瞄准点在虚拟场景中的距离负 相关; 或, 该第二转动速度与该夹角负相关。
在一种可能实现方式中,该获取模块 1301还用于获取该当前终端控制的虚拟对象和该目 标虚拟对象的距离;
相应地,该获取模块 1301还用于根据该当前终端控制的虚拟对象和该目标虚拟对象之间 的距离, 以及该目标虚拟对象与该瞄准点投影在终端屏幕上的距离, 获取该虚拟场景的视角 的第二转动速度, 该第二转动速度与该当前终端控制的虚拟对象和该目标虚拟对象的距离负 相关; 或,
相应地,该获取模块 1301还用于根据该当前终端控制的虚拟对象和该目标虚拟对象之间 的距离, 以及该目标虚拟对象与该瞄准点在虚拟场景中的距离, 获取该虚拟场景的视角的第 二转动速度,该第二转动速度与该当前终端控制的虚拟对象和该目标虚拟对象的距离负相关。 在一种可能实现方式中, 该获取模块 1301还用于:
当该瞄准点位于该目标虚拟对象的第一吸附区域或第二吸附区域时, 执行该根据该瞄准 点的位置和该目标虚拟对象的位置, 获取该虚拟场景的视角的第二转动速度的步骤;
当该瞄准点位于该目标虚拟对象的第三吸附区域时, 将该第一转动速度作为该虚拟场景 的视角的目标转动速度。
在一种可能实现方式中, 该获取模块 1301还用于根据该瞄准点的位置、该目标虚拟对象 的位置以及该视角调整操作的操作方向, 获取该虚拟场景的视角的第二转动速度。
在一种可能实现方式中, 该获取模块 1301还用于:
当该视角调整操作的操作方向指示该瞄准点向该目标虚拟对象移动时, 根据该瞄准点的 位置、 该目标虚拟对象的位置和第一参数, 获取得到第三转动速度作为该虚拟场景的视角的 第二转动速度;
当该视角调整操作的操作方向指示该瞄准点向该目标虚拟对象的反方向移动时, 根据该 瞄准点的位置、 该目标虚拟对象的位置和第二参数, 获取得到第四转动速度作为该虚拟场景 的视角的第二转动速度, 该第四转动速度小于该第三转动速度。
在一种可能实现方式中, 该装置还包括:
控制模块, 用于当该瞄准点位于该目标虚拟对象的第四吸附区域且该目标虚拟对象发生 移动时, 控制该瞄准点跟随该目标虚拟对象进行移动。
在一种可能实现方式中, 该获取模块 1301还用于:
当检测到视角调整操作时, 获取虚拟场景的显示模式;
当该虚拟场景处于基于瞄具的显示模式时,执行该获取目标虚拟对象的吸附区域的步骤; 当该虚拟场景未处于基于瞄具的显示模式时, 根据该视角调整操作, 获取该虚拟场景的 视角的第一转动速度。
在一种可能实现方式中,该获取模块 1301还用于根据当前终端控制的虚拟对象和该目标 虚拟对象之间的距离, 获取该目标虚拟对象的吸附区域, 该吸附区域的尺寸与该距离正相关。
在一种可能实现方式中, 该装置还包括:
选择模块, 用于当该瞄准点位于多个候选虚拟对象的吸附区域时, 从该多个候选虚拟对 象中, 选择一个候选虚拟对象作为目标虚拟对象;
该获取模块 1301用于基于选中的该目标虚拟对象, 执行该根据该瞄准点的位置、该目标 虚拟对象的位置和该视角调整操作, 获取该虚拟场景的视角的目标转动速度的步骤。
在一种可能实现方式中, 该选择模块用于:
从该多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据该多个候选虚拟对象与该瞄准点在虚拟场景中的距离, 选择与该瞄准点距离最小的 候选虚拟对象作为该目标虚拟对象; 或,
根据该多个候选虚拟对象与该瞄准点投影在终端屏幕上的位置之间的距离, 选择与该瞄 准点距离最小的候选虚拟对象作为目标虚拟对象; 或,
根据该多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和该瞄准点所在方向之 间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
本申请实施例提供的装置, 在检测到视角调整操作时, 如果瞄准点在目标虚拟对象的吸 附区域, 可以提供辅助瞄准, 综合视角调整操作以及瞄准点和目标虚拟对象的位置获取虚拟 场景的视角的转动速度, 从而进行视角调整, 对虚拟场景进行显示, 上述过程中考虑到了视 角调整操作, 在尊重用户操作的基础上提供了辅助瞄准, 可以避免出现虚拟场景显示与用户 操作脱离的情况, 这样可以满足用户需求, 既尊重了用户操作, 也提供了辅助效果, 显示效 果较好。
需要说明的是: 上述实施例提供的虚拟场景显示装置在显示虚拟场景时, 仅以上述各功能模块的划分进行举例说明, 实际应用中, 可以根据需要而将上述功能分配由不同的功能模块完成, 即将电子设备的内部结构划分成不同的功能模块, 以完成以上描述的全部或者部分功能。 另外, 上述实施例提供的虚拟场景显示装置与虚拟场景显示方法实施例属于同一构思, 其具体实现过程详见方法实施例, 这里不再赘述。
图 14是本申请实施例提供的一种虚拟场景显示装置的结构示意图, 参见图 14, 该装置 可以包括:
获取模块 1401, 用于当检测到虚拟场景从第一显示模式切换至第二显示模式时, 获取瞄 准点对应的目标区域, 该第二显示模式为基于瞄具的显示模式, 该第一显示模式为除该第二 显示模式之外的显示模式;
该获取模块 1401, 还用于当该目标区域内包括目标虚拟对象时, 根据该目标虚拟对象的 位置和该瞄准点的位置, 获取该虚拟场景的视角的目标转动方向和目标转动角度;
显示模块 1402, 用于基于该视角的目标转动方向和目标转动角度, 显示目标虚拟场景, 该目标虚拟场景中该瞄准点位于该目标虚拟对象所在区域。
在一种可能实现方式中, 该获取模块 1401用于:
获取该目标虚拟对象的目标位置;
根据该瞄准点的位置和该目标位置, 获取该虚拟场景的视角的目标转动方向和目标转动 角度。
在一种可能实现方式中, 该获取模块 1401用于:
根据该瞄准点和该目标虚拟对象投影在终端屏幕上的位置在水平方向上的关系, 获取该 目标虚拟对象的目标位置; 或,
根据该瞄准点和该目标虚拟对象在该虚拟场景中的水平位置的关系, 获取该目标虚拟对 象的目标位置; 或,
获取该目标虚拟对象的目标部位所在位置作为该目标位置。
在一种可能实现方式中, 该获取模块 1401用于:
当该瞄准点的投影位置在水平方向上的水平位置位于该目标虚拟对象的投影位置在水平 方向上的水平位置范围内时, 将该水平位置范围中与该瞄准点的水平位置相同的位置作为该 目标虚拟对象的目标位置; 或,
当该瞄准点的投影位置在水平方向上的水平位置位于该目标虚拟对象的投影位置在水平 方向上的水平位置范围外时, 将该瞄准点的水平位置对应的位置或该目标虚拟对象的目标部 位所在位置作为该目标虚拟对象的目标位置。
在一种可能实现方式中, 该获取模块 1401用于:
将从该瞄准点出发、 朝向该目标位置的方向作为该目标转动方向;
将该瞄准点所在方向和该目标位置与当前终端控制的虚拟对象的连线方向之间的夹角作为该目标转动角度。
在一种可能实现方式中, 该获取模块 1401用于获取以该瞄准点为中心、 尺寸为预设尺寸 的目标区域。
在一种可能实现方式中, 该装置还包括:
选择模块, 用于当该目标区域内包括多个候选虚拟对象时, 从该多个候选虚拟对象中, 选择一个候选虚拟对象作为该目标虚拟对象;
该获取模块 1401还用于基于选中的该目标虚拟对象,执行该根据该目标虚拟对象的位置 和该瞄准点的位置, 获取该虚拟场景的视角的目标转动方向和目标转动角度的步骤。
在一种可能实现方式中, 该选择模块用于:
从该多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据该多个候选虚拟对象与该瞄准点在虚拟场景中的距离, 选择与该瞄准点距离最小的 候选虚拟对象作为该目标虚拟对象; 或, 根据该多个候选虚拟对象与该瞄准点投影在终端屏幕上的位置之间的距离, 选择与该瞄 准点距离最小的候选虚拟对象作为目标虚拟对象; 或,
根据该多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和该瞄准点所在方向之 间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
本申请实施例提供的装置, 在将虚拟场景的显示模式切换至基于瞄具的显示模式时, 如 果瞄准点对应的区域中包括目标虚拟对象时, 可以控制视角进行转动, 将瞄准点移动至该目 标虚拟对象所在区域, 从而在显示模式切换过程中帮助用户对瞄准点附近的目标虚拟对象进 行瞄准, 考虑到了显示模式切换的用户操作, 基于用户操作提供辅助瞄准服务, 而不是无视 用户操作, 直接对瞄准点进行拖拽, 因而, 上述虚拟场景显示过程与用户操作密切联系, 可 以满足用户需求, 显示效果较好。
需要说明的是: 上述实施例提供的虚拟场景显示装置在显示虚拟场景时, 仅以上述各功能模块的划分进行举例说明, 实际应用中, 可以根据需要而将上述功能分配由不同的功能模块完成, 即将电子设备的内部结构划分成不同的功能模块, 以完成以上描述的全部或者部分功能。 另外, 上述实施例提供的虚拟场景显示装置与虚拟场景显示方法实施例属于同一构思, 其具体实现过程详见方法实施例, 这里不再赘述。
图 15是本申请实施例提供的一种电子设备的结构示意图, 该电子设备 1500可因配置或性能不同而产生比较大的差异, 可以包括一个或一个以上处理器 (central processing units, CPU) 1501和一个或一个以上的存储器 1502。 该存储器 1502中存储有至少一条指令。
在一种可能实现方式中, 该至少一条指令由该处理器 1501加载并执行以实现下述方法: 当检测到视角调整操作时, 获取目标虚拟对象的吸附区域; 当瞄准点位于所述目标虚拟对象 的吸附区域时, 根据所述瞄准点的位置、 所述目标虚拟对象的位置和所述视角调整操作, 获 取虚拟场景的视角的目标转动速度; 基于所述视角的目标转动速度, 显示目标虚拟场景。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 当瞄准点 位于所述目标虚拟对象的吸附区域时, 根据所述视角调整操作, 获取所述虚拟场景的视角的 第一转动速度; 根据所述瞄准点的位置和所述目标虚拟对象的位置, 获取所述虚拟场景的视 角的第二转动速度, 所述第二转动速度的方向为从所述瞄准点出发朝向所述目标虚拟对象; 基于所述第一转动速度和所述第二转动速度, 获取所述虚拟场景的视角的目标转动速度。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 当所述瞄 准点位于所述目标虚拟对象的第一吸附区域时, 根据所述视角调整操作, 获取所述虚拟场景 的视角的第一预设转动速度; 当所述瞄准点位于所述目标虚拟对象的第二吸附区域时, 根据 所述视角调整操作, 获取所述虚拟场景的视角的第二预设转动速度, 所述第二预设转动速度 小于第一预设转动速度; 当所述瞄准点位于所述目标虚拟对象的第三吸附区域时, 根据所述 视角调整操作, 获取所述虚拟场景的视角的第三预设转动速度, 所述第三预设转动速度小于 第一预设转动速度, 所述第三预设转动速度与所述第二预设转动速度不同; 其中, 所述第一 吸附区域包围于所述第二吸附区域之外, 所述第二吸附区域包围于所述第三吸附区域之外。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 根据所 述目标虚拟对象与所述瞄准点投影在终端屏幕上的距离, 获取所述虚拟场景的视角的第二转 动速度; 或, 根据所述目标虚拟对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟场景 的视角的第二转动速度; 或, 根据当前终端控制的虚拟对象与所述目标虚拟对象的连线方向 和所述瞄准点所在方向之间的夹角, 获取所述虚拟场景的视角的第二转动速度。
可选地, 所述第二转动速度和所述目标虚拟对象与所述瞄准点投影在终端屏幕上的距离 负相关; 或, 所述第二转动速度和所述目标虚拟对象与所述瞄准点在虚拟场景中的距离负相 关; 或, 所述第二转动速度与所述夹角负相关。
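"第二转动速度与距离 (或夹角) 负相关" 的一种最简单的示意取法如下 (反比例函数仅为举例, 文中只要求负相关, 并未限定具体函数形式, 系数 k 为假设参数):

```python
def second_rotation_speed(distance, k=1.0):
    """第二转动速度随瞄准点与目标虚拟对象之间的距离增大而减小,
    即与该距离负相关(示意实现)。distance 可以是屏幕投影距离、
    虚拟场景中的距离或夹角大小。"""
    return k / (distance + 1.0)
```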
可选地, 所述至少一条指令还由所述一个或多个处理器 1501加载并执行以实现: 获取所述当前终端控制的虚拟对象和所述目标虚拟对象的距离; 相应地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 根据所述当前终端控制的虚拟对象和所述目标虚拟对象之间的距离, 以及所述目标虚拟对象与所述瞄准点投影在终端屏幕上的距离, 获取所述虚拟场景的视角的第二转动速度, 所述第二转动速度与所述当前终端控制的虚拟对象和所述目标虚拟对象的距离负相关; 或, 相应地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 根据所述当前终端控制的虚拟对象和所述目标虚拟对象之间的距离, 以及所述目标虚拟对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟场景的视角的第二转动速度, 所述第二转动速度与所述当前终端控制的虚拟对象和所述目标虚拟对象的距离负相关。
可选地, 所述至少一条指令还由所述一个或多个处理器 1501加载并执行以实现: 当所述瞄准点位于所述目标虚拟对象的第一吸附区域或第二吸附区域时, 执行所述根据 所述瞄准点的位置和所述目标虚拟对象的位置, 获取所述虚拟场景的视角的第二转动速度的 步骤; 当所述瞄准点位于所述目标虚拟对象的第三吸附区域时, 将所述第一转动速度作为所 述虚拟场景的视角的目标转动速度。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 根据所述瞄准点的位置、 所述目标虚拟对象的位置以及所述视角调整操作的操作方向, 获取所述虚拟场景的视角的第二转动速度。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 当所述视 角调整操作的操作方向指示所述瞄准点向所述目标虚拟对象移动时,根据所述瞄准点的位置、 所述目标虚拟对象的位置和第一参数, 获取得到第三转动速度作为所述虚拟场景的视角的第 二转动速度; 当所述视角调整操作的操作方向指示所述瞄准点向所述目标虚拟对象的反方向 移动时, 根据所述瞄准点的位置、 所述目标虚拟对象的位置和第二参数, 获取得到第四转动 速度作为所述虚拟场景的视角的第二转动速度, 所述第四转动速度小于所述第三转动速度。
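操作方向朝向或背离目标虚拟对象时, 分别采用第一参数与第二参数计算第二转动速度, 可以示意如下 (两个参数的取值为假设, 仅保证背离目标时得到的第四转动速度小于朝向目标时的第三转动速度):

```python
def second_rotation_speed_directional(distance, toward_target,
                                      k_toward=1.0, k_away=0.4):
    """操作方向指示瞄准点向目标虚拟对象移动时使用第一参数 k_toward,
    向目标的反方向移动时使用第二参数 k_away (k_away < k_toward),
    使背离目标时的第二转动速度较小(示意实现)。"""
    k = k_toward if toward_target else k_away
    return k / (distance + 1.0)
```

这样, 用户有意离开目标时吸附作用较弱, 不会与用户操作相抵触。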
可选地, 所述至少一条指令还由所述一个或多个处理器 1501加载并执行以实现: 当检测 到视角调整操作时, 获取虚拟场景的显示模式; 当所述虚拟场景处于基于瞄具的显示模式时, 执行所述获取目标虚拟对象的吸附区域的步骤; 当所述虚拟场景未处于基于瞄具的显示模式 时, 根据所述视角调整操作, 获取所述虚拟场景的视角的第一转动速度。
可选地, 所述至少一条指令还由所述一个或多个处理器 1501加载并执行以实现: 当所述 瞄准点位于多个候选虚拟对象的吸附区域时, 从所述多个候选虚拟对象中, 选择一个候选虚 拟对象作为目标虚拟对象; 基于选中的所述目标虚拟对象, 执行所述根据所述瞄准点的位置、 所述目标虚拟对象的位置和所述视角调整操作,获取虚拟场景的视角的目标转动速度的步骤。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 从所述多 个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据所述多个候选 虚拟对象与所述瞄准点在虚拟场景中的距离, 选择与所述瞄准点距离最小的候选虚拟对象作 为所述目标虚拟对象; 或, 根据所述多个候选虚拟对象与所述瞄准点投影在终端屏幕上的位 置之间的距离, 选择与所述瞄准点距离最小的候选虚拟对象作为目标虚拟对象; 或, 根据所 述多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和所述瞄准点所在方向之间的夹 角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
在另一种可能实现方式中, 该至少一条指令由该处理器 1501加载并执行以实现下述方法: 当检测到虚拟场景从第一显示模式切换至第二显示模式时, 获取瞄准点对应的目标区域, 所述第二显示模式为基于瞄具的显示模式, 所述第一显示模式为除所述第二显示模式之外的显示模式; 当所述目标区域内包括目标虚拟对象时, 根据所述目标虚拟对象的位置和所述瞄准点的位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度; 基于所述视角的目标转动方向和目标转动角度, 显示目标虚拟场景, 所述目标虚拟场景中所述瞄准点位于所述目标虚拟对象所在区域。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 获取所述目标虚拟对象的目标位置; 根据所述瞄准点的位置和所述目标位置, 获取所述 虚拟场景的视角的目标转动方向和目标转动角度。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 根据所述瞄准点和所述目标虚拟对象投影在终端屏幕上的位置在水平方向上的关系, 获 取所述目标虚拟对象的目标位置; 或, 根据所述瞄准点和所述目标虚拟对象在所述虚拟场景 中的水平位置的关系, 获取所述目标虚拟对象的目标位置; 或, 获取所述目标虚拟对象的目 标部位所在位置作为所述目标位置。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 当所述瞄 准点的投影位置在水平方向上的水平位置位于所述目标虚拟对象的投影位置在水平方向上的 水平位置范围内时, 将所述水平位置范围中与所述瞄准点的水平位置相同的位置作为所述目 标虚拟对象的目标位置; 或, 当所述瞄准点的投影位置在水平方向上的水平位置位于所述目 标虚拟对象的投影位置在水平方向上的水平位置范围外时, 将所述瞄准点的水平位置对应的 位置或所述目标虚拟对象的目标部位所在位置作为所述目标虚拟对象的目标位置。
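根据瞄准点与目标虚拟对象投影的水平位置关系确定目标位置的逻辑, 可以示意如下 (这里假设屏幕坐标为二维元组, 且范围内取点时纵坐标沿用目标部位所在位置的纵坐标, 该取法为假设):

```python
def target_position(aim_x, target_x_min, target_x_max, part_pos):
    """当瞄准点的水平位置落在目标虚拟对象投影的水平位置范围
    [target_x_min, target_x_max] 内时, 取该范围中与瞄准点水平位置
    相同的点作为目标位置; 否则取目标部位所在位置 part_pos(示意实现)。"""
    if target_x_min <= aim_x <= target_x_max:
        return (aim_x, part_pos[1])  # 水平位置与瞄准点相同
    return part_pos
```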
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 将从所述瞄准点出发、 朝向所述目标位置的方向作为所述目标转动方向; 将所述瞄准点所在方向和所述目标位置与所述当前终端控制的虚拟对象的连线方向之间的夹角作为所述目标转动角度。
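目标转动角度即瞄准点所在方向与 "当前终端控制的虚拟对象指向目标位置" 的连线方向之间的夹角, 二维情形下可按向量夹角计算, 示意如下 (仅为说明性实现):

```python
import math

def target_rotation_angle(aim_dir, to_target_dir):
    """计算瞄准点所在方向 aim_dir 与指向目标位置的连线方向
    to_target_dir 之间的夹角(弧度), 作为目标转动角度(二维示意实现)。"""
    ax, ay = aim_dir
    tx, ty = to_target_dir
    cos_theta = (ax * tx + ay * ty) / (math.hypot(ax, ay) * math.hypot(tx, ty))
    # 数值误差可能使 cos_theta 略微超出 [-1, 1], 先截断再取反余弦
    return math.acos(max(-1.0, min(1.0, cos_theta)))
```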
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 获取以所述瞄准点为中心、 尺寸为预设尺寸的目标区域。
可选地, 所述至少一条指令还由所述一个或多个处理器 1501加载并执行以实现: 当所述 目标区域内包括多个候选虚拟对象时, 从所述多个候选虚拟对象中, 选择一个候选虚拟对象 作为所述目标虚拟对象; 基于选中的所述目标虚拟对象, 执行所述根据所述目标虚拟对象的 位置和所述瞄准点的位置,获取所述虚拟场景的视角的目标转动方向和目标转动角度的步骤。
可选地, 所述至少一条指令由所述一个或多个处理器 1501加载并执行以实现: 从所述多 个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据所述多个候选 虚拟对象与所述瞄准点在虚拟场景中的距离, 选择与所述瞄准点距离最小的候选虚拟对象作 为所述目标虚拟对象; 或, 根据所述多个候选虚拟对象与所述瞄准点投影在终端屏幕上的位 置之间的距离, 选择与所述瞄准点距离最小的候选虚拟对象作为目标虚拟对象; 或, 根据所 述多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和所述瞄准点所在方向之间的夹 角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
当然, 该电子设备还可以具有有线或无线网络接口、 键盘以及输入输出接口等部件, 以便进行输入输出, 该电子设备还可以包括其他用于实现设备功能的部件, 在此不做赘述。
在示例性实施例中, 还提供了一种计算机可读存储介质, 例如包括指令的存储器, 上述 指令可由处理器执行以完成上述实施例中的虚拟场景显示方法。 例如, 该计算机可读存储介 质可以是只读存储器 (Read-Only Memory, ROM)、 随机存取存储器 (Random Access Memory, RAM)、 只读光盘 (Compact Disc Read-Only Memory, CD-ROM)、 磁带、 软盘和光数据存储 设备等。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成, 也可以通过程序来指令相关的硬件完成, 该程序可以存储于一种计算机可读存储介质中, 上 述提到的存储介质可以是只读存储器, 磁盘或光盘等。
上述仅为本申请的较佳实施例, 并不用以限制本申请, 凡在本申请的精神和原则之内, 所作的任何修改、 等同替换、 改进等, 均应包含在本申请的保护范围之内。

Claims

权 利 要 求 书
1、 一种虚拟场景显示方法, 其特征在于, 应用于电子设备, 所述方法包括:
当检测到视角调整操作时, 获取目标虚拟对象的吸附区域;
当瞄准点位于所述目标虚拟对象的吸附区域时, 根据所述瞄准点的位置、 所述目标虚拟 对象的位置和所述视角调整操作, 获取虚拟场景的视角的目标转动速度;
基于所述视角的目标转动速度, 显示目标虚拟场景。
2、 根据权利要求 1所述的方法, 其特征在于, 所述当瞄准点位于所述目标虚拟对象的吸 附区域时, 根据所述瞄准点的位置、 所述目标虚拟对象的位置和所述视角调整操作, 获取虚 拟场景的视角的目标转动速度, 包括:
当瞄准点位于所述目标虚拟对象的吸附区域时, 根据所述视角调整操作, 获取所述虚拟 场景的视角的第一转动速度;
根据所述瞄准点的位置和所述目标虚拟对象的位置, 获取所述虚拟场景的视角的第二转 动速度, 所述第二转动速度的方向为从所述瞄准点出发朝向所述目标虚拟对象;
基于所述第一转动速度和所述第二转动速度,获取所述虚拟场景的视角的目标转动速度。
3、 根据权利要求 2所述的方法, 其特征在于, 所述根据所述视角调整操作, 获取所述虚 拟场景的视角的第一转动速度, 包括:
当所述瞄准点位于所述目标虚拟对象的第一吸附区域时, 根据所述视角调整操作, 获取 所述虚拟场景的视角的第一预设转动速度;
当所述瞄准点位于所述目标虚拟对象的第二吸附区域时, 根据所述视角调整操作, 获取 所述虚拟场景的视角的第二预设转动速度, 所述第二预设转动速度小于第一预设转动速度; 当所述瞄准点位于所述目标虚拟对象的第三吸附区域时, 根据所述视角调整操作, 获取 所述虚拟场景的视角的第三预设转动速度, 所述第三预设转动速度小于第一预设转动速度, 所述第三预设转动速度与所述第二预设转动速度不同;
其中, 所述第一吸附区域包围于所述第二吸附区域之外, 所述第二吸附区域包围于所述 第三吸附区域之外。
4、 根据权利要求 2所述的方法, 其特征在于, 所述根据所述瞄准点的位置和所述目标虚 拟对象的位置, 获取所述虚拟场景的视角的第二转动速度, 包括:
根据所述目标虚拟对象与所述瞄准点投影在终端屏幕上的距离, 获取所述虚拟场景的视 角的第二转动速度; 或,
根据所述目标虚拟对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟场景的视角的 第二转动速度; 或,
根据当前终端控制的虚拟对象与所述目标虚拟对象的连线方向和所述瞄准点所在方向之 间的夹角, 获取所述虚拟场景的视角的第二转动速度。
5、 根据权利要求 4所述的方法, 其特征在于, 所述第二转动速度和所述目标虚拟对象与 所述瞄准点投影在终端屏幕上的距离负相关; 或, 所述第二转动速度和所述目标虚拟对象与 所述瞄准点在虚拟场景中的距离负相关; 或, 所述第二转动速度与所述夹角负相关。
6、 根据权利要求 4所述的方法, 其特征在于, 所述方法还包括:
获取所述当前终端控制的虚拟对象和所述目标虚拟对象的距离;
相应地, 所述根据所述目标虚拟对象与所述瞄准点投影在终端屏幕上的距离, 获取所述 虚拟场景的视角的第二转动速度, 包括:
根据所述当前终端控制的虚拟对象和所述目标虚拟对象之间的距离, 以及所述目标虚拟 对象与所述瞄准点投影在终端屏幕上的距离, 获取所述虚拟场景的视角的第二转动速度, 所 述第二转动速度与所述当前终端控制的虚拟对象和所述目标虚拟对象的距离负相关; 或, 相应地, 所述根据所述目标虚拟对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟 场景的视角的第二转动速度, 包括:
根据所述当前终端控制的虚拟对象和所述目标虚拟对象之间的距离, 以及所述目标虚拟 对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟场景的视角的第二转动速度, 所述第 二转动速度与所述当前终端控制的虚拟对象和所述目标虚拟对象的距离负相关。
7、 根据权利要求 2所述的方法, 其特征在于, 所述方法还包括:
当所述瞄准点位于所述目标虚拟对象的第一吸附区域或第二吸附区域时, 执行所述根据 所述瞄准点的位置和所述目标虚拟对象的位置, 获取所述虚拟场景的视角的第二转动速度的 步骤;
当所述瞄准点位于所述目标虚拟对象的第三吸附区域时, 将所述第一转动速度作为所述 虚拟场景的视角的目标转动速度。
8、 根据权利要求 2所述的方法, 其特征在于, 所述根据所述瞄准点的位置和所述目标虚 拟对象的位置, 获取所述虚拟场景的视角的第二转动速度, 包括:
根据所述瞄准点的位置、 所述目标虚拟对象的位置以及所述视角调整操作的操作方向, 获取所述虚拟场景的视角的第二转动速度。
9、 根据权利要求 8所述的方法, 其特征在于, 所述根据所述瞄准点的位置、 所述目标虚 拟对象的位置以及所述视角调整操作的操作方向,获取所述虚拟场景的视角的第二转动速度, 包括:
当所述视角调整操作的操作方向指示所述瞄准点向所述目标虚拟对象移动时, 根据所述 瞄准点的位置、 所述目标虚拟对象的位置和第一参数, 获取得到第三转动速度作为所述虚拟 场景的视角的第二转动速度;
当所述视角调整操作的操作方向指示所述瞄准点向所述目标虚拟对象的反方向移动时, 根据所述瞄准点的位置、 所述目标虚拟对象的位置和第二参数, 获取得到第四转动速度作为 所述虚拟场景的视角的第二转动速度, 所述第四转动速度小于所述第三转动速度。
10、 根据权利要求 1所述的方法, 其特征在于, 所述方法还包括:
当检测到视角调整操作时, 获取虚拟场景的显示模式;
当所述虚拟场景处于基于瞄具的显示模式时, 执行所述获取目标虚拟对象的吸附区域的 步骤;
当所述虚拟场景未处于基于瞄具的显示模式时, 根据所述视角调整操作, 获取所述虚拟 场景的视角的第一转动速度。
11、 根据权利要求 1所述的方法, 其特征在于, 所述方法还包括:
当所述瞄准点位于多个候选虚拟对象的吸附区域时, 从所述多个候选虚拟对象中, 选择 一个候选虚拟对象作为目标虚拟对象;
基于选中的所述目标虚拟对象, 执行所述根据所述瞄准点的位置、 所述目标虚拟对象的 位置和所述视角调整操作, 获取虚拟场景的视角的目标转动速度的步骤。
12、 根据权利要求 11所述的方法, 其特征在于, 所述从所述多个候选虚拟对象中, 选择 一个候选虚拟对象作为目标虚拟对象, 包括:
从所述多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据所述多个候选虚拟对象与所述瞄准点在虚拟场景中的距离, 选择与所述瞄准点距离 最小的候选虚拟对象作为所述目标虚拟对象; 或,
根据所述多个候选虚拟对象与所述瞄准点投影在终端屏幕上的位置之间的距离, 选择与 所述瞄准点距离最小的候选虚拟对象作为目标虚拟对象; 或,
根据所述多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和所述瞄准点所在方 向之间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
13、 一种虚拟场景显示方法, 其特征在于, 应用于电子设备, 所述方法包括: 当检测到虚拟场景从第一显示模式切换至第二显示模式时,获取瞄准点对应的目标区域, 所述第二显示模式为基于瞄具的显示模式, 所述第一显示模式为除所述第二显示模式之外的 显示模式;
当所述目标区域内包括目标虚拟对象时, 根据所述目标虚拟对象的位置和所述瞄准点的 位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度;
基于所述视角的目标转动方向和目标转动角度, 显示目标虚拟场景, 所述目标虚拟场景 中所述瞄准点位于所述目标虚拟对象所在区域。
14、 根据权利要求 13所述的方法, 其特征在于, 所述根据所述目标虚拟对象的位置和所 述瞄准点的位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度, 包括:
获取所述目标虚拟对象的目标位置;
根据所述瞄准点的位置和所述目标位置, 获取所述虚拟场景的视角的目标转动方向和目 标转动角度。
15、根据权利要求 14所述的方法, 其特征在于, 所述获取所述目标虚拟对象的目标位置, 包括:
根据所述瞄准点和所述目标虚拟对象投影在终端屏幕上的位置在水平方向上的关系, 获 取所述目标虚拟对象的目标位置; 或,
根据所述瞄准点和所述目标虚拟对象在所述虚拟场景中的水平位置的关系, 获取所述目 标虚拟对象的目标位置; 或,
获取所述目标虚拟对象的目标部位所在位置作为所述目标位置。
16、 根据权利要求 15所述的方法, 其特征在于, 所述根据所述瞄准点和所述目标虚拟对 象投影在终端屏幕上的位置在水平方向上的关系, 获取所述目标虚拟对象的目标位置, 包括: 当所述瞄准点的投影位置在水平方向上的水平位置位于所述目标虚拟对象的投影位置在 水平方向上的水平位置范围内时, 将所述水平位置范围中与所述瞄准点的水平位置相同的位 置作为所述目标虚拟对象的目标位置; 或,
当所述瞄准点的投影位置在水平方向上的水平位置位于所述目标虚拟对象的投影位置在 水平方向上的水平位置范围外时, 将所述瞄准点的水平位置对应的位置或所述目标虚拟对象 的目标部位所在位置作为所述目标虚拟对象的目标位置。
17、 根据权利要求 14所述的方法, 其特征在于, 所述根据所述瞄准点的位置和所述目标 位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度, 包括:
将从所述瞄准点出发、 朝向所述目标位置的方向作为所述目标转动方向;
将所述瞄准点所在方向和所述目标位置与所述当前终端控制的虚拟对象的连线方向之间的夹角作为所述目标转动角度。
18、 根据权利要求 13所述的方法, 其特征在于, 所述获取瞄准点对应的目标区域, 包括: 获取以所述瞄准点为中心、 尺寸为预设尺寸的目标区域。
19、 根据权利要求 13所述的方法, 其特征在于, 所述方法还包括:
当所述目标区域内包括多个候选虚拟对象时, 从所述多个候选虚拟对象中, 选择一个候 选虚拟对象作为所述目标虚拟对象;
基于选中的所述目标虚拟对象, 执行所述根据所述目标虚拟对象的位置和所述瞄准点的 位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度的步骤。
20、 根据权利要求 19所述的方法, 其特征在于, 所述从所述多个候选虚拟对象中, 选择 一个候选虚拟对象作为所述目标虚拟对象, 包括:
从所述多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据所述多个候选虚拟对象与所述瞄准点在虚拟场景中的距离, 选择与所述瞄准点距离 最小的候选虚拟对象作为所述目标虚拟对象; 或,
根据所述多个候选虚拟对象与所述瞄准点投影在终端屏幕上的位置之间的距离, 选择与 所述瞄准点距离最小的候选虚拟对象作为目标虚拟对象; 或,
根据所述多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和所述瞄准点所在方 向之间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
21、 一种电子设备, 其特征在于, 所述电子设备包括一个或多个处理器和一个或多个存 储器, 所述一个或多个存储器中存储有至少一条指令, 所述至少一条指令由所述一个或多个 处理器加载并执行以实现如下方法:
当检测到视角调整操作时, 获取目标虚拟对象的吸附区域;
当瞄准点位于所述目标虚拟对象的吸附区域时, 根据所述瞄准点的位置、 所述目标虚拟 对象的位置和所述视角调整操作, 获取虚拟场景的视角的目标转动速度;
基于所述视角的目标转动速度, 显示目标虚拟场景。
22、 根据权利要求 21所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
当瞄准点位于所述目标虚拟对象的吸附区域时, 根据所述视角调整操作, 获取所述虚拟 场景的视角的第一转动速度;
根据所述瞄准点的位置和所述目标虚拟对象的位置, 获取所述虚拟场景的视角的第二转 动速度, 所述第二转动速度的方向为从所述瞄准点出发朝向所述目标虚拟对象;
基于所述第一转动速度和所述第二转动速度,获取所述虚拟场景的视角的目标转动速度。
23、 根据权利要求 22所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
当所述瞄准点位于所述目标虚拟对象的第一吸附区域时, 根据所述视角调整操作, 获取 所述虚拟场景的视角的第一预设转动速度;
当所述瞄准点位于所述目标虚拟对象的第二吸附区域时, 根据所述视角调整操作, 获取 所述虚拟场景的视角的第二预设转动速度, 所述第二预设转动速度小于第一预设转动速度; 当所述瞄准点位于所述目标虚拟对象的第三吸附区域时, 根据所述视角调整操作, 获取 所述虚拟场景的视角的第三预设转动速度, 所述第三预设转动速度小于第一预设转动速度, 所述第三预设转动速度与所述第二预设转动速度不同;
其中, 所述第一吸附区域包围于所述第二吸附区域之外, 所述第二吸附区域包围于所述 第三吸附区域之外。
24、 根据权利要求 22所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
根据所述目标虚拟对象与所述瞄准点投影在终端屏幕上的距离, 获取所述虚拟场景的视 角的第二转动速度; 或,
根据所述目标虚拟对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟场景的视角的 第二转动速度; 或,
根据当前终端控制的虚拟对象与所述目标虚拟对象的连线方向和所述瞄准点所在方向之 间的夹角, 获取所述虚拟场景的视角的第二转动速度。
25、 根据权利要求 24所述的电子设备, 其特征在于, 所述第二转动速度和所述目标虚拟 对象与所述瞄准点投影在终端屏幕上的距离负相关; 或, 所述第二转动速度和所述目标虚拟 对象与所述瞄准点在虚拟场景中的距离负相关; 或, 所述第二转动速度与所述夹角负相关。
26、 根据权利要求 24所述的电子设备, 其特征在于, 所述至少一条指令还由所述一个或 多个处理器加载并执行以实现:
获取所述当前终端控制的虚拟对象和所述目标虚拟对象的距离;
相应地, 所述至少一条指令由所述一个或多个处理器加载并执行以实现:
根据所述当前终端控制的虚拟对象和所述目标虚拟对象之间的距离, 以及所述目标虚拟 对象与所述瞄准点投影在终端屏幕上的距离, 获取所述虚拟场景的视角的第二转动速度, 所 述第二转动速度与所述当前终端控制的虚拟对象和所述目标虚拟对象的距离负相关; 或, 相应地, 所述至少一条指令由所述一个或多个处理器加载并执行以实现:
根据所述当前终端控制的虚拟对象和所述目标虚拟对象之间的距离, 以及所述目标虚拟 对象与所述瞄准点在虚拟场景中的距离, 获取所述虚拟场景的视角的第二转动速度, 所述第 二转动速度与所述当前终端控制的虚拟对象和所述目标虚拟对象的距离负相关。
27、 根据权利要求 22所述的电子设备, 其特征在于, 所述至少一条指令还由所述一个或 多个处理器加载并执行以实现:
当所述瞄准点位于所述目标虚拟对象的第一吸附区域或第二吸附区域时, 执行所述根据 所述瞄准点的位置和所述目标虚拟对象的位置, 获取所述虚拟场景的视角的第二转动速度的 步骤;
当所述瞄准点位于所述目标虚拟对象的第三吸附区域时, 将所述第一转动速度作为所述 虚拟场景的视角的目标转动速度。
28、 根据权利要求 22所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
根据所述瞄准点的位置、 所述目标虚拟对象的位置以及所述视角调整操作的操作方向, 获取所述虚拟场景的视角的第二转动速度。
29、 根据权利要求 28所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
当所述视角调整操作的操作方向指示所述瞄准点向所述目标虚拟对象移动时, 根据所述 瞄准点的位置、 所述目标虚拟对象的位置和第一参数, 获取得到第三转动速度作为所述虚拟 场景的视角的第二转动速度;
当所述视角调整操作的操作方向指示所述瞄准点向所述目标虚拟对象的反方向移动时, 根据所述瞄准点的位置、 所述目标虚拟对象的位置和第二参数, 获取得到第四转动速度作为 所述虚拟场景的视角的第二转动速度, 所述第四转动速度小于所述第三转动速度。
30、 根据权利要求 21所述的电子设备, 其特征在于, 所述至少一条指令还由所述一个或 多个处理器加载并执行以实现:
当检测到视角调整操作时, 获取虚拟场景的显示模式;
当所述虚拟场景处于基于瞄具的显示模式时, 执行所述获取目标虚拟对象的吸附区域的 步骤;
当所述虚拟场景未处于基于瞄具的显示模式时, 根据所述视角调整操作, 获取所述虚拟 场景的视角的第一转动速度。
31、 根据权利要求 21所述的电子设备, 其特征在于, 所述至少一条指令还由所述一个或 多个处理器加载并执行以实现:
当所述瞄准点位于多个候选虚拟对象的吸附区域时, 从所述多个候选虚拟对象中, 选择 一个候选虚拟对象作为目标虚拟对象;
基于选中的所述目标虚拟对象, 执行所述根据所述瞄准点的位置、 所述目标虚拟对象的 位置和所述视角调整操作, 获取虚拟场景的视角的目标转动速度的步骤。
32、 根据权利要求 31所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现: 从所述多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据所述多个候选虚拟对象与所述瞄准点在虚拟场景中的距离, 选择与所述瞄准点距离 最小的候选虚拟对象作为所述目标虚拟对象; 或,
根据所述多个候选虚拟对象与所述瞄准点投影在终端屏幕上的位置之间的距离, 选择与 所述瞄准点距离最小的候选虚拟对象作为目标虚拟对象; 或,
根据所述多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和所述瞄准点所在方 向之间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
33、 一种电子设备, 其特征在于, 所述电子设备包括一个或多个处理器和一个或多个存 储器, 所述一个或多个存储器中存储有至少一条指令, 所述至少一条指令由所述一个或多个 处理器加载并执行以实现如下方法:
当检测到虚拟场景从第一显示模式切换至第二显示模式时,获取瞄准点对应的目标区域, 所述第二显示模式为基于瞄具的显示模式, 所述第一显示模式为除所述第二显示模式之外的 显示模式;
当所述目标区域内包括目标虚拟对象时, 根据所述目标虚拟对象的位置和所述瞄准点的 位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度;
基于所述视角的目标转动方向和目标转动角度, 显示目标虚拟场景, 所述目标虚拟场景 中所述瞄准点位于所述目标虚拟对象所在区域。
34、 根据权利要求 33所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
获取所述目标虚拟对象的目标位置;
根据所述瞄准点的位置和所述目标位置, 获取所述虚拟场景的视角的目标转动方向和目 标转动角度。
35、 根据权利要求 34所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
根据所述瞄准点和所述目标虚拟对象投影在终端屏幕上的位置在水平方向上的关系, 获 取所述目标虚拟对象的目标位置; 或,
根据所述瞄准点和所述目标虚拟对象在所述虚拟场景中的水平位置的关系, 获取所述目 标虚拟对象的目标位置; 或,
获取所述目标虚拟对象的目标部位所在位置作为所述目标位置。
36、 根据权利要求 35所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
当所述瞄准点的投影位置在水平方向上的水平位置位于所述目标虚拟对象的投影位置在 水平方向上的水平位置范围内时, 将所述水平位置范围中与所述瞄准点的水平位置相同的位 置作为所述目标虚拟对象的目标位置; 或,
当所述瞄准点的投影位置在水平方向上的水平位置位于所述目标虚拟对象的投影位置在 水平方向上的水平位置范围外时, 将所述瞄准点的水平位置对应的位置或所述目标虚拟对象 的目标部位所在位置作为所述目标虚拟对象的目标位置。
37、 根据权利要求 34所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
将从所述瞄准点出发、 朝向所述目标位置的方向作为所述目标转动方向;
将所述瞄准点所在方向和所述目标位置与所述当前终端控制的虚拟对象的连线方向之间的夹角作为所述目标转动角度。
38、 根据权利要求 33所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
获取以所述瞄准点为中心、 尺寸为预设尺寸的目标区域。
39、 根据权利要求 33所述的电子设备, 其特征在于, 所述至少一条指令还由所述一个或 多个处理器加载并执行以实现:
当所述目标区域内包括多个候选虚拟对象时, 从所述多个候选虚拟对象中, 选择一个候 选虚拟对象作为所述目标虚拟对象;
基于选中的所述目标虚拟对象, 执行所述根据所述目标虚拟对象的位置和所述瞄准点的 位置, 获取所述虚拟场景的视角的目标转动方向和目标转动角度的步骤。
40、 根据权利要求 39所述的电子设备, 其特征在于, 所述至少一条指令由所述一个或多 个处理器加载并执行以实现:
从所述多个候选虚拟对象中, 随机选择一个候选虚拟对象作为目标虚拟对象; 或, 根据所述多个候选虚拟对象与所述瞄准点在虚拟场景中的距离, 选择与所述瞄准点距离 最小的候选虚拟对象作为所述目标虚拟对象; 或,
根据所述多个候选虚拟对象与所述瞄准点投影在终端屏幕上的位置之间的距离, 选择与 所述瞄准点距离最小的候选虚拟对象作为目标虚拟对象; 或,
根据所述多个候选虚拟对象与当前终端控制的虚拟对象的连线方向和所述瞄准点所在方 向之间的夹角, 选择夹角最小的候选虚拟对象作为目标虚拟对象。
41、 一种计算机可读存储介质, 其特征在于, 所述计算机可读存储介质中存储有至少一 条指令,所述指令由处理器加载并执行以实现如权利要求 1至权利要求 12任一项所述的虚拟 场景显示方法所执行的操作; 或如权利要求 13至权利要求 20所述的虚拟场景显示方法所执 行的操作。
PCT/CN2020/072853 2019-02-26 2020-01-17 虚拟场景显示方法、电子设备及存储介质 WO2020173256A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2021517776A JP7299312B2 (ja) 2019-02-26 2020-01-17 仮想シーン表示方法、電子デバイス及びコンピュータプログラム
SG11202104916WA SG11202104916WA (en) 2019-02-26 2020-01-17 Virtual scene display method, electronic device and storage medium
KR1020217013177A KR102565710B1 (ko) 2019-02-26 2020-01-17 가상 장면 디스플레이 방법, 전자 장치 및 저장 매체
US17/244,446 US11883751B2 (en) 2019-02-26 2021-04-29 Virtual scene display method, electronic device, and storage medium
US18/523,979 US20240091654A1 (en) 2019-02-26 2023-11-30 Display mode in virtual scene

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910143280.4A CN109847336B (zh) 2019-02-26 2019-02-26 虚拟场景显示方法、装置、电子设备及存储介质
CN201910143280.4 2019-02-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/244,446 Continuation US11883751B2 (en) 2019-02-26 2021-04-29 Virtual scene display method, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
WO2020173256A1

Family

ID=66899083

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/072853 WO2020173256A1 (zh) 2019-02-26 2020-01-17 虚拟场景显示方法、电子设备及存储介质

Country Status (6)

Country Link
US (2) US11883751B2 (zh)
JP (1) JP7299312B2 (zh)
KR (1) KR102565710B1 (zh)
CN (1) CN109847336B (zh)
SG (1) SG11202104916WA (zh)
WO (1) WO2020173256A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024164471A1 (zh) * 2023-02-08 2024-08-15 网易(杭州)网络有限公司 瞄准方法、装置、计算机设备及存储介质

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109847336B (zh) 2019-02-26 2021-08-06 腾讯科技(深圳)有限公司 虚拟场景显示方法、装置、电子设备及存储介质
CN110300266B (zh) * 2019-07-04 2021-04-02 珠海西山居移动游戏科技有限公司 一种镜头移动方法及系统、一种计算设备及存储介质
CN110732135B (zh) * 2019-10-18 2022-03-08 腾讯科技(深圳)有限公司 虚拟场景显示方法、装置、电子设备及存储介质
CN110841276B (zh) * 2019-10-31 2021-05-14 腾讯科技(深圳)有限公司 虚拟道具的控制方法和装置、存储介质及电子装置
CN111061362A (zh) * 2019-11-21 2020-04-24 珠海剑心互动娱乐有限公司 自适应视角方法、装置、计算设备和存储介质
CN111097170B (zh) * 2019-12-11 2022-11-22 腾讯科技(深圳)有限公司 吸附框的调整方法和装置、存储介质及电子装置
CN111111168B (zh) * 2019-12-16 2021-03-26 腾讯科技(深圳)有限公司 虚拟道具的控制方法和装置、存储介质及电子装置
CN111265858B (zh) * 2020-01-15 2022-04-12 腾讯科技(深圳)有限公司 操作控制方法、装置、存储介质及电子装置
CN111672119B (zh) 2020-06-05 2023-03-10 腾讯科技(深圳)有限公司 瞄准虚拟对象的方法、装置、设备及介质
CN111714887B (zh) * 2020-06-24 2024-01-30 网易(杭州)网络有限公司 游戏视角调整方法、装置、设备及存储介质
CN112764654B (zh) * 2021-01-29 2022-10-25 北京达佳互联信息技术有限公司 组件的吸附操作方法、装置、终端及存储介质
CN113440853B (zh) * 2021-07-08 2022-07-29 腾讯科技(深圳)有限公司 虚拟场景中虚拟技能的控制方法、装置、设备及存储介质
CN113835521B (zh) * 2021-09-02 2022-11-25 北京城市网邻信息技术有限公司 场景视角的切换方法、装置、电子设备及可读介质
JP7406274B1 (ja) 2022-10-18 2023-12-27 亮太 高尾 ゲームプログラム、ゲーム方法及びゲーム装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180221773A1 (en) * 2014-08-29 2018-08-09 Gree, Inc. Shooting game program with aiming based on a plurality of position inputs
CN108415639A (zh) * 2018-02-09 2018-08-17 腾讯科技(深圳)有限公司 视角调整方法、装置、电子装置及计算机可读存储介质
CN109224439A (zh) * 2018-10-22 2019-01-18 网易(杭州)网络有限公司 游戏瞄准的方法及装置、存储介质、电子装置
CN109847336A (zh) * 2019-02-26 2019-06-07 腾讯科技(深圳)有限公司 虚拟场景显示方法、装置、电子设备及存储介质

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3345600B2 (ja) * 2000-04-10 2002-11-18 コナミ株式会社 ゲームシステムおよびコンピュータ読取可能な記憶媒体
US6852032B2 (en) * 2000-12-06 2005-02-08 Nikon Corporation Game machine, method of performing game and computer-readable medium
JP2006122123A (ja) * 2004-10-26 2006-05-18 Game Republic:Kk ゲーム装置及びプログラム
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
JP2008093307A (ja) 2006-10-13 2008-04-24 Sega Corp 電子遊戯装置、電子遊戯用制御方法およびゲームプログラム
US8777708B2 (en) * 2008-06-27 2014-07-15 Microsoft Corporation Targeting control in a simulated environment
JP5622447B2 (ja) * 2010-06-11 2014-11-12 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム及び情報処理方法
JP5800473B2 (ja) * 2010-06-11 2015-10-28 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム及び情報処理方法
US9561436B2 (en) * 2013-02-26 2017-02-07 Gree, Inc. Shooting game control method and game system
WO2014163220A1 (ko) * 2013-04-05 2014-10-09 그리 가부시키가이샤 온라인 슈팅 게임 제공 장치 및 방법
US20160158641A1 (en) * 2013-04-25 2016-06-09 Phillip B. Summons Targeting system and method for video games
US11465040B2 (en) * 2013-12-11 2022-10-11 Activision Publishing, Inc. System and method for playing video games on touchscreen-based devices
US20150231509A1 (en) * 2014-02-17 2015-08-20 Xaviant, LLC (a GA Limited Liability Company) System, Method, and Apparatus for Smart Targeting
US9004997B1 (en) * 2014-03-12 2015-04-14 Wargaming.Net Llp Providing enhanced game mechanics
US9764226B2 (en) * 2014-03-12 2017-09-19 Wargaming.Net Limited Providing enhanced game mechanics
US20150375110A1 (en) * 2014-06-30 2015-12-31 AlternativaPlatform Ltd. Systems and methods for shooting in online action games using automatic weapon aiming
CN105148520A (zh) * 2015-08-28 2015-12-16 上海甲游网络科技有限公司 一种射击游戏的自动瞄准的方法及装置
CN105597325B (zh) * 2015-10-30 2018-07-06 广州银汉科技有限公司 协助瞄准的方法与系统
CN107661630A (zh) * 2017-08-28 2018-02-06 网易(杭州)网络有限公司 一种射击游戏的控制方法及装置、存储介质、处理器、终端
CN107913515B (zh) * 2017-10-25 2019-01-08 网易(杭州)网络有限公司 信息处理方法及装置、存储介质、电子设备
CN108211342A (zh) * 2018-01-19 2018-06-29 腾讯科技(深圳)有限公司 视角调整方法和装置、存储介质及电子装置
CN108310764B (zh) * 2018-02-09 2022-01-11 鲸彩在线科技(大连)有限公司 辅助定位方法、装置及设备


Also Published As

Publication number Publication date
JP2022501147A (ja) 2022-01-06
KR20210068535A (ko) 2021-06-09
US20240091654A1 (en) 2024-03-21
SG11202104916WA (en) 2021-06-29
US20210245062A1 (en) 2021-08-12
CN109847336A (zh) 2019-06-07
CN109847336B (zh) 2021-08-06
JP7299312B2 (ja) 2023-06-27
US11883751B2 (en) 2024-01-30
KR102565710B1 (ko) 2023-08-09

Similar Documents

Publication Publication Date Title
WO2020173256A1 (zh) 虚拟场景显示方法、电子设备及存储介质
US11977713B2 (en) Viewing angle adjustment method and device, electronic device, and computer-readable storage medium
US11219828B2 (en) Virtual scene display method, electronic apparatus, and storage medium
US11305190B2 (en) Location indication information display method, electronic apparatus, and storage medium
JP7166708B2 (ja) 仮想オブジェクト制御方法、装置、電子機器、及び記憶媒体
JP5597837B2 (ja) プログラム、情報記憶媒体、及び、画像生成装置
AU2020256524A1 (en) Operation control method and apparatus, and electronic device and storage medium
CN111414080B (zh) 虚拟对象的位置显示方法、装置、设备及存储介质
JP2021515331A (ja) インタフェース表示方法、装置、電子機器及びコンピュータプログラム
CN111399639B (zh) 虚拟环境中运动状态的控制方法、装置、设备及可读介质
JP2022525172A (ja) 仮想オブジェクトの制御方法、装置、コンピュータ機器及びプログラム
CN111481934B (zh) 虚拟环境画面的显示方法、装置、设备及存储介质
CN110613938A (zh) 控制虚拟对象使用虚拟道具的方法、终端及存储介质
WO2021031765A1 (zh) 虚拟环境中瞄准镜的应用方法和相关装置
CN111330278B (zh) 基于虚拟环境的动画播放方法、装置、设备及介质
US20230364502A1 (en) Method and apparatus for controlling front sight in virtual scenario, electronic device, and storage medium
CN112451969A (zh) 虚拟对象控制方法、装置、计算机设备及存储介质
CN111589134A (zh) 虚拟环境画面的显示方法、装置、设备及存储介质
KR20090107463A (ko) 실 방향 동적 사격 훈련 시스템
WO2024051422A1 (zh) 虚拟道具的显示方法、装置、设备、介质和程序产品
RU2787649C1 (ru) Способ и устройство для управления операциями, электронное устройство и носитель данных
US20240342604A1 (en) Control method and apparatus for virtual object, storage medium, and electronic device
CN118662899A (zh) 游戏中虚拟镜头的控制方法、装置、电子设备及存储介质
CN116966567A (zh) 一种射击游戏的控制方法、装置、电子设备和存储介质
CN118662897A (zh) 一种游戏交互方法、装置、电子设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20763482

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021517776

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217013177

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20763482

Country of ref document: EP

Kind code of ref document: A1