WO2021031765A1 - Method for applying a scope in a virtual environment and related apparatus - Google Patents

Method for applying a scope in a virtual environment and related apparatus

Info

Publication number
WO2021031765A1
Authority
WO
WIPO (PCT)
Prior art keywords: scope, control, touch, virtual, environment
Application number: PCT/CN2020/103000
Other languages: English (en), French (fr)
Inventor: 刘智洪
Original Assignee: 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Application filed by 腾讯科技(深圳)有限公司
Priority to KR1020217032917A (publication KR20210135594A)
Priority to SG11202109672U (publication SG11202109672UA)
Priority to JP2021554634A (publication JP2022524802A)
Publication of WO2021031765A1
Priority to US17/504,081 (publication US12121798B2)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements characterised by their sensors, purposes or types
    • A63F 13/214: Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145: Input arrangements where the surface is also a display device, e.g. touch screens
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426: Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals involving aspects of the displayed game scene
    • A63F 13/525: Changing parameters of virtual cameras
    • A63F 13/5252: Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: Shooting of targets

Definitions

  • the embodiments of the present application relate to the field of virtual environments, and particularly to the application of a scope in a virtual environment.
  • In the related art, the scope control can be used to switch between the personal-perspective observation mode and the scope-accessory observation mode: when the virtual environment is observed in the personal-perspective mode, clicking the scope control switches to observing the virtual environment through the scope accessory, and when the virtual environment is observed through the scope, clicking the scope control again switches back to the personal-perspective mode.
  • The embodiments of the present application provide a method for applying a scope in a virtual environment and a related apparatus, which can solve the problem of inefficient and inaccurate observation of the virtual environment.
  • the technical solution is as follows:
  • In one aspect, a method for applying a scope in a virtual environment is provided, the method including:
  • displaying a first environment interface, the first environment interface including a personal perspective screen for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further including a scope control control;
  • receiving a touch operation on the scope control control and generating a touch signal;
  • displaying a second environment interface according to the touch signal, the second environment interface including a scope screen for observing the virtual environment through the scope accessory; and
  • restoring the display of the personal perspective screen when a touch end signal of the touch operation is received.
  • In another aspect, an apparatus for applying a scope in a virtual environment is provided, the apparatus including:
  • a display module, configured to display a first environment interface, the first environment interface including a personal perspective screen for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further including a scope control control;
  • a receiving module, configured to receive a touch operation on the scope control control and generate a touch signal;
  • the display module is further configured to display a second environment interface according to the touch signal, and the second environment interface includes a scope screen for observing the virtual environment through the scope accessory;
  • the receiving module is further configured to restore the display of the person perspective picture when the touch end signal of the touch operation is received.
  • In another aspect, a computer device is provided, including a processor and a memory.
  • The memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of the present application.
  • In another aspect, a storage medium is provided, the storage medium being used to store a computer program, and the computer program being used to execute the method for applying a scope in a virtual environment provided in the foregoing embodiments of the present application.
  • In another aspect, a computer program product is provided. When the computer program product runs on a computer, the computer is caused to execute the method for applying a scope in a virtual environment provided in the foregoing embodiments of the present application.
  • Figure 1 is a schematic diagram of an interface provided in the related art for applying a scope accessory in a virtual environment
  • Fig. 2 is a schematic diagram of an interface for applying a scope accessory in a virtual environment provided by an exemplary embodiment of the present application
  • Fig. 3 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Fig. 4 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application.
  • Fig. 5 is a flowchart of a method for applying a scope in a virtual environment provided by an exemplary embodiment of the present application
  • FIG. 6 is a schematic diagram showing a display of a scope accessory provided based on the embodiment shown in FIG. 5;
  • FIG. 7 is a schematic diagram of determining the corresponding relationship between the touch operation and the scope control control provided by the embodiment shown in FIG. 5;
  • FIG. 8 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application.
  • FIG. 9 is a schematic diagram of an interface corresponding to the hybrid control method provided based on the embodiment shown in FIG. 8;
  • FIG. 10 is a schematic diagram of a time axis corresponding to the hybrid control method provided based on the embodiment shown in FIG. 8;
  • Fig. 11 is a flowchart corresponding to a click control method provided by an exemplary embodiment of the present application.
  • FIG. 12 is a flowchart corresponding to a long press control mode provided by an exemplary embodiment of the present application.
  • FIG. 13 is a flowchart corresponding to a hybrid control method provided by an exemplary embodiment of the present application.
  • FIG. 14 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application.
  • Fig. 15 is a structural block diagram of an application device for a scope in a virtual environment provided by an exemplary embodiment of the present application;
  • FIG. 16 is a structural block diagram of a scope application device in a virtual environment provided by another exemplary embodiment of the present application.
  • Fig. 17 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Virtual environment: the virtual environment displayed (or provided) when an application is run on the terminal.
  • the virtual environment may be a simulation environment of the real world, a semi-simulation and semi-fictional three-dimensional environment, or a purely fictitious three-dimensional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the following embodiments take the virtual environment as a three-dimensional virtual environment as an example, but are not limited thereto.
  • the virtual environment is also used for a virtual environment battle between at least two virtual characters.
  • the virtual environment is also used for battles between at least two virtual characters using virtual firearms.
  • the virtual environment is also used to use virtual firearms for battle between at least two virtual characters within the target area, and the target area will continue to decrease with the passage of time in the virtual environment.
  • Virtual object refers to the movable object in the virtual environment.
  • the movable objects may be virtual characters, virtual animals, cartoon characters, etc., such as: characters, animals, plants, oil barrels, walls, stones, etc. displayed in a three-dimensional virtual environment.
  • the virtual object is a three-dimensional model created based on animation skeleton technology.
  • Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
  • Virtual firearms refer to virtual weapons that attack by firing bullets in a virtual environment.
  • the virtual objects can pick up virtual firearms in the virtual environment and attack through the virtual firearms obtained.
  • Optionally, each virtual firearm can be provided with at least one slot, and each slot is used to assemble at least one firearm accessory. For example, an M416 automatic rifle is usually provided with a muzzle slot, a grip slot, a magazine slot, a buttstock slot, and a scope slot, where the scope slot can be equipped with a red dot scope, a holographic scope, a 2x scope (abbreviated: 2x lens), a 4x scope (abbreviated: 4x lens), and so on. When the player holds the M416 automatic rifle and opens the scope, the virtual environment can be observed with a degree of magnification that depends on the scope assembled on the firearm.
  • In the embodiments of the present application, the virtual prop being implemented as a virtual firearm is taken as an example for description.
  • the virtual prop is equipped with a scope accessory, that is, the scope slot of the virtual firearm is equipped with a scope accessory.
  • Optionally, the scope accessory may be a mechanical scope provided on the virtual firearm itself, or a scope picked up by the virtual object in the virtual environment.
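  • How much a given scope magnifies the view is a rendering choice the patent leaves open. Below is a minimal sketch of one common approach, in which an n-x scope narrows the camera model's field of view; the base FOV value and the function name are assumptions for illustration, not taken from the patent:

```kotlin
import kotlin.math.atan
import kotlin.math.tan

const val BASE_FOV_DEG = 90.0 // hypothetical default camera field of view

// Common approximation: magnification m scales the tangent of the half-FOV,
// so a 1x basic scope leaves the view unchanged and higher magnifications
// narrow it, producing the zoomed-in scope screen.
fun scopedFovDeg(magnification: Double): Double {
    val halfRad = Math.toRadians(BASE_FOV_DEG / 2.0)
    return Math.toDegrees(2.0 * atan(tan(halfRad) / magnification))
}

fun main() {
    for (m in listOf(1.0, 2.0, 4.0, 8.0)) {
        println("%.0fx scope: field of view %.1f deg".format(m, scopedFovDeg(m)))
    }
}
```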
  • The method provided in this application can be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooting (FPS) games, multiplayer online battle arena (MOBA) games, and the like; the following embodiments take application in games as an example.
  • Games based on virtual environments are often composed of maps of one or more game worlds. The virtual environment in the game simulates scenes of the real world, and a user can control a virtual object in the game to perform actions in the virtual environment such as walking, running, jumping, shooting, fighting, driving, switching virtual weapons, and using a virtual weapon to attack other virtual objects; the interactivity is strong, and multiple users can form teams for online competitive play.
  • When the user controls the virtual object to attack a target virtual object with a virtual weapon, the user selects a suitable virtual weapon according to the position of the target virtual object or the user's operating habits.
  • In the related art, a scope control is usually provided in the environment interface.
  • As shown in FIG. 1, the first environment interface displays the first-person perspective of the virtual object.
  • The virtual object holds a virtual prop (such as a virtual firearm), the virtual prop is equipped with a scope accessory, and a scope control 110 is superimposed on the personal perspective screen 100.
  • When the user clicks the scope control 110, the scope screen 120 is displayed.
  • The scope screen 120 is a screen for observing the virtual environment through the scope accessory.
  • The above-mentioned scope control 110 is also superimposed on the scope screen 120.
  • After the scope accessory is turned on, the scope control 110 is displayed as the cross shape shown in FIG. 1. When the user clicks the scope control 110 again, the scope accessory is closed and the first-person perspective is restored, displaying the personal perspective screen 100.
  • FIG. 2 shows a schematic diagram of a user interface of the method for applying a scope in a virtual environment provided by an embodiment of the present application, as shown in FIG. 2:
  • a personal perspective screen 200 is displayed in which a virtual object observes the virtual environment from a first-person perspective.
  • the virtual object holds a virtual prop (such as a virtual firearm), and the virtual prop is equipped with a scope accessory.
  • a scope control control 210 is superimposed and displayed on the personal view screen 200.
  • When the user performs a touch operation on the scope control control 210, the scope screen 220 is displayed.
  • the scope screen 220 is a screen for observing the virtual environment through the scope accessory.
  • While the user continues to touch the scope control control 210, the terminal continues to display the scope screen 220; when the user ends the touch on the scope control control 210, the terminal closes the scope accessory and resumes displaying the personal perspective screen 200.
  • It is worth noting that the content of the personal perspective screen 200 may have changed, for example because of the movement of the virtual object or the displacement of objects in the virtual scene, or it may be unchanged.
  • Optionally, the terminal in this application may be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, or the like.
  • An application program supporting a virtual environment is installed and running in the terminal, for example, an application program supporting a three-dimensional virtual environment.
  • the application program can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, and MOBA games.
  • the application program may be a stand-alone version application program, such as a stand-alone version of a 3D game program, or a network online version application program.
  • Fig. 3 shows a structural block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • the electronic device 300 includes an operating system 320 and an application program 322.
  • the operating system 320 is basic software that provides the application 322 with secure access to computer hardware.
  • the application program 322 is an application program supporting a virtual environment.
  • the application program 322 is an application program supporting a three-dimensional virtual environment.
  • The application program 322 may be any of a virtual reality application program, a three-dimensional map program, a military simulation program, a third-person shooting (TPS) game, an FPS game, a MOBA game, and a multiplayer gun battle survival game.
  • The application program 322 may be a stand-alone version application program, such as a stand-alone version 3D game program.
  • Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 400 includes: a first device 420, a server 440, and a second device 460.
  • the first device 420 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, MOBA games, and multiplayer gun battle survival games.
  • the first device 420 is a device used by the first user.
  • The first user uses the first device 420 to control a first virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated character or an animation character.
  • the first device 420 is connected to the server 440 through a wireless network or a wired network.
  • the server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 440 is used to provide background services for applications supporting the three-dimensional virtual environment.
  • the server 440 is responsible for the main calculation work, and the first device 420 and the second device 460 are responsible for the secondary calculation work; or, the server 440 is responsible for the secondary calculation work, and the first device 420 and the second device 460 are responsible for the main calculation work;
  • the server 440, the first device 420, and the second device 460 adopt a distributed computing architecture to perform collaborative computing.
  • the second device 460 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, and multiplayer gun battle survival games.
  • the second device 460 is a device used by the second user.
  • The second user uses the second device 460 to control a second virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an animation character.
  • Optionally, the first virtual character and the second virtual character are in the same virtual environment.
  • Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permissions.
  • Optionally, the first virtual character and the second virtual character may also belong to different teams, different organizations, or two groups that are hostile to each other.
  • the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of applications on different control system platforms.
  • the first device 420 may generally refer to one of multiple devices
  • the second device 460 may generally refer to one of multiple devices
  • the first device 420 and the second device 460 may be the aforementioned terminals.
  • This embodiment only uses the first device 420 and the second device 460 as examples.
  • the device types of the first device 420 and the second device 460 are the same or different.
  • The device types include at least one of a game console, a desktop computer, a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer.
  • The following embodiments take the device being a desktop computer as an example.
  • the number of the above-mentioned devices can be more or less. For example, there may be only one device, or there may be dozens or hundreds of devices, or more.
  • the embodiment of the present application does not limit the quantity and type of equipment.
  • Referring to FIG. 5, the method includes:
  • Step 501 Display a first environment interface.
  • the first environment interface includes a personal perspective screen for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, and the virtual prop is equipped with a scope accessory.
  • the first environment interface also includes scope control controls.
  • the person perspective picture may be a picture displayed in the first person perspective of the virtual object, or may be a picture displayed in the third person perspective of the virtual object.
  • Optionally, the first-person perspective is the perspective corresponding to the screen that the virtual object can observe in the virtual environment, and the screen corresponding to the first-person perspective does not include the virtual object itself; for example, only the arms of the virtual object and the virtual firearm it holds can be seen.
  • The third-person perspective refers to the perspective of observing the virtual object through a camera model in the virtual environment.
  • The screen corresponding to the third-person perspective includes the virtual object itself, and usually the camera model is located behind the virtual object to observe it; for example, the three-dimensional model of the virtual object and the virtual prop (such as a virtual firearm) held by the virtual object can be seen.
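  • As a sketch of the camera-model placement just described, the following hypothetical helper positions a camera behind and slightly above the virtual object; the offset values and names are illustrative assumptions, not details given by the patent:

```kotlin
// A position or direction in the three-dimensional virtual environment.
data class Vec3(val x: Float, val y: Float, val z: Float)

// Places the camera model behind and above the virtual object, so the screen
// corresponding to the third-person perspective includes the object itself.
fun thirdPersonCameraPos(
    objectPos: Vec3,
    forward: Vec3,      // unit vector: the direction the object faces
    back: Float = 3.0f, // hypothetical distance behind the object
    up: Float = 1.5f,   // hypothetical height above the object
): Vec3 = Vec3(
    objectPos.x - forward.x * back,
    objectPos.y + up,
    objectPos.z - forward.z * back,
)

fun main() {
    val obj = Vec3(10f, 0f, 5f)
    val facing = Vec3(0f, 0f, 1f)              // the object looks along +z
    println(thirdPersonCameraPos(obj, facing)) // Vec3(x=10.0, y=1.5, z=2.0)
}
```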
  • FIG. 6 is a schematic diagram of the interface of the personal perspective screen 61 displayed in the first-person perspective of the virtual object.
  • The personal perspective screen 61 includes an item status bar 62, and the item status bar 62 shows "red dot" for the scope accessory on the virtual prop, that is, a red dot scope is assembled.
  • Optionally, when the personal perspective screen is displayed in the third-person perspective of the virtual object, the personal perspective screen further includes the virtual prop, and the assembled scope accessory is displayed on the virtual prop.
  • Step 502 Receive a touch operation on the scope control control, and generate a touch signal.
  • Optionally, the method is applied to a terminal with a touch display screen; a touch operation on the scope control control on the touch display screen is received, and a touch signal is generated, where the touch signal is generated when the touch operation starts and is used to indicate the start of the touch operation.
  • Optionally, a touch operation on the touch display screen is received, a first coordinate of the touch operation on the touch display screen is determined, a second coordinate of the center point of the scope control control on the touch display screen is determined, and when the distance between the first coordinate and the second coordinate is less than a required distance, it is determined that the touch operation is a touch operation acting on the scope control control.
  • Optionally, the touch display screen corresponds to a target coordinate system, in which the first coordinate corresponding to the touch operation and the second coordinate corresponding to the center point of the scope control control are determined.
  • Taking point A as the touch operation point and point B as the center point of the scope control control as an example, with the coordinates of point A being (x1, y1) and the coordinates of point B being (x2, y2), the distance d between point A and point B is calculated as shown in Formula 1:
  • Formula 1: d = √((x1 − x2)² + (y1 − y2)²)
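  • As an illustration of this hit test, here is a minimal sketch applying Formula 1; the function name, the point type, and the sample radius are hypothetical, not from the patent:

```kotlin
import kotlin.math.sqrt

// A point in the touch display screen's target coordinate system.
data class ScreenPoint(val x: Float, val y: Float)

// Returns true when the distance d = sqrt((x1 - x2)^2 + (y1 - y2)^2) between
// the touch operation point A and the control's center point B is less than
// the required distance, i.e. the touch acts on the scope control control.
fun isTouchOnScopeControl(
    a: ScreenPoint, // point A: the touch operation
    b: ScreenPoint, // point B: center of the scope control control
    requiredDistance: Float,
): Boolean {
    val dx = a.x - b.x
    val dy = a.y - b.y
    return sqrt(dx * dx + dy * dy) < requiredDistance
}

fun main() {
    val center = ScreenPoint(1080f, 540f)                                 // hypothetical control center
    println(isTouchOnScopeControl(ScreenPoint(1090f, 548f), center, 60f)) // true: within the control
    println(isTouchOnScopeControl(ScreenPoint(200f, 100f), center, 60f))  // false: misses the control
}
```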
  • Step 503 Display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory.
  • Optionally, the scope accessory assembled on the virtual prop can be at least one of a mechanical scope, a basic scope, a low-magnification scope, and a high-magnification scope.
  • The mechanical scope refers to the scope provided on the virtual prop itself; that is, when no other scope is assembled in the scope slot of the virtual prop, the mechanical scope can be used to observe the virtual environment.
  • The basic scope refers to a scope accessory for observing the virtual environment without magnification, or in other words with a 1x magnification effect; optionally, the basic scope includes at least one of a red dot scope and a holographic scope.
  • A low-magnification scope refers to a scope accessory with a lower magnification for observing the virtual environment, such as a 2x lens or a 3x lens.
  • A high-magnification scope refers to a scope accessory with a higher magnification for observing the virtual environment, such as a 4x lens, a 6x lens, an 8x lens, or a 15x lens.
  • Step 504 When the touch end signal of the touch operation is received, resume displaying the personal view screen.
  • Optionally, the user touching the scope control control on the touch display screen constitutes the touch operation on the scope control control; when the user releases the touch on the scope control control, the terminal generates the touch end signal.
  • In summary, in the method for applying a scope in a virtual environment provided by the embodiments of the present application, a scope control control is provided, and switching between the personal perspective screen, which is not observed through the scope accessory, and the scope screen, which is observed through the scope accessory, is performed according to whether the touch operation on the scope control control continues. Only a single touch operation is required to switch between the personal perspective screen and the scope screen, which improves the opening efficiency of the scope accessory and thereby improves the efficiency and accuracy of observing the virtual environment.
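  • A minimal sketch of this hold-to-aim behavior under an assumed event model follows; the TouchEvent type and the two display calls stand in for the terminal's real UI layer and are not APIs named by the patent:

```kotlin
// Hypothetical touch events from the touch display screen.
sealed interface TouchEvent
object TouchDown : TouchEvent // the touch signal: the touch operation starts
object TouchUp : TouchEvent   // the touch end signal: the touch operation ends

class ScopeController {
    var scopeOpen = false
        private set

    // Handles a touch event that acts on the scope control control.
    fun onScopeControlEvent(event: TouchEvent) {
        when (event) {
            TouchDown -> {
                scopeOpen = true
                println("display second environment interface (scope screen)")
            }
            TouchUp -> {
                scopeOpen = false
                println("resume displaying the personal perspective screen")
            }
        }
    }
}

fun main() {
    val controller = ScopeController()
    controller.onScopeControlEvent(TouchDown) // scope screen shown while held
    controller.onScopeControlEvent(TouchUp)   // personal perspective restored
}
```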
  • FIG. 8 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application, illustrated by taking the method being applied to a terminal as an example. As shown in FIG. 8, the method includes:
  • Step 801 Display a first environment interface.
  • the first environment interface includes a personal perspective screen for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, and the virtual prop is equipped with a scope accessory.
  • the first environment interface also includes scope control controls.
  • the person perspective picture is a picture displayed in the first person perspective of the virtual object, or may be a picture displayed in the third person perspective of the virtual object.
  • Step 802 Receive a touch operation on the scope control control, and generate a touch signal.
  • the method is applied to a terminal with a touch display screen to receive a touch operation on the sight control control on the touch display screen.
  • Optionally, the scope control mode can also be set in the function setting interface, where the scope control mode includes any one of a click control mode, a long-press control mode, and a hybrid control mode.
  • The click control mode refers to opening the scope accessory by clicking the scope control control and closing the scope accessory by clicking the scope control control again;
  • the long-press control mode refers to opening the scope accessory by long-pressing the scope control control and closing the scope accessory when the long-press operation ends;
  • the hybrid control mode refers to a control mode that selects between the click control mode and the long-press control mode, with a required duration as the critical duration.
  • Optionally, the first environment interface further includes a function setting control.
  • The function setting control is used to open the function setting interface; a trigger operation on the function setting control is received, and the function setting interface is displayed according to the trigger operation.
  • The function setting interface includes scope setting options.
  • The scope setting options include a click control option, a long-press control option, and a hybrid control option, where the click control option corresponds to the above click control mode, the long-press control option corresponds to the above long-press control mode,
  • and the hybrid control option corresponds to the above hybrid control mode. Optionally, an opening operation on the hybrid control option is received, where the hybrid control option is used to indicate that the closing manner of the scope accessory is determined with the required duration as the critical duration.
  • Step 803 Display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory.
  • the scope accessory assembled on the virtual prop may be at least one of a mechanical scope, a basic scope, a low-power scope, and a high-power scope.
  • Step 804 When the touch end signal of the touch operation is received, determine the touch duration of the touch operation.
  • step 805 when the touch duration reaches the required duration, the scope accessory is turned off according to the touch end signal, and the display of the personal perspective screen is restored.
  • That is, when the touch duration reaches the required duration, the long-press control mode is determined as the control mode of the scope accessory, and when the touch operation on the scope control control ends, the scope accessory is closed.
  • Optionally, the required duration is a preset duration, for example, 1 second.
  • In this case, the touch end signal is determined as the signal for controlling the scope accessory to close.
  • Step 806 When the touch duration does not reach the required duration, the click signal on the scope control control is determined as a signal for controlling the closing of the scope accessory.
  • the touch operation may be regarded as a click operation.
  • Step 807 When the click signal on the scope control control is received, the scope accessory is closed, and the display of the personal perspective screen is restored.
  • Referring to FIG. 9, a personal perspective screen 900 for observing the virtual environment from the first-person perspective of the virtual object is displayed, and a scope control control 910 is superimposed on the personal perspective screen 900.
  • In the first branch, the user performs a touch operation on the scope control control 910, the scope screen 920 is displayed according to the touch operation, and the touch duration of the touch operation reaches the required duration; when the touch operation ends, the display of the personal perspective screen 900 is resumed. In the second branch, the user performs a touch operation on the scope control control 910, the scope screen 920 is displayed according to the touch operation, and the touch operation ends before the touch duration reaches the required duration;
  • when a click operation on the scope control control 910 is received again, the display of the personal perspective screen 900 is resumed.
  • The content of the personal perspective screen 900 may be changed or unchanged.
  • Referring to FIG. 10, time point S is the moment when the user touches the scope control control, time point A is the moment when the first user ends the touch, time point B is the moment when the second user ends the touch, and time point O is the critical moment corresponding to the required duration.
  • In summary, in the method for applying a scope in a virtual environment provided by this embodiment, a scope control control is provided, and the touch signal and the touch end signal are determined according to whether the touch operation on the scope control control continues, so that switching between the personal perspective screen, which is not observed through the scope accessory, and the scope screen, which is observed through the scope accessory, requires only a single touch operation. This improves the opening efficiency of the scope accessory, thereby improving the efficiency and accuracy of observing the virtual environment.
  • In this embodiment, the required duration is used as the critical duration between the click control mode and the long-press control mode, so that the user can determine the control mode of the scope accessory through a single touch operation, which improves control efficiency.
  • Referring to FIG. 11, in step 1101, the scope accessory is in a closed state.
  • the environment interface displayed by the terminal includes a screen for observing the virtual environment from the first person perspective or the third person perspective of the virtual object.
  • Step 1102 Determine whether the user touches the scope control control.
  • Step 1103 When the user touches the scope control control, the scope accessory is turned on.
  • Step 1104 the user ends the touch on the scope control control.
  • Step 1105 Keep the scope in the opened state.
  • Step 1106 Determine whether the user clicks the scope control control again.
  • Step 1107 When the user clicks the scope control control again, the scope accessory is closed.
  • Referring to FIG. 12, in step 1201, the scope accessory is in a closed state.
  • the environment interface displayed by the terminal includes a screen for observing the virtual environment from the first person perspective or the third person perspective of the virtual object.
  • Step 1202 Determine whether the user touches the scope control control.
  • Step 1203 When the user touches the scope control control, the scope accessory is turned on.
  • Step 1204 Determine whether the user ends the touch operation on the scope control control.
  • Step 1205 When the user ends the touch operation on the scope control control, close the scope accessory.
  • Step 1206 When the user does not end the touch operation on the scope control control, keep the scope accessory turned on.
  • Referring to FIG. 13, in step 1301, the hybrid control mode is selected.
  • Step 1302 Determine whether the user touches the scope control control.
  • Step 1304 Record the touch duration of the touch operation.
  • Step 1305 Determine whether the user has finished touching the scope control control.
  • Step 1306 When the user finishes touching the scope control control, it is determined whether the long-press control mode is triggered.
  • Step 1307 when the long-press control mode is triggered, the scope accessory is closed.
  • Step 1309 Determine whether a click operation on the scope control control is received again. Optionally, when a click operation on the scope control control is received again, the scope accessory is closed.
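  • The three flowcharts above can be condensed into one sketch. The following hypothetical state machine (the names and the 1-second default follow the embodiments only loosely) uses the required duration as the critical duration: in hybrid mode, releasing after the threshold closes the scope as in long-press mode, while releasing before it leaves the scope open until a second click, as in click mode:

```kotlin
enum class ControlMode { CLICK, LONG_PRESS, HYBRID }

class ScopeStateMachine(
    private val mode: ControlMode,
    private val requiredDurationMs: Long = 1000L, // e.g. a required duration of 1 second
) {
    var scopeOpen = false
        private set
    private var touchStartMs = 0L

    fun onTouchStart(nowMs: Long) {
        if (scopeOpen) {
            // Click behavior: a second click closes the scope accessory
            // (also reached in hybrid mode after a quick tap).
            if (mode != ControlMode.LONG_PRESS) scopeOpen = false
        } else {
            scopeOpen = true // touching the scope control control opens the scope
            touchStartMs = nowMs
        }
    }

    fun onTouchEnd(nowMs: Long) {
        if (!scopeOpen) return
        val touchDuration = nowMs - touchStartMs
        when (mode) {
            ControlMode.CLICK -> Unit              // stays open until the next click
            ControlMode.LONG_PRESS -> scopeOpen = false
            ControlMode.HYBRID ->                  // long-press behavior only past the threshold
                if (touchDuration >= requiredDurationMs) scopeOpen = false
        }
    }
}

fun main() {
    val m = ScopeStateMachine(ControlMode.HYBRID)
    m.onTouchStart(0); m.onTouchEnd(1500)    // held past the threshold: closes on release
    println(m.scopeOpen)                     // false
    m.onTouchStart(2000); m.onTouchEnd(2200) // quick tap: stays open
    println(m.scopeOpen)                     // true
}
```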
  • FIG. 14 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application, described by taking the method being applied in a terminal as an example. As shown in FIG. 14, the method includes:
  • Step 1401 Display a first environment interface.
  • the first environment interface includes a personal perspective screen for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, and the virtual prop is equipped with a scope accessory,
  • the first environment interface also includes scope control controls.
  • the person perspective picture is a picture displayed in the first person perspective of the virtual object, or may be a picture displayed in the third person perspective of the virtual object.
  • Step 1402 Receive a touch operation on the scope control control, and generate a touch signal.
  • the method is applied to a terminal with a touch display screen to receive a touch operation on the sight control control on the touch display screen.
  • Optionally, the scope control mode can also be set in the function setting interface, where the scope control mode includes any one of a click control mode, a long-press control mode, and a hybrid control mode.
  • The click control mode refers to opening the scope accessory by clicking the scope control control and closing the scope accessory by clicking the scope control control again;
  • the long-press control mode refers to opening the scope accessory by long-pressing the scope control control and closing the scope accessory when the long-press operation ends;
  • the hybrid control mode refers to a control mode that selects between the click control mode and the long-press control mode, with a required duration as the critical duration.
  • Optionally, the first environment interface further includes a function setting control.
  • The function setting control is used to open the function setting interface; a trigger operation on the function setting control is received, and the function setting interface is displayed according to the trigger operation.
  • The function setting interface includes scope setting options.
  • The scope setting options include a click control option, a long-press control option, and a hybrid control option, where the click control option corresponds to the above click control mode, the long-press control option corresponds to the above long-press control mode,
  • and the hybrid control option corresponds to the above hybrid control mode.
  • Optionally, an opening operation on the hybrid control option is received, where the hybrid control option is used to indicate that the closing manner of the scope accessory is determined with the required duration as the critical duration.
  • Step 1403 Display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory.
  • the scope accessory assembled on the virtual prop may be at least one of a mechanical scope, a basic scope, a low-power scope, and a high-power scope.
  • Optionally, the scope screen is a screen obtained by observing the virtual environment in a first viewing angle direction through the scope accessory.
  • Step 1404 Receive a drag operation determined according to the touch operation.
  • Optionally, a drag operation continuous with the touch operation is received while the touch operation has not ended.
  • Step 1405 Adjust the first viewing angle direction according to the drag operation to obtain a second viewing angle direction.
  • Optionally, the first viewing angle direction is the viewing angle direction in which the camera model observes the virtual environment through the scope accessory;
  • the camera model is adjusted in the virtual environment according to the drag direction and the drag distance of the drag operation,
  • and the second viewing angle direction is likewise a viewing angle direction in which the camera model observes the virtual environment through the scope accessory.
  • Optionally, the speed at which the viewing angle direction is ordinarily adjusted is a first adjustment speed,
  • and the first viewing angle direction is adjusted at a second adjustment speed according to the above drag operation to obtain the second viewing angle direction, where the second adjustment speed is slower than the first adjustment speed.
  • Step 1406 Display a third environment interface, the third environment interface including a screen for observing the virtual environment in the second viewing angle direction through the scope accessory.
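  • A minimal sketch of steps 1404 to 1406 under an assumed yaw/pitch camera model follows; the adjustment-speed constants and names are illustrative assumptions, not values from the patent. The drag's direction and distance rotate the camera model, and a slower second adjustment speed applies while the scope accessory is open:

```kotlin
// The viewing angle direction of the camera model, in degrees.
data class ViewAngle(val yaw: Float, val pitch: Float)

const val FIRST_ADJUST_SPEED = 0.20f  // degrees per pixel without the scope
const val SECOND_ADJUST_SPEED = 0.08f // slower speed while the scope accessory is open

// Adjusts the first viewing angle direction by the drag operation's direction
// and distance (dx, dy in pixels), returning the second viewing angle direction.
// Pitch is clamped so the camera model cannot flip over.
fun adjustViewAngle(current: ViewAngle, dx: Float, dy: Float, scopeOpen: Boolean): ViewAngle {
    val speed = if (scopeOpen) SECOND_ADJUST_SPEED else FIRST_ADJUST_SPEED
    val yaw = (current.yaw + dx * speed) % 360f
    val pitch = (current.pitch - dy * speed).coerceIn(-89f, 89f)
    return ViewAngle(yaw, pitch)
}

fun main() {
    val first = ViewAngle(yaw = 0f, pitch = 0f)
    // The same 100-pixel drag turns the view less while the scope is open.
    println(adjustViewAngle(first, dx = 100f, dy = 0f, scopeOpen = false)) // yaw = 20.0
    println(adjustViewAngle(first, dx = 100f, dy = 0f, scopeOpen = true))  // yaw = 8.0
}
```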
  • Step 1407 When the touch end signal of the touch operation is received, the scope accessory is turned off, and the display of the person perspective screen is restored.
  • Optionally, the user touching the scope control control on the touch display screen constitutes the touch operation on the scope control control; when the user releases the touch on the scope control control, the terminal generates the touch end signal.
  • In summary, in the method for applying a scope in a virtual environment provided by the embodiments of the present application, a scope control control is provided, and switching between the personal perspective screen, which is not observed through the scope accessory, and the scope screen, which is observed through the scope accessory, is performed according to whether the touch operation on the scope control control continues. Only a single touch operation is required to switch between the personal perspective screen and the scope screen, which improves the opening efficiency of the scope accessory and thereby improves the efficiency and accuracy of observing the virtual environment.
  • In this embodiment, the viewing angle direction for observing the virtual environment is adjusted by receiving the drag operation continuous with the touch operation; that is, opening the scope accessory, adjusting the viewing angle, and closing the scope accessory are all achieved within a single touch event, which improves the application efficiency of the scope accessory.
  • FIG. 15 is a structural block diagram of an application device for a scope in a virtual environment provided by an exemplary embodiment of the present application.
  • The device being configured in a terminal is taken as an example for description.
  • The device includes a display module 1510 and a receiving module 1520.
  • the display module 1510 is configured to display a first environment interface.
  • The first environment interface includes a personal perspective screen for observing the virtual environment from the personal perspective of a virtual object, the virtual object holds a virtual prop, the virtual prop is equipped with a scope accessory, and the first environment interface further includes a scope control control;
  • the receiving module 1520 is configured to receive the touch operation on the scope control control and generate a touch signal;
  • the display module 1510 is further configured to display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory;
  • the receiving module 1520 is further configured to restore the display of the person perspective screen when the touch end signal of the touch operation is received.
  • the receiving module 1520 is further configured to determine the touch duration of the touch operation when the touch end signal of the touch operation is received; when the touch duration reaches the required duration, Turn off the scope accessory according to the touch end signal, and restore the display of the person perspective picture.
  • the device further includes:
  • the determining module 1530 is configured to determine the click signal on the scope control control as a signal for controlling the closing of the scope accessory when the touch time length does not reach the required time length;
  • the receiving module 1520 is further configured to turn off the scope accessory when the click signal on the scope control control is received;
  • the display module 1510 is also used to restore the display of the person perspective picture.
  • the receiving module 1520 is further configured to receive a trigger operation on the function setting control
  • the display module 1510 is further configured to display a function setting interface according to the trigger operation, the function setting interface includes scope setting options, and the scope setting options include hybrid control options;
  • the receiving module 1520 is further configured to receive an opening operation on the hybrid control option, and the hybrid control option is used to instruct the required duration as a critical duration to determine the closing manner of the scope accessory.
  • the scope picture is a picture obtained by observing the virtual environment in a first angle of view through the scope accessory;
  • the receiving module 1520 is further configured to receive a drag operation determined according to the touch operation
  • the device also includes:
  • the adjustment module 1540 is configured to adjust the first viewing angle direction according to the drag operation to obtain a second viewing angle direction;
  • the display module 1510 is also configured to display a third environment interface, the third environment interface including a picture for observing the virtual environment in the second viewing angle direction through the scope accessory.
  • the speed at which the viewing angle direction is adjusted is the first adjustment speed
  • the adjustment module 1540 is further configured to adjust the first viewing angle direction at a second adjustment speed according to the drag operation, wherein the second adjustment speed is slower than the first adjustment speed.
  • the receiving module 1520 is further configured to receive the touch operation on the touch screen
  • the device also includes:
  • The determining module 1530 is configured to determine a first coordinate of the touch operation on the touch display screen, determine a second coordinate of the center point of the scope control control on the touch display screen, and, when the distance between the first coordinate and the second coordinate is less than the required distance, determine that the touch operation is a touch operation acting on the scope control control.
  • In summary, in the apparatus for applying a scope in a virtual environment provided by this embodiment, a scope control control is provided, and the touch signal and the touch end signal are determined according to whether the touch operation on the scope control control continues, so that switching between the personal perspective screen, which is not observed through the scope accessory, and the scope screen, which is observed through the scope accessory, requires only a single touch operation. This improves the opening efficiency of the scope accessory, thereby improving the efficiency and accuracy of observing the virtual environment.
  • The apparatus for applying a scope in a virtual environment provided in the above embodiment is illustrated only by the division of the above functional modules. In actual applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
  • The apparatus for applying a scope in a virtual environment provided by the above embodiment belongs to the same concept as the method for applying a scope in a virtual environment; the specific implementation process is detailed in the method embodiments and will not be repeated here.
  • FIG. 17 shows a structural block diagram of a terminal 1700 provided by an exemplary embodiment of the present application.
  • The terminal 1700 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • the terminal 1700 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal 1700 includes a processor 1701 and a memory 1702.
  • the processor 1701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • The processor 1701 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1701 may also include a main processor and a coprocessor.
  • The main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • In some embodiments, the processor 1701 may be integrated with a GPU (Graphics Processing Unit), and the GPU is used to render and draw the content that needs to be displayed on the display screen.
  • In some embodiments, the processor 1701 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1702 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1702 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1702 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1701 to implement the virtual environment provided by the method embodiment of the present application The application method of the middle sight.
  • the terminal 1700 may optionally further include: a peripheral device interface 1703 and at least one peripheral device.
  • the processor 1701, the memory 1702, and the peripheral device interface 1703 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1703 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1704, a touch display screen 1705, a camera 1706, an audio circuit 1707, a positioning component 1708, and a power supply 1709.
  • the peripheral device interface 1703 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1701 and the memory 1702.
  • In some embodiments, the processor 1701, the memory 1702, and the peripheral device interface 1703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1701, the memory 1702, and the peripheral device interface 1703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • The radio frequency circuit 1704 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1704 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1704 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 1704 can communicate with other terminals through at least one wireless communication protocol.
  • The wireless communication protocol includes but is not limited to: the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 1704 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
  • the display screen 1705 is used to display UI (User Interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1705 also has the ability to collect touch signals on or above the surface of the display screen 1705.
  • the touch signal may be input to the processor 1701 as a control signal for processing.
  • the display screen 1705 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • In some embodiments, there may be one display screen 1705, provided on the front panel of the terminal 1700; in other embodiments, there may be at least two display screens 1705, respectively arranged on different surfaces of the terminal 1700 or adopting a folded design; in still other embodiments, the display screen 1705 may be a flexible display screen arranged on a curved or folded surface of the terminal 1700. Furthermore, the display screen 1705 may even be set in a non-rectangular irregular pattern, that is, a special-shaped screen.
  • The display screen 1705 may be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera assembly 1706 is used to capture images or videos.
  • the camera assembly 1706 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1706 may also include a flash.
  • The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
  • the audio circuit 1707 may include a microphone and a speaker.
  • the microphone is used to collect sound waves from the user and the environment and convert them into electrical signals that are input to the processor 1701 for processing, or input to the radio frequency circuit 1704 to implement voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively arranged at different parts of the terminal 1700.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is used to convert the electrical signal from the processor 1701 or the radio frequency circuit 1704 into sound waves.
  • the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1707 may also include a headphone jack.
  • the positioning component 1708 is used to locate the current geographic position of the terminal 1700 to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1708 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of Europe.
  • the power supply 1709 is used to supply power to various components in the terminal 1700.
  • the power source 1709 may be alternating current, direct current, disposable batteries or rechargeable batteries.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • a wired rechargeable battery is a battery charged through a wired line
  • a wireless rechargeable battery is a battery charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 1700 further includes one or more sensors 1710.
  • the one or more sensors 1710 include, but are not limited to: an acceleration sensor 1711, a gyroscope sensor 1712, a pressure sensor 1713, a fingerprint sensor 1714, an optical sensor 1715, and a proximity sensor 1716.
  • the acceleration sensor 1711 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1700.
  • the acceleration sensor 1711 can be used to detect the components of the gravitational acceleration on three coordinate axes.
  • the processor 1701 may control the touch screen 1705 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1711.
  • the acceleration sensor 1711 can also be used for game or user motion data collection.
  • the gyroscope sensor 1712 can detect the body direction and rotation angle of the terminal 1700, and the gyroscope sensor 1712 can cooperate with the acceleration sensor 1711 to collect the user's 3D actions on the terminal 1700. Based on the data collected by the gyroscope sensor 1712, the processor 1701 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1713 may be disposed on the side frame of the terminal 1700 and/or the lower layer of the touch screen 1705.
  • when the pressure sensor 1713 is arranged on the side frame of the terminal 1700, it can detect the user's holding signal, and the processor 1701 performs left/right-hand recognition or quick operations according to the holding signal collected by the pressure sensor 1713.
  • when the pressure sensor 1713 is arranged at the lower layer of the touch display screen 1705, the processor 1701 controls the operability controls on the UI according to the user's pressure operations on the touch display screen 1705.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1714 is used to collect the user's fingerprint.
  • the processor 1701 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1714, or the fingerprint sensor 1714 identifies the user's identity according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 1701 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1714 may be provided on the front, back, or side of the terminal 1700. When a physical button or a manufacturer logo is provided on the terminal 1700, the fingerprint sensor 1714 can be integrated with the physical button or manufacturer logo.
  • the optical sensor 1715 is used to collect the ambient light intensity.
  • the processor 1701 may control the display brightness of the touch screen 1705 according to the intensity of the ambient light collected by the optical sensor 1715. Specifically, when the ambient light intensity is high, the display brightness of the touch screen 1705 is increased; when the ambient light intensity is low, the display brightness of the touch screen 1705 is decreased.
  • the processor 1701 may also dynamically adjust the shooting parameters of the camera assembly 1706 according to the ambient light intensity collected by the optical sensor 1715.
  • the proximity sensor 1716, also called a distance sensor, is usually arranged on the front panel of the terminal 1700.
  • the proximity sensor 1716 is used to collect the distance between the user and the front of the terminal 1700.
  • when the proximity sensor 1716 detects that the distance between the user and the front of the terminal 1700 gradually decreases, the processor 1701 controls the touch display screen 1705 to switch from the bright-screen state to the rest-screen state; when the proximity sensor 1716 detects that the distance between the user and the front of the terminal 1700 gradually increases, the processor 1701 controls the touch display screen 1705 to switch from the rest-screen state to the bright-screen state.
  • those skilled in the art can understand that the structure shown in FIG. 17 does not constitute a limitation on the terminal 1700, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
  • An embodiment of the present application also provides a computer device including a memory and a processor.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded by the processor to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
  • the embodiment of the present application also provides a storage medium, the storage medium being used to store a computer program, and the computer program being executed to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
  • this application also provides a computer program product, which, when run on a computer, causes the computer to execute the method for applying a scope in a virtual environment provided in the foregoing method embodiments.
  • a person of ordinary skill in the art can understand that all or part of the steps of the foregoing embodiments may be completed by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium. The medium may be the computer-readable storage medium included in the memory of the foregoing embodiments, or a computer-readable storage medium that exists alone and is not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
  • the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disc, or the like.
  • the random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method, apparatus, device, and storage medium for applying a scope in a virtual environment, relating to the field of virtual environments. The method includes: displaying a first environment interface, the first environment interface including a picture of observing the virtual environment from a person perspective and a scope control (501); receiving a touch operation on the scope control and generating a touch signal (502); displaying a second environment interface according to the touch signal, the second environment interface including a picture of observing the virtual environment through a scope accessory (503); and restoring display of the person-perspective picture when a touch end signal is received (504). A scope control is provided, and switching is performed between the person-perspective picture and the scope picture according to whether the touch operation on the scope control continues; only a single touch on the scope control is needed to switch between the person-perspective picture and the scope picture, which improves the efficiency and accuracy of observing the virtual environment.

Description

Method for applying a scope in a virtual environment and related apparatus
This application claims priority to Chinese Patent Application No. 201910780269.9, entitled "Method, apparatus, device, and storage medium for applying a scope in a virtual environment" and filed with the China National Intellectual Property Administration on August 22, 2019, which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of this application relate to the field of virtual environments, and in particular to the application of a scope in a virtual environment.
Background
In an application that includes a virtual environment, the user usually needs to control a virtual object to move about in the virtual environment. The user can control the virtual object to run, walk, stand, lie flat, crawl, and assume other postures in a three-dimensional virtual environment. When observing the virtual environment, the user can observe it from the first-person or third-person perspective of the virtual object (that is, observe it from a person perspective), or observe it through a scope accessory.
In the related art, switching between the person-perspective observation mode and the scope-accessory observation mode is performed through a scope control; that is, when the virtual environment is observed from a person perspective, tapping the scope control switches the observation mode to observation through the scope accessory, and when the virtual environment is observed through the scope, tapping the scope control switches the observation mode back to the person perspective.
Summary
Embodiments of this application provide a method for applying a scope in a virtual environment and a related apparatus, which can solve the problem that observing the virtual environment is inefficient and inaccurate. The technical solutions are as follows:
In one aspect, a method for applying a scope in a virtual environment is provided, the method including:
displaying a first environment interface, the first environment interface including a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further including a scope control;
receiving a touch operation on the scope control, and generating a touch signal;
displaying a second environment interface according to the touch signal, the second environment interface including a scope picture of observing the virtual environment through the scope accessory; and
restoring display of the person-perspective picture when a touch end signal of the touch operation is received.
In another aspect, an apparatus for applying a scope in a virtual environment is provided, the apparatus including:
a display module, configured to display a first environment interface, the first environment interface including a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further including a scope control;
a receiving module, configured to receive a touch operation on the scope control and generate a touch signal;
the display module being further configured to display a second environment interface according to the touch signal, the second environment interface including a scope picture of observing the virtual environment through the scope accessory; and
the receiving module being further configured to restore display of the person-perspective picture when a touch end signal of the touch operation is received.
In another aspect, a computer device is provided, the computer device including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
In another aspect, a storage medium is provided, the storage medium being used to store a computer program, and the computer program being executed to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
In another aspect, a computer program product is provided, which, when run on a computer, causes the computer to execute the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
The beneficial effects of the technical solutions provided in the embodiments of this application include at least the following:
A scope control is provided, and the corresponding touch signal and touch end signal are determined according to whether the touch operation on the scope control continues, so that, between the person-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory, only a single touch on the scope control is needed to switch. This improves the efficiency of opening the scope accessory, and therefore the efficiency and accuracy of observing the virtual environment.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic interface diagram of applying a scope accessory in a virtual environment provided in the related art;
FIG. 2 is a schematic interface diagram of applying a scope accessory in a virtual environment provided by an exemplary embodiment of this application;
FIG. 3 is a structural block diagram of a terminal provided by an exemplary embodiment of this application;
FIG. 4 is a schematic diagram of an implementation environment provided by an exemplary embodiment of this application;
FIG. 5 is a flowchart of a method for applying a scope in a virtual environment provided by an exemplary embodiment of this application;
FIG. 6 is a schematic display diagram of a scope accessory provided on the basis of the embodiment shown in FIG. 5;
FIG. 7 is a schematic diagram of determining the correspondence between a touch operation and a scope control provided on the basis of the embodiment shown in FIG. 5;
FIG. 8 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of this application;
FIG. 9 is a schematic interface diagram corresponding to the hybrid control mode provided on the basis of the embodiment shown in FIG. 8;
FIG. 10 is a schematic timeline diagram corresponding to the hybrid control mode provided on the basis of the embodiment shown in FIG. 8;
FIG. 11 is a flowchart corresponding to the tap control mode provided by an exemplary embodiment of this application;
FIG. 12 is a flowchart corresponding to the long-press control mode provided by an exemplary embodiment of this application;
FIG. 13 is a flowchart corresponding to the hybrid control mode provided by an exemplary embodiment of this application;
FIG. 14 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of this application;
FIG. 15 is a structural block diagram of an apparatus for applying a scope in a virtual environment provided by an exemplary embodiment of this application;
FIG. 16 is a structural block diagram of an apparatus for applying a scope in a virtual environment provided by another exemplary embodiment of this application;
FIG. 17 is a structural block diagram of a terminal provided by an exemplary embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the following further describes implementations of this application in detail with reference to the accompanying drawings.
First, the terms involved in the embodiments of this application are briefly introduced:
Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment; the following embodiments use a three-dimensional virtual environment as an example, without limitation. Optionally, the virtual environment is also used for virtual-environment battles between at least two virtual characters. Optionally, the virtual environment is also used for battles fought with virtual firearms between at least two virtual characters. Optionally, the virtual environment is also used for battles fought with virtual firearms between at least two virtual characters within a target region, and the target region keeps shrinking as time passes in the virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual person, a virtual animal, an anime character, and the like, for example, a person, animal, plant, oil drum, wall, or stone displayed in a three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in the three-dimensional virtual environment.
Virtual firearm: a virtual weapon that attacks by firing bullets in the virtual environment. The virtual object can pick up virtual firearms in the virtual environment and attack with the picked-up firearms. Optionally, each virtual firearm may be provided with at least one slot, each slot being used for fitting at least one firearm accessory. For example, an M416 automatic rifle is usually provided with a muzzle slot, a grip slot, a magazine slot, a stock slot, and a scope slot, where the scope slot can be fitted with any one of a red dot sight, a holographic sight, a 2x scope, a 4x scope, and an 8x scope. When holding the M416 automatic rifle and opening the scope, the player can observe the virtual environment with a degree of magnification determined by the scope fitted on the firearm.
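As a rough illustration of the slot-and-accessory structure just described, the following Python sketch models a firearm with the M416's slots and the scope accessories listed above; the class names, magnification mapping, and API are assumptions made for illustration, not part of the patent:

```python
from dataclasses import dataclass, field

# Scope accessories named above, with the magnification applied when aiming:
# red dot and holographic sights are "basic scopes" that observe at 1x.
SCOPES = {"red dot": 1, "holographic": 1, "2x scope": 2, "4x scope": 4, "8x scope": 8}

@dataclass
class VirtualFirearm:
    name: str
    # Slots of the M416 example: muzzle, grip, magazine, stock, scope.
    slots: dict = field(default_factory=lambda: {
        "muzzle": None, "grip": None, "magazine": None, "stock": None, "scope": None,
    })

    def equip_scope(self, scope_name: str) -> None:
        if scope_name not in SCOPES:
            raise ValueError(f"unknown scope accessory: {scope_name}")
        self.slots["scope"] = scope_name

    def magnification(self) -> int:
        # With no scope fitted, the mechanical sight observes at 1x.
        return SCOPES.get(self.slots["scope"], 1)

rifle = VirtualFirearm("M416")
rifle.equip_scope("4x scope")
assert rifle.magnification() == 4
```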
Optionally, the embodiments of this application are described by taking the virtual prop being implemented as this virtual firearm as an example; the virtual prop is equipped with a scope accessory, that is, a scope accessory is fitted in the scope slot of the virtual firearm. Optionally, the scope accessory may be the mechanical sight that comes with the virtual firearm itself, or a scope picked up by the virtual object in the virtual environment.
The method provided in this application may be applied to a virtual reality application, a three-dimensional map program, a military simulation program, a first-person shooting game (FPS), a multiplayer online battle arena game (MOBA), and the like. The following embodiments use the application in a game as an example.
A game based on a virtual environment usually consists of maps of one or more game worlds. The virtual environment in the game simulates real-world scenes, and the user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch virtual weapons, attack other virtual objects with a virtual weapon, and perform other actions in the virtual environment, which is highly interactive; moreover, multiple users can form teams online for competitive games. When the user controls the virtual object to attack a target virtual object with a virtual weapon, the user selects a suitable virtual weapon according to the position of the target virtual object or the user's operating habits.
In the related art, when switching the way the virtual environment is observed, a scope control is usually provided in the environment interface. As shown in FIG. 1, the first environment interface displays a person-perspective picture 100 of observing the virtual environment from the first-person perspective of the virtual object. The virtual object holds a virtual prop (for example, a virtual firearm) equipped with a scope accessory, and a scope control 110 is superimposed on the person-perspective picture 100. After the user taps the scope control 110, a scope picture 120 is displayed; the scope picture 120 is a picture of observing the virtual environment through the scope accessory, with the scope control 110 superimposed on it. After the scope accessory is opened, the scope control 110 is displayed as the cross shape shown in FIG. 1. When the user taps the scope control 110 again, the scope accessory is closed, and the person-perspective picture 100 is displayed again from the first-person perspective.
An embodiment of this application provides a method for applying a scope in a virtual environment. FIG. 2 shows a schematic diagram of the user interface of the method provided in this embodiment of this application. As shown in FIG. 2:
The first environment interface displays a person-perspective picture 200 of the virtual object observing the virtual environment from the first-person perspective. The virtual object holds a virtual prop (for example, a virtual firearm) equipped with a scope accessory, and a scope control 210 is superimposed on the person-perspective picture 200. After the user touches the scope control 210, a scope picture 220 is displayed; the scope picture 220 is a picture of observing the virtual environment through the scope accessory. While the user keeps touching the scope control 210, the terminal keeps displaying the scope picture 220, and when the user ends the touch on the scope control 210, the terminal closes the scope accessory and restores display of the person-perspective picture 200. It is worth noting that the content of the person-perspective picture 200 may have changed, for example, because the virtual object moved or objects in the virtual scene were displaced, or it may be unchanged.
The terminal in this application may be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. An application supporting a virtual environment, such as an application supporting a three-dimensional virtual environment, is installed and runs on the terminal. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and a MOBA game. Optionally, the application may be a standalone application, such as a standalone 3D game program, or a network online application.
FIG. 3 shows a structural block diagram of an electronic device provided by an exemplary embodiment of this application. The electronic device 300 includes an operating system 320 and an application 322.
The operating system 320 is the base software that provides the application 322 with secure access to the computer hardware.
The application 322 is an application supporting a virtual environment. Optionally, the application 322 is an application supporting a three-dimensional virtual environment. The application 322 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooting game (TPS), an FPS, a MOBA game, and a multiplayer gunfight survival game. The application 322 may be a standalone application, such as a standalone 3D game program.
FIG. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of this application. The computer system 400 includes a first device 420, a server 440, and a second device 460.
An application supporting a virtual environment is installed and runs on the first device 420. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first device 420 is a device used by a first user, and the first user uses the first device 420 to control a first virtual object in the virtual environment to perform activities, including but not limited to at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Schematically, the first virtual object is a first virtual character, such as a simulated character or an anime character.
The first device 420 is connected to the server 440 through a wireless network or a wired network.
The server 440 includes at least one of one server, multiple servers, a cloud computing platform, and a virtualization center. The server 440 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 440 undertakes the primary computing work and the first device 420 and the second device 460 undertake the secondary computing work; or the server 440 undertakes the secondary computing work and the first device 420 and the second device 460 undertake the primary computing work; or the server 440, the first device 420, and the second device 460 perform collaborative computing using a distributed computing architecture.
An application supporting a virtual environment is installed and runs on the second device 460. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second device 460 is a device used by a second user, and the second user uses the second device 460 to control a second virtual object in the virtual environment to perform activities, including but not limited to at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Schematically, the second virtual object is a second virtual character, such as a simulated character or an anime character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permissions. Optionally, the first virtual character and the second virtual character may also belong to different teams, different organizations, or two hostile groups.
Optionally, the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of application on different control-system platforms. The first device 420 may generally refer to one of multiple devices, the second device 460 may generally refer to one of multiple devices, and the first device 420 and the second device 460 may be the foregoing terminal. This embodiment is described using only the first device 420 and the second device 460 as examples. The device types of the first device 420 and the second device 460 are the same or different, and include at least one of a game console, a desktop computer, a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are described using a desktop computer as an example.
A person skilled in the art will know that the number of the foregoing devices may be larger or smaller; for example, there may be only one device, or dozens, hundreds, or more devices. The number of devices and the device types are not limited in the embodiments of this application.
With reference to the foregoing term introductions and implementation environment description, the method for applying a scope in a virtual environment provided in the embodiments of this application is described by taking its application to a terminal as an example. As shown in FIG. 5, the method includes:
Step 501: Display a first environment interface, the first environment interface including a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further including a scope control.
Optionally, the person-perspective picture may be a picture displayed from the first-person perspective of the virtual object, or a picture displayed from the third-person perspective of the virtual object. The first-person perspective corresponds to the picture that the virtual object itself can observe in the virtual environment; the picture corresponding to the first-person perspective does not include the virtual object itself, so that, for example, only the virtual object's arms and the virtual firearm can be seen. The third-person perspective is the perspective from which the virtual object is observed through a camera model in the virtual environment; the picture corresponding to the third-person perspective includes the virtual object itself, and the camera model usually observes the virtual object from behind it, so that, for example, the three-dimensional model of the virtual object and the virtual prop it holds (such as a virtual firearm) can be seen.
Optionally, when the person-perspective picture is displayed from the first-person or third-person perspective of the virtual object, the person-perspective picture further includes a prop status bar that displays the scope accessory fitted on the virtual prop. Taking the first-person perspective as an example, as shown in FIG. 6, FIG. 6 is a schematic interface diagram of a person-perspective picture 61 displayed from the first-person perspective of the virtual object; the person-perspective picture 61 includes a prop status bar 62, which displays the scope accessory fitted on the virtual prop, "Red Dot", that is, a red dot sight.
Optionally, when the person-perspective picture is displayed from the third-person perspective of the virtual object, the person-perspective picture further includes the virtual prop, on which the fitted scope accessory is displayed.
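A rough sketch of the difference between the two person perspectives as described above; the camera placement offsets are illustrative assumptions, since the patent only states that the third-person camera model usually observes from behind the virtual object:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def camera_position(object_pos: Vec3, mode: str,
                    back_offset: float = 2.0, up_offset: float = 1.5) -> Vec3:
    """First person: the camera sits at the virtual object's eye level, so the
    object itself is not in frame. Third person: the camera model hangs behind
    and above the object, so the object's three-dimensional model stays visible."""
    if mode == "first_person":
        return Vec3(object_pos.x, object_pos.y + up_offset, object_pos.z)
    return Vec3(object_pos.x, object_pos.y + up_offset, object_pos.z - back_offset)
```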
Step 502: Receive a touch operation on the scope control, and generate a touch signal.
Optionally, the method is applied to a terminal with a touch display screen. The touch operation on the scope control is received on the touch display screen, and a touch signal is generated; the touch signal is a signal generated when the touch operation starts and indicates the start of the touch operation.
Optionally, a touch operation on the touch display screen is received, a first coordinate corresponding to the touch operation on the touch display screen is determined, and a second coordinate corresponding to the center point of the scope control on the touch display screen is determined. When the distance between the first coordinate and the second coordinate is less than a required distance, the touch operation is determined as a touch operation acting on the scope control.
Optionally, the touch display screen corresponds to a target coordinate system, in which the first coordinate corresponding to the touch operation and the second coordinate corresponding to the center point of the scope control are determined. Schematically, taking point A as the point of the touch operation and point B as the center point of the scope control, when the coordinates of point A are (x1, y1) and the coordinates of point B are (x2, y2), the distance between point A and point B is calculated by Formula 1 below:
Formula 1:
|AB| = √((x2 - x1)² + (y2 - y1)²)
Schematically, referring to FIG. 7, the coordinates of point A are (-20, 20) and the coordinates of point B are (20, -10), so the distance between A and B is 50. When the required distance is 100, the distance between A and B is determined to be less than the required distance, that is, the touch operation is a touch operation acting on the scope control.
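This hit test reduces to a point-in-circle check around the control's center. A minimal Python sketch, in which the function name and the 100-unit required distance are assumptions for illustration:

```python
import math

REQUIRED_DISTANCE = 100.0  # illustrative threshold; the patent leaves the value open

def is_touch_on_scope_control(touch_xy, control_center_xy,
                              required_distance=REQUIRED_DISTANCE):
    """Return True if the touch counts as acting on the scope control.

    Implements Formula 1: the Euclidean distance between the touch point A
    and the control's center point B, compared against the required distance.
    """
    (x1, y1), (x2, y2) = touch_xy, control_center_xy
    distance = math.hypot(x2 - x1, y2 - y1)
    return distance < required_distance

# Worked example from FIG. 7: A(-20, 20), B(20, -10) gives distance 50 < 100.
assert is_touch_on_scope_control((-20, 20), (20, -10))
```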
Step 503: Display a second environment interface according to the touch signal, the second environment interface including a scope picture of observing the virtual environment through the scope accessory.
Optionally, the scope accessory fitted on the virtual prop may be at least one of a mechanical sight, a basic scope, a low-magnification scope, and a high-magnification scope. The mechanical sight is the sight that comes with the virtual prop itself; that is, when no other scope is fitted in the scope slot of the virtual prop, the virtual environment can be observed through the mechanical sight. The basic scope is a scope accessory that provides no magnification when observing the virtual environment, or, in other words, observes the virtual environment at 1x magnification; optionally, the basic scope includes at least one of a red dot sight and a holographic sight. The low-magnification scope is a scope accessory with a low magnification for observing the virtual environment, such as a 2x or 3x scope; the high-magnification scope is a scope accessory with a high magnification for observing the virtual environment, such as a 4x, 6x, 8x, or 15x scope.
Step 504: Restore display of the person-perspective picture when a touch end signal of the touch operation is received.
Optionally, the user's touch on the scope control on the touch display screen serves as the touch operation on the scope control, and when the user releases the touch on the scope control, the terminal generates the touch end signal.
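A minimal sketch of the hold-to-open behavior of steps 501 to 504; the handler names and the renderer object are illustrative assumptions, not an API defined by the patent:

```python
class ScopeController:
    """Touch signal opens the scope picture; touch end signal restores
    the person-perspective picture."""

    def __init__(self, renderer):
        self.renderer = renderer  # assumed object exposing the two display methods
        self.scope_open = False

    def on_touch_signal(self):            # step 502 leading to step 503
        self.scope_open = True
        self.renderer.show_scope_picture()

    def on_touch_end_signal(self):        # step 504
        self.scope_open = False
        self.renderer.show_person_perspective_picture()
```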
To sum up, in the method for applying a scope in a virtual environment provided in this embodiment of this application, a scope control is provided, and switching is performed between the person-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory according to whether the touch operation on the scope control continues. Only a single touch on the scope control is needed to switch between the person-perspective picture and the scope picture, which improves the efficiency of opening the scope accessory and therefore the efficiency and accuracy of observing the virtual environment.
In an optional embodiment, different scope control modes are distinguished according to whether the touch duration of the touch operation reaches a required duration. FIG. 8 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of this application, described by taking its application to a terminal as an example. As shown in FIG. 8, the method includes:
Step 801: Display a first environment interface, the first environment interface including a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further including a scope control.
Optionally, the person-perspective picture is a picture displayed from the first-person perspective of the virtual object, or may be a picture displayed from the third-person perspective of the virtual object.
Step 802: Receive a touch operation on the scope control, and generate a touch signal.
Optionally, the method is applied to a terminal with a touch display screen, and the touch operation on the scope control is received on the touch display screen.
Optionally, before the touch operation on the scope control is received, the scope control mode may also be set in a function setting interface, the scope control mode including any one of a tap control mode, a long-press control mode, and a hybrid control mode. Optionally, the tap control mode opens the scope accessory by tapping the scope control and closes the scope accessory by tapping the scope control again; the long-press control mode opens the scope accessory by long-pressing the scope control and closes the scope accessory when the long-press operation ends; the hybrid control mode selects between the tap control mode and the long-press control mode by using the required duration as a critical duration.
Optionally, the first environment interface further includes a function setting control used to open the function setting interface. A trigger operation on the function setting control is received, and the function setting interface is displayed according to the trigger operation. The function setting interface includes a scope setting option, and the scope setting option includes a tap control option, a long-press control option, and a hybrid control option, corresponding respectively to the tap control mode, the long-press control mode, and the hybrid control mode described above. Optionally, an enabling operation on the hybrid control option is received, the hybrid control option being used to indicate that the manner of closing the scope accessory is determined by using the required duration as a critical duration.
Step 803: Display a second environment interface according to the touch signal, the second environment interface including a scope picture of observing the virtual environment through the scope accessory.
Optionally, the scope accessory fitted on the virtual prop may be at least one of a mechanical sight, a basic scope, a low-magnification scope, and a high-magnification scope.
Step 804: Determine a touch duration of the touch operation when a touch end signal of the touch operation is received.
Step 805: When the touch duration reaches the required duration, close the scope accessory according to the touch end signal and restore display of the person-perspective picture.
Optionally, when the touch duration of the touch operation reaches the required duration and no touch end signal has been received, the long-press control mode is determined as the control mode for the scope accessory; that is, the scope accessory is closed when the touch operation on the scope control ends.
Optionally, the required duration is a preset duration. Schematically, if the required duration is 1 second, then when the touch duration of the touch operation reaches 1 second and no touch end signal has been received, the touch end signal is determined as the signal controlling the scope accessory to close.
Step 806: When the touch duration does not reach the required duration, determine a tap signal on the scope control as the signal controlling the scope accessory to close.
Optionally, when the touch duration of the touch operation does not reach the required duration and a touch end signal is received, the touch operation can be regarded as a tap operation.
Step 807: Close the scope accessory and restore display of the person-perspective picture when the tap signal on the scope control is received.
Schematically, referring to FIG. 9, the first environment interface displays a person-perspective picture 900 of observing the virtual environment from the first-person perspective of the virtual object, with a scope control 910 superimposed on it. In the first branch, the user performs a touch operation on the scope control 910, a scope picture 920 is displayed according to the touch operation, and the touch duration of the touch operation reaches the required duration, so when the touch operation ends, display of the person-perspective picture 900 is restored. In the second branch, the user performs a touch operation on the scope control 910, the scope picture 920 is displayed according to the touch operation, and the touch operation ends before the touch duration reaches the required duration; then, when a tap operation on the scope control 910 is received again, display of the person-perspective picture 900 is restored. It is worth noting that the content of the person-perspective picture 900 may be changed or unchanged.
Schematically, referring to FIG. 10, on the timeline, time point S is the moment the user touches the scope control, time point A is the moment a first user ends the touch, time point B is the moment a second user ends the touch, and time point O is the critical moment corresponding to the required duration. The first user therefore needs to tap the scope control again after opening the scope accessory to close it, while the second user closes the scope accessory by ending the touch at time point B.
To sum up, in the method for applying a scope in a virtual environment provided in this embodiment of this application, a scope control is provided, and the corresponding touch signal and touch end signal are determined according to whether the touch operation on the scope control continues, so that, between the person-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory, only a single touch on the scope control is needed to switch. This improves the efficiency of opening the scope accessory, and therefore the efficiency and accuracy of observing the virtual environment.
In the method provided in this embodiment, a required duration is set and used as the critical duration between the tap control mode and the long-press control mode, so the user can determine the control mode of the scope accessory with a single touch operation, which improves control efficiency.
Schematically, the tap control mode, the long-press control mode, and the hybrid control mode are described below with reference to the accompanying drawings:
First, for the implementation of the tap control mode, refer to FIG. 11. The process includes the following steps:
Step 1101: The scope accessory is in the closed state. Optionally, the environment interface displayed by the terminal includes a picture of observing the virtual environment from the first-person or third-person perspective of the virtual object. Step 1102: Determine whether the user touches the scope control. Step 1103: When the user touches the scope control, open the scope accessory. Optionally, after the scope accessory is opened, the virtual environment is observed through the scope accessory. Step 1104: The user ends the touch on the scope control. Step 1105: Keep the scope open. Step 1106: Determine whether the user taps the scope control again. Step 1107: When the user taps the scope control again, close the scope accessory.
Second, for the implementation of the long-press control mode, refer to FIG. 12. The process includes the following steps:
Step 1201: The scope accessory is in the closed state. Optionally, the environment interface displayed by the terminal includes a picture of observing the virtual environment from the first-person or third-person perspective of the virtual object. Step 1202: Determine whether the user touches the scope control. Step 1203: When the user touches the scope control, open the scope accessory. Optionally, after the scope accessory is opened, the virtual environment is observed through the scope accessory. Step 1204: Determine whether the user has ended the touch operation on the scope control. Step 1205: When the user ends the touch operation on the scope control, close the scope accessory. Step 1206: When the user has not ended the touch operation on the scope control, keep the scope accessory open.
Third, for the implementation of the hybrid control mode, refer to FIG. 13. The process includes the following steps:
Step 1301: Select the hybrid control mode. Step 1302: Determine whether the user touches the scope control. Step 1303: When the user touches the scope control, open the scope accessory. Step 1304: Record the touch duration of the touch operation. Step 1305: Determine whether the user has ended the touch on the scope control. Step 1306: When the user ends the touch on the scope control, determine whether the long-press control mode is triggered. Step 1307: When the long-press control mode is triggered, close the scope accessory. Step 1308: When the long-press control mode is not triggered, keep the scope accessory open. Step 1309: Determine whether a tap operation on the scope control is received again. Optionally, when a tap operation on the scope control is received again, close the scope accessory.
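Read together, the three flowcharts amount to a single touch handler parameterized by the control mode. The Python sketch below is one possible reading of FIG. 11 to FIG. 13; the mode enum, the 1-second required duration taken from the embodiment above, and all names are assumptions for illustration:

```python
import time
from enum import Enum, auto

class ControlMode(Enum):
    TAP = auto()         # FIG. 11: a touch opens, a second tap closes
    LONG_PRESS = auto()  # FIG. 12: a touch opens, release closes
    HYBRID = auto()      # FIG. 13: the required duration decides between the two

REQUIRED_DURATION = 1.0  # seconds; the example value given in the embodiment

class ScopeControlHandler:
    def __init__(self, mode=ControlMode.HYBRID):
        self.mode = mode
        self.scope_open = False
        self.touch_started_at = None

    def on_touch_begin(self):
        if self.scope_open and self.mode is not ControlMode.LONG_PRESS:
            # In tap mode, or in hybrid mode after a short touch, a new tap closes.
            self.scope_open = False
            return
        self.scope_open = True
        self.touch_started_at = time.monotonic()

    def on_touch_end(self):
        if self.touch_started_at is None:
            return
        held = time.monotonic() - self.touch_started_at
        self.touch_started_at = None
        if self.mode is ControlMode.LONG_PRESS:
            self.scope_open = False  # release always closes (steps 1204-1205)
        elif self.mode is ControlMode.HYBRID and held >= REQUIRED_DURATION:
            self.scope_open = False  # long-press branch of hybrid mode (step 1307)
        # Otherwise (tap mode, or a hybrid touch shorter than the required
        # duration) the scope stays open until the next tap on the control.
```

Under this reading, the hybrid mode needs no extra gesture recognition: the same touch begin and touch end events drive both branches, and only the recorded duration decides which branch applies.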
In an optional embodiment, after the scope accessory is opened and while the virtual environment is observed through the scope, the view direction can be adjusted by a drag operation. FIG. 14 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of this application, described by taking its application to a terminal as an example. As shown in FIG. 14, the method includes:
Step 1401: Display a first environment interface, the first environment interface including a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further including a scope control.
Optionally, the person-perspective picture is a picture displayed from the first-person perspective of the virtual object, or may be a picture displayed from the third-person perspective of the virtual object.
Step 1402: Receive a touch operation on the scope control, and generate a touch signal.
Optionally, the method is applied to a terminal with a touch display screen, and the touch operation on the scope control is received on the touch display screen.
Optionally, before the touch operation on the scope control is received, the scope control mode may also be set in a function setting interface, the scope control mode including any one of a tap control mode, a long-press control mode, and a hybrid control mode. Optionally, the tap control mode opens the scope accessory by tapping the scope control and closes the scope accessory by tapping the scope control again; the long-press control mode opens the scope accessory by long-pressing the scope control and closes the scope accessory when the long-press operation ends; the hybrid control mode selects between the tap control mode and the long-press control mode by using the required duration as a critical duration.
Optionally, the first environment interface further includes a function setting control used to open the function setting interface. A trigger operation on the function setting control is received, and the function setting interface is displayed according to the trigger operation. The function setting interface includes a scope setting option, and the scope setting option includes a tap control option, a long-press control option, and a hybrid control option, corresponding respectively to the tap control mode, the long-press control mode, and the hybrid control mode described above. Optionally, an enabling operation on the hybrid control option is received, the hybrid control option being used to indicate that the manner of closing the scope accessory is determined by using the required duration as a critical duration.
Step 1403: Display a second environment interface according to the touch signal, the second environment interface including a scope picture of observing the virtual environment through the scope accessory.
Optionally, the scope accessory fitted on the virtual prop may be at least one of a mechanical sight, a basic scope, a low-magnification scope, and a high-magnification scope.
Optionally, the scope picture is a picture of observing the virtual environment through the scope accessory in a first view direction.
Step 1404: Receive a drag operation determined according to the touch operation.
Optionally, after the touch operation is received and before it ends, a drag operation that continues from the touch operation is received.
Step 1405: Adjust the first view direction according to the drag operation to obtain a second view direction.
Optionally, the first view direction is the view direction in which the camera model observes the virtual environment in the form of the scope accessory. According to the drag direction and drag distance of the drag operation, the camera model is rotated correspondingly in the virtual environment, and the rotated view direction is taken as the second view direction. Optionally, the second view direction is also a view direction in which the camera model observes the virtual environment in the form of the scope accessory.
Optionally, when the virtual environment is not observed through the scope accessory, the speed of adjusting the view direction is a first adjustment speed, whereas according to the foregoing drag operation the first view direction is adjusted at a second adjustment speed to obtain the second view direction, the second adjustment speed being slower than the first adjustment speed.
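A sketch of this two-speed view adjustment in steps 1404 and 1405; the sensitivity constants and the camera interface are illustrative assumptions, since the patent only requires that the second adjustment speed be slower than the first:

```python
FIRST_ADJUSTMENT_SPEED = 0.30   # degrees per pixel when not observing through the scope (assumed)
SECOND_ADJUSTMENT_SPEED = 0.10  # slower speed used while observing through the scope (assumed)

def apply_drag(camera, drag_dx: float, drag_dy: float, scope_open: bool) -> None:
    """Rotate the camera model according to the drag direction and distance.

    While the virtual environment is observed through the scope accessory,
    the slower second adjustment speed is used, so the same drag distance
    turns the view direction by a smaller angle.
    """
    speed = SECOND_ADJUSTMENT_SPEED if scope_open else FIRST_ADJUSTMENT_SPEED
    camera.yaw += drag_dx * speed
    camera.pitch = max(-89.0, min(89.0, camera.pitch + drag_dy * speed))
```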
Step 1406: Display a third environment interface, the third environment interface including a picture of observing the virtual environment through the scope accessory in the second view direction.
Step 1407: Close the scope accessory and restore display of the person-perspective picture when a touch end signal of the touch operation is received.
Optionally, the user's touch on the scope control on the touch display screen serves as the touch operation on the scope control, and when the user releases the touch on the scope control, the terminal generates the touch end signal.
To sum up, in the method for applying a scope in a virtual environment provided in this embodiment of this application, a scope control is provided, and switching is performed between the person-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory according to whether the touch operation on the scope control continues; only a single touch on the scope control is needed to switch between the person-perspective picture and the scope picture, which improves the efficiency of opening the scope accessory and therefore the efficiency and accuracy of observing the virtual environment.
In the method provided in this embodiment, when the scope accessory is opened and the virtual environment is observed through it, the view direction for observing the virtual environment is adjusted by receiving a drag operation that continues from the touch operation; that is, opening the scope accessory, adjusting the view, and closing the scope accessory are all achieved within a single touch event, which improves the application efficiency of the scope accessory.
FIG. 15 is a structural block diagram of an apparatus for applying a scope in a virtual environment provided by an exemplary embodiment of this application, described by taking the apparatus being configured in a terminal as an example. As shown in FIG. 15, the apparatus includes a display module 1510 and a receiving module 1520;
the display module 1510, configured to display a first environment interface, the first environment interface including a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further including a scope control;
the receiving module 1520, configured to receive a touch operation on the scope control and generate a touch signal;
the display module 1510 being further configured to display a second environment interface according to the touch signal, the second environment interface including a scope picture of observing the virtual environment through the scope accessory;
the receiving module 1520 being further configured to restore display of the person-perspective picture when a touch end signal of the touch operation is received.
In an optional embodiment, the receiving module 1520 is further configured to determine a touch duration of the touch operation when the touch end signal of the touch operation is received, and, when the touch duration reaches a required duration, close the scope accessory according to the touch end signal and restore display of the person-perspective picture.
In an optional embodiment, as shown in FIG. 16, the apparatus further includes:
a determination module 1530, configured to determine, when the touch duration does not reach the required duration, a tap signal on the scope control as the signal controlling the scope accessory to close;
the receiving module 1520 being further configured to close the scope accessory when the tap signal on the scope control is received;
the display module 1510 being further configured to restore display of the person-perspective picture.
In an optional embodiment, the receiving module 1520 is further configured to receive a trigger operation on a function setting control;
the display module 1510 is further configured to display a function setting interface according to the trigger operation, the function setting interface including a scope setting option, and the scope setting option including a hybrid control option;
the receiving module 1520 is further configured to receive an enabling operation on the hybrid control option, the hybrid control option being used to indicate that the manner of closing the scope accessory is determined by using the required duration as a critical duration.
In an optional embodiment, the scope picture is a picture of observing the virtual environment through the scope accessory in a first view direction;
the receiving module 1520 is further configured to receive a drag operation determined according to the touch operation;
the apparatus further includes:
an adjustment module 1540, configured to adjust the first view direction according to the drag operation to obtain a second view direction;
the display module 1510 being further configured to display a third environment interface, the third environment interface including a picture of observing the virtual environment through the scope accessory in the second view direction.
In an optional embodiment, when the virtual environment is not observed through the scope accessory, the speed of adjusting the view direction is a first adjustment speed;
the adjustment module 1540 is further configured to adjust the first view direction at a second adjustment speed according to the drag operation, the second adjustment speed being slower than the first adjustment speed.
In an optional embodiment, the receiving module 1520 is further configured to receive the touch operation on a touch display screen;
the apparatus further includes:
a determination module 1530, configured to determine a first coordinate corresponding to the touch operation on the touch display screen, determine a second coordinate corresponding to the center point of the scope control on the touch display screen, and determine the touch operation as a touch operation acting on the scope control when the distance between the first coordinate and the second coordinate is less than a required distance.
To sum up, in the apparatus for applying a scope in a virtual environment provided in this embodiment of this application, a scope control is provided, and the corresponding touch signal and touch end signal are determined according to whether the touch operation on the scope control continues, so that, between the person-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory, only a single touch on the scope control is needed to switch. This improves the efficiency of opening the scope accessory, and therefore the efficiency and accuracy of observing the virtual environment.
It should be noted that the apparatus for applying a scope in a virtual environment provided in the foregoing embodiments is described only by taking the division of the functional modules above as an example. In practical applications, the foregoing functions may be allocated to different functional modules as needed; that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus embodiments and the method embodiments for applying a scope in a virtual environment provided in the foregoing embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not repeated here.
FIG. 17 shows a structural block diagram of a terminal 1700 provided by an exemplary embodiment of this application. The terminal 1700 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1700 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
Generally, the terminal 1700 includes a processor 1701 and a memory 1702.
The processor 1701 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 1701 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1701 may also include a main processor and a coprocessor. The main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1702 may include one or more computer-readable storage media, which may be non-transitory. The memory 1702 may further include a high-speed random access memory and a non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1702 is used to store at least one instruction to be executed by the processor 1701 to implement the method for applying a scope in a virtual environment provided in the method embodiments of this application.
In some embodiments, the terminal 1700 optionally further includes a peripheral device interface 1703 and at least one peripheral device. The processor 1701, the memory 1702, and the peripheral device interface 1703 may be connected through a bus or a signal line, and each peripheral device may be connected to the peripheral device interface 1703 through a bus, a signal line, or a circuit board. Specifically, the peripheral devices include at least one of a radio frequency circuit 1704, a touch display screen 1705, a camera 1706, an audio circuit 1707, a positioning component 1708, and a power supply 1709.
The peripheral device interface 1703 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 1701 and the memory 1702. In some embodiments, the processor 1701, the memory 1702, and the peripheral device interface 1703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1701, the memory 1702, and the peripheral device interface 1703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1704 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1704 communicates with communication networks and other communication devices through electromagnetic signals, converting electrical signals into electromagnetic signals for transmission or converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1704 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1704 can communicate with other terminals through at least one wireless communication protocol, including but not limited to the World Wide Web, metropolitan area networks, intranets, the generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1704 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1705 is used to display a UI (User Interface), which may include graphics, text, icons, videos, and any combination thereof. When the display screen 1705 is a touch display screen, it also has the ability to collect touch signals on or above its surface; such a touch signal may be input to the processor 1701 as a control signal for processing, and the display screen 1705 may then also be used to provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1705, provided on the front panel of the terminal 1700; in other embodiments, there may be at least two display screens 1705, respectively arranged on different surfaces of the terminal 1700 or in a folded design; in still other embodiments, the display screen 1705 may be a flexible display screen arranged on a curved surface or a folding surface of the terminal 1700. The display screen 1705 can even be set in a non-rectangular irregular shape, that is, a special-shaped screen. The display screen 1705 may be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
The camera assembly 1706 is used to capture images or videos. Optionally, the camera assembly 1706 includes a front camera and a rear camera. Usually, the front camera is arranged on the front panel of the terminal and the rear camera is arranged on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a bokeh function, the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions, or other fused shooting functions are realized. In some embodiments, the camera assembly 1706 may also include a flash. The flash may be a single color temperature flash or a dual color temperature flash; a dual color temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation under different color temperatures.
The audio circuit 1707 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment and convert them into electrical signals that are input to the processor 1701 for processing, or input to the radio frequency circuit 1704 to implement voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively arranged at different parts of the terminal 1700. The microphone may also be an array microphone or an omnidirectional collection microphone. The speaker is used to convert electrical signals from the processor 1701 or the radio frequency circuit 1704 into sound waves. The speaker may be a traditional membrane speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1707 may also include a headphone jack.
The positioning component 1708 is used to locate the current geographic position of the terminal 1700 to implement navigation or LBS (Location Based Service). The positioning component 1708 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of Europe.
The power supply 1709 is used to supply power to the components in the terminal 1700. The power supply 1709 may be alternating current, direct current, disposable batteries, or a rechargeable battery. When the power supply 1709 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also support fast charging technology.
In some embodiments, the terminal 1700 further includes one or more sensors 1710, including but not limited to an acceleration sensor 1711, a gyroscope sensor 1712, a pressure sensor 1713, a fingerprint sensor 1714, an optical sensor 1715, and a proximity sensor 1716.
The acceleration sensor 1711 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 1700. For example, the acceleration sensor 1711 can be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1701 may control the touch display screen 1705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1711. The acceleration sensor 1711 can also be used for collecting game or user motion data.
The gyroscope sensor 1712 can detect the body direction and rotation angle of the terminal 1700, and can cooperate with the acceleration sensor 1711 to collect the user's 3D actions on the terminal 1700. Based on the data collected by the gyroscope sensor 1712, the processor 1701 can implement functions such as motion sensing (for example, changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1713 may be arranged on the side frame of the terminal 1700 and/or the lower layer of the touch display screen 1705. When arranged on the side frame of the terminal 1700, the pressure sensor 1713 can detect the user's holding signal, and the processor 1701 performs left/right-hand recognition or quick operations according to the holding signal collected by the pressure sensor 1713. When arranged at the lower layer of the touch display screen 1705, the processor 1701 controls the operability controls on the UI according to the user's pressure operations on the touch display screen 1705. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1714 is used to collect the user's fingerprint. The processor 1701 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1714, or the fingerprint sensor 1714 identifies the user's identity according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 1701 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and so on. The fingerprint sensor 1714 may be provided on the front, back, or side of the terminal 1700. When a physical button or a manufacturer logo is provided on the terminal 1700, the fingerprint sensor 1714 can be integrated with the physical button or manufacturer logo.
The optical sensor 1715 is used to collect the ambient light intensity. In one embodiment, the processor 1701 may control the display brightness of the touch display screen 1705 according to the ambient light intensity collected by the optical sensor 1715: when the ambient light intensity is high, the display brightness of the touch display screen 1705 is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 1701 may also dynamically adjust the shooting parameters of the camera assembly 1706 according to the ambient light intensity collected by the optical sensor 1715.
The proximity sensor 1716, also called a distance sensor, is usually arranged on the front panel of the terminal 1700 and is used to collect the distance between the user and the front of the terminal 1700. In one embodiment, when the proximity sensor 1716 detects that the distance between the user and the front of the terminal 1700 gradually decreases, the processor 1701 controls the touch display screen 1705 to switch from the bright-screen state to the rest-screen state; when the proximity sensor 1716 detects that the distance gradually increases, the processor 1701 controls the touch display screen 1705 to switch from the rest-screen state to the bright-screen state.
Those skilled in the art can understand that the structure shown in FIG. 17 does not constitute a limitation on the terminal 1700, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
An embodiment of this application further provides a computer device, the computer device including a memory and a processor, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded by the processor to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
An embodiment of this application further provides a storage medium, the storage medium being used to store a computer program, and the computer program being executed to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
This application further provides a computer program product, which, when run on a computer, causes the computer to execute the method for applying a scope in a virtual environment provided in the foregoing method embodiments.
A person of ordinary skill in the art can understand that all or part of the steps in the methods of the foregoing embodiments may be completed by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium included in the memory of the foregoing embodiments, or a computer-readable storage medium that exists alone and is not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for applying a scope in a virtual environment provided in the foregoing embodiments of this application.
Optionally, the computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disc, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM). The sequence numbers of the foregoing embodiments of this application are merely for description and do not represent the superiority or inferiority of the embodiments.
A person of ordinary skill in the art can understand that all or part of the steps for implementing the foregoing embodiments may be completed by hardware, or by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The foregoing descriptions are merely preferred embodiments of this application and are not intended to limit this application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall fall within the protection scope of this application.

Claims (16)

  1. A method for applying a scope in a virtual environment, the method being executed by a terminal, the method comprising:
    displaying a first environment interface, the first environment interface comprising a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further comprising a scope control;
    receiving a touch operation on the scope control, and generating a touch signal;
    displaying a second environment interface according to the touch signal, the second environment interface comprising a scope picture of observing the virtual environment through the scope accessory; and
    restoring display of the person-perspective picture when a touch end signal of the touch operation is received.
  2. The method according to claim 1, wherein restoring display of the person-perspective picture when the touch end signal of the touch operation is received comprises:
    determining a touch duration of the touch operation when the touch end signal of the touch operation is received; and
    when the touch duration reaches a required duration, closing the scope accessory according to the touch end signal and restoring display of the person-perspective picture.
  3. The method according to claim 2, further comprising:
    when the touch duration does not reach the required duration, determining a tap signal on the scope control as the signal controlling the scope accessory to close; and
    closing the scope accessory and restoring display of the person-perspective picture when the tap signal on the scope control is received.
  4. The method according to claim 3, wherein before receiving the touch operation on the scope control, the method further comprises:
    receiving a trigger operation on a function setting control;
    displaying a function setting interface according to the trigger operation, the function setting interface comprising a scope setting option, and the scope setting option comprising a hybrid control option; and
    receiving an enabling operation on the hybrid control option, the hybrid control option being used for indicating that the manner of closing the scope accessory is determined by using the required duration as a critical duration.
  5. The method according to any one of claims 1 to 4, wherein the scope picture is a picture of observing the virtual environment through the scope accessory in a first view direction; and
    after displaying the second environment interface according to the touch operation, the method further comprises:
    receiving a drag operation determined according to the touch operation;
    adjusting the first view direction according to the drag operation to obtain a second view direction; and
    displaying a third environment interface, the third environment interface comprising a picture of observing the virtual environment through the scope accessory in the second view direction.
  6. The method according to claim 5, wherein when the virtual environment is not observed through the scope accessory, the speed of adjusting the view direction is a first adjustment speed; and
    adjusting the first view direction according to the drag operation comprises:
    adjusting the first view direction at a second adjustment speed according to the drag operation, the second adjustment speed being slower than the first adjustment speed.
  7. The method according to any one of claims 1 to 4, wherein receiving the touch operation on the scope control comprises:
    receiving a touch operation on a touch display screen;
    determining a first coordinate corresponding to the touch operation on the touch display screen;
    determining a second coordinate corresponding to the center point of the scope control on the touch display screen; and
    determining the touch operation as the touch operation acting on the scope control when the distance between the first coordinate and the second coordinate is less than a required distance.
  8. An apparatus for applying a scope in a virtual environment, the apparatus comprising:
    a display module, configured to display a first environment interface, the first environment interface comprising a person-perspective picture of observing the virtual environment from a person perspective of a virtual object, the virtual object holding a virtual prop equipped with a scope accessory, and the first environment interface further comprising a scope control;
    a receiving module, configured to receive a touch operation on the scope control and generate a touch signal;
    the display module being further configured to display a second environment interface according to the touch signal, the second environment interface comprising a scope picture of observing the virtual environment through the scope accessory; and
    the receiving module being further configured to restore display of the person-perspective picture when a touch end signal of the touch operation is received.
  9. The apparatus according to claim 8, wherein the receiving module is further configured to: determine a touch duration of the touch operation when the touch end signal of the touch operation is received; and when the touch duration reaches a required duration, close the scope accessory according to the touch end signal and restore display of the person-perspective picture.
  10. The apparatus according to claim 9, further comprising:
    a determination module, configured to determine, when the touch duration does not reach the required duration, a tap signal on the scope control as the signal controlling the scope accessory to close;
    the receiving module being further configured to close the scope accessory when the tap signal on the scope control is received; and
    the display module being further configured to restore display of the person-perspective picture.
  11. The apparatus according to claim 10, wherein the receiving module is further configured to receive a trigger operation on a function setting control;
    the display module is further configured to display a function setting interface according to the trigger operation, the function setting interface comprising a scope setting option, and the scope setting option comprising a hybrid control option; and
    the receiving module is further configured to receive an enabling operation on the hybrid control option, the hybrid control option being used for indicating that the manner of closing the scope accessory is determined by using the required duration as a critical duration.
  12. The apparatus according to any one of claims 8 to 11, wherein the scope picture is a picture of observing the virtual environment through the scope accessory in a first view direction;
    the receiving module is further configured to receive a drag operation determined according to the touch operation;
    the apparatus further comprises:
    an adjustment module, configured to adjust the first view direction according to the drag operation to obtain a second view direction; and
    the display module is further configured to display a third environment interface, the third environment interface comprising a picture of observing the virtual environment through the scope accessory in the second view direction.
  13. The apparatus according to claim 12, wherein when the virtual environment is not observed through the scope accessory, the speed of adjusting the view direction is a first adjustment speed; and
    the adjustment module is further configured to adjust the first view direction at a second adjustment speed according to the drag operation, the second adjustment speed being slower than the first adjustment speed.
  14. A computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for applying a scope in a virtual environment according to any one of claims 1 to 7.
  15. A storage medium, the storage medium being used to store a computer program, and the computer program being used to execute the method for applying a scope in a virtual environment according to any one of claims 1 to 7.
  16. A computer program product comprising instructions that, when run on a computer, cause the computer to execute the method for applying a scope in a virtual environment according to any one of claims 1 to 7.
PCT/CN2020/103000 2019-08-22 2020-07-20 Method for applying a scope in a virtual environment and related apparatus WO2021031765A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020217032917A KR20210135594A (ko) 2019-08-22 2020-07-20 Application method of telescope observation in a virtual environment and related apparatus
SG11202109672U SG11202109672UA (en) 2019-08-22 2020-07-20 Application method and related apparatus of sighting telescope in virtual environment
JP2021554634A JP2022524802A (ja) 2019-08-22 2020-07-20 Method and apparatus for applying a scope in a virtual environment, computer device, and program
US17/504,081 US12121798B2 (en) 2019-08-22 2021-10-18 Method and related apparatus for application of scope in virtual environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910780269.9A CN110448908B (zh) 2019-08-22 2019-08-22 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质
CN201910780269.9 2019-08-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/504,081 Continuation US12121798B2 (en) 2019-08-22 2021-10-18 Method and related apparatus for application of scope in virtual environment

Publications (1)

Publication Number Publication Date
WO2021031765A1 true WO2021031765A1 (zh) 2021-02-25

Family

ID=68488566

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/103000 WO2021031765A1 (zh) 2019-08-22 2020-07-20 虚拟环境中瞄准镜的应用方法和相关装置

Country Status (5)

Country Link
JP (1) JP2022524802A (zh)
KR (1) KR20210135594A (zh)
CN (1) CN110448908B (zh)
SG (1) SG11202109672UA (zh)
WO (1) WO2021031765A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024183469A1 (zh) * 2023-03-07 2024-09-12 腾讯科技(深圳)有限公司 Game picture display method, apparatus, device, and computer-readable storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110448908B (zh) * 2019-08-22 2020-12-01 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for applying a scope in a virtual environment
CN112330823B (zh) * 2020-11-05 2023-06-16 腾讯科技(深圳)有限公司 Virtual prop display method, apparatus, device, and readable storage medium
CN112791398B (zh) * 2021-02-03 2024-07-16 网易(杭州)网络有限公司 Method, apparatus, electronic device, and storage medium for controlling a virtual magnifying scope in a game
CN112774188B (zh) * 2021-02-03 2024-07-23 网易(杭州)网络有限公司 Method, apparatus, electronic device, and storage medium for adjusting a virtual magnifying scope in a game

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107678647A (zh) * 2017-09-26 2018-02-09 网易(杭州)网络有限公司 Virtual shooting subject control method, apparatus, electronic device, and storage medium
CN108339272A (zh) * 2018-02-12 2018-07-31 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium
CN110141869A (zh) * 2019-04-11 2019-08-20 腾讯科技(深圳)有限公司 Operation control method, apparatus, electronic device, and storage medium
CN110448908A (zh) * 2019-08-22 2019-11-15 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for applying a scope in a virtual environment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5238640B2 (ja) * 2009-08-18 2013-07-17 株式会社コナミデジタルエンタテインメント Game device, game control program, and game control method
CN103631408B (zh) * 2012-08-21 2019-01-04 腾讯科技(深圳)有限公司 Method and apparatus for processing touch instructions
KR101777630B1 (ko) * 2017-02-01 2017-09-12 에스명심 주식회사 Shooting mobile game system
CN108939540B (zh) * 2018-07-04 2021-08-24 网易(杭州)网络有限公司 Shooting game aiming assistance method, apparatus, storage medium, processor, and terminal
CN109847370A (zh) * 2019-03-26 2019-06-07 网易(杭州)网络有限公司 Control method, apparatus, device, and storage medium for a shooting game
CN110090442B (zh) * 2019-05-16 2023-03-24 网易(杭州)网络有限公司 Method and apparatus for opening a scope in a game, electronic device, and storage medium


Also Published As

Publication number Publication date
JP2022524802A (ja) 2022-05-10
CN110448908B (zh) 2020-12-01
KR20210135594A (ko) 2021-11-15
SG11202109672UA (en) 2021-10-28
CN110448908A (zh) 2019-11-15
US20220032176A1 (en) 2022-02-03

Similar Documents

Publication Publication Date Title
CN108434736B (zh) Equipment display method, apparatus, device, and storage medium in virtual environment battles
JP7471369B2 (ja) Accessory selection method in a virtual environment and apparatus, device, and computer program therefor
WO2020253832A1 (zh) Method, apparatus, and medium for controlling a virtual object to mark a virtual item
CN108815851B (zh) Interface display method, device, and storage medium when shooting in a virtual environment
CN110413171B (zh) Method, apparatus, device, and medium for controlling a virtual object to perform shortcut operations
CN109529319B (zh) Display method, device, and storage medium of interface controls
US11656755B2 (en) Method and apparatus for controlling virtual object to drop virtual item and medium
US12036474B2 (en) Virtual object control method and apparatus, device, and readable storage medium
CN110755841B (zh) Method, apparatus, device, and readable storage medium for switching props in a virtual environment
WO2021036577A1 (zh) Method for controlling a virtual object and related apparatus
WO2021031765A1 (zh) Method for applying a scope in a virtual environment and related apparatus
CN113398571B (zh) Virtual prop switching method, apparatus, terminal, and storage medium
CN108786110B (zh) Scope display method, device, and storage medium in a virtual environment
TWI802978B (zh) Method and apparatus for adjusting the position of controls in an application, device, and storage medium
CN111672106B (zh) Virtual scene display method, apparatus, computer device, and storage medium
CN112330823B (zh) Virtual prop display method, apparatus, device, and readable storage medium
WO2021143253A1 (zh) Method, apparatus, device, and readable medium for operating virtual props in a virtual environment
CN112451969A (zh) Virtual object control method, apparatus, computer device, and storage medium
CN112402969B (zh) Virtual object control method, apparatus, device, and storage medium in a virtual scene
CN112044066A (zh) Interface display method, apparatus, device, and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20854510

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021554634

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217032917

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20854510

Country of ref document: EP

Kind code of ref document: A1