WO2021031765A1 - Method for applying a scope in a virtual environment, and related apparatus - Google Patents
Method for applying a scope in a virtual environment, and related apparatus
- Publication number
- WO2021031765A1 (PCT/CN2020/103000)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scope
- control
- touch
- virtual
- environment
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5252—Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
Definitions
- the embodiments of the present application relate to the field of virtual environments, and particularly to the application of a scope in a virtual environment.
- the scope control can be used to switch between the personal-perspective observation mode and the scope-accessory observation mode; that is, when the virtual environment is observed in the personal-perspective mode, clicking the scope control switches the observation mode to the scope-accessory mode, and when the virtual environment is observed through the scope, clicking the scope control again switches the observation mode back to the personal-perspective mode.
- the embodiments of the present application provide a method for applying a scope in a virtual environment and a related apparatus, which can solve the problem of low efficiency and low accuracy when observing the virtual environment.
- the technical solution is as follows:
- in one aspect, a method for applying a scope in a virtual environment includes:
- displaying a first environment interface, the first environment interface including a personal-perspective picture for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further including a scope control;
- receiving a touch operation on the scope control and generating a touch signal; displaying a second environment interface according to the touch signal, the second environment interface including a scope picture for observing the virtual environment through the scope accessory; and restoring display of the personal-perspective picture when a touch end signal of the touch operation is received.
- in another aspect, an apparatus for applying a scope in a virtual environment includes:
- a display module, configured to display a first environment interface, the first environment interface including a personal-perspective picture for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further including a scope control;
- a receiving module, configured to receive a touch operation on the scope control and generate a touch signal;
- the display module being further configured to display a second environment interface according to the touch signal, the second environment interface including a scope picture for observing the virtual environment through the scope accessory;
- the receiving module being further configured to restore display of the personal-perspective picture when the touch end signal of the touch operation is received.
- in another aspect, a computer device includes a processor and a memory.
- the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for applying a scope in a virtual environment provided in the above embodiments of the present application.
- in another aspect, a storage medium is provided; the storage medium is used to store a computer program, and the computer program is used to execute the method for applying a scope in a virtual environment provided in the above embodiments of the present application.
- in another aspect, a computer program product is provided; when the computer program product runs on a computer, the computer executes the method for applying a scope in a virtual environment provided in the above embodiments of the present application.
- FIG. 1 is a schematic diagram of an interface provided in the related art for applying a scope accessory in a virtual environment;
- FIG. 2 is a schematic diagram of an interface for applying a scope accessory in a virtual environment provided by an exemplary embodiment of the present application;
- FIG. 3 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application;
- FIG. 4 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
- FIG. 5 is a flowchart of a method for applying a scope in a virtual environment provided by an exemplary embodiment of the present application;
- FIG. 6 is a schematic diagram showing a display of a scope accessory provided based on the embodiment shown in FIG. 5;
- FIG. 7 is a schematic diagram of determining the correspondence between a touch operation and the scope control provided based on the embodiment shown in FIG. 5;
- FIG. 8 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application;
- FIG. 9 is a schematic diagram of an interface corresponding to the hybrid control mode provided based on the embodiment shown in FIG. 8;
- FIG. 10 is a schematic diagram of a time axis corresponding to the hybrid control mode provided based on the embodiment shown in FIG. 8;
- FIG. 11 is a flowchart corresponding to a click control mode provided by an exemplary embodiment of the present application;
- FIG. 12 is a flowchart corresponding to a long-press control mode provided by an exemplary embodiment of the present application;
- FIG. 13 is a flowchart corresponding to a hybrid control mode provided by an exemplary embodiment of the present application;
- FIG. 14 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application;
- FIG. 15 is a structural block diagram of an apparatus for applying a scope in a virtual environment provided by an exemplary embodiment of the present application;
- FIG. 16 is a structural block diagram of an apparatus for applying a scope in a virtual environment provided by another exemplary embodiment of the present application;
- FIG. 17 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
- Virtual environment: the virtual environment displayed (or provided) when an application program runs on a terminal.
- the virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment.
- the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
- the following embodiments take the virtual environment as a three-dimensional virtual environment as an example, but are not limited thereto.
- the virtual environment is also used for a virtual environment battle between at least two virtual characters.
- the virtual environment is also used for battles between at least two virtual characters using virtual firearms.
- the virtual environment is also used to use virtual firearms for battle between at least two virtual characters within the target area, and the target area will continue to decrease with the passage of time in the virtual environment.
- Virtual object: a movable object in the virtual environment.
- the movable object may be a virtual character, a virtual animal, a cartoon character, or the like, such as a character, an animal, a plant, an oil barrel, a wall, or a stone displayed in a three-dimensional virtual environment.
- the virtual object is a three-dimensional model created based on skeletal animation technology.
- Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
- Virtual firearm: a virtual weapon that attacks by firing bullets in the virtual environment.
- a virtual object can pick up a virtual firearm in the virtual environment and attack with the picked-up virtual firearm.
- each virtual firearm can be provided with at least one slot, and each slot is used to assemble at least one firearm accessory; for example, an M416 automatic rifle is usually provided with a muzzle slot, a grip slot, a magazine slot, a buttstock slot, and a scope slot, where the scope slot can be equipped with a red dot scope, a holographic scope, a 2x scope (abbreviated: 2x lens), a 4x scope (abbreviated: 4x lens), and the like; when the player holds the M416 automatic rifle and opens the scope, the virtual environment can be observed with a degree of magnification that depends on the scope assembled on the firearm.
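- to make the slot-and-accessory structure above concrete, the following Kotlin sketch models a scope slot and the magnifications named in these embodiments; it is a minimal illustration only, and all identifiers (ScopeType, Firearm) are assumptions rather than names from the application:

```kotlin
// Illustrative sketch: a hypothetical data model for the firearm accessory
// slots described above. All names are assumed for illustration.
enum class ScopeType(val magnification: Float) {
    MECHANICAL(1.0f),   // provided on the virtual firearm itself
    RED_DOT(1.0f),      // basic scope, 1x magnification effect
    HOLOGRAPHIC(1.0f),  // basic scope, 1x magnification effect
    SCOPE_2X(2.0f),     // low-magnification scope
    SCOPE_3X(3.0f),     // low-magnification scope
    SCOPE_4X(4.0f),     // high-magnification scope
    SCOPE_8X(8.0f)      // high-magnification scope
}

data class Firearm(
    val name: String,
    // the scope slot holds at most one scope accessory; when it is empty,
    // the mechanical scope is used to observe the virtual environment
    var scopeSlot: ScopeType? = null
) {
    fun effectiveScope(): ScopeType = scopeSlot ?: ScopeType.MECHANICAL
}
```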
- in the embodiments of the present application, the virtual prop being implemented as a virtual firearm is taken as an example for description.
- the virtual prop is equipped with a scope accessory, that is, the scope slot of the virtual firearm is equipped with a scope accessory.
- the scope accessory may be a mechanical scope provided on the virtual firearm itself, or a scope picked up by the virtual object in the virtual environment.
- the method provided in this application can be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooting (FPS) games, multiplayer online battle arena (MOBA) games, and the like; the following embodiments take the application in games as an example.
- Games based on virtual environments are often composed of one or more maps of the game world.
- the virtual environment in the game simulates scenes of the real world; a user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch between virtual weapons, use a virtual weapon to attack other virtual objects, and perform other actions in the virtual environment; the interactivity is strong, and multiple users can team up online for competitive games.
- when the user controls the virtual object to use a virtual weapon to attack a target virtual object, the user selects a suitable virtual weapon according to the location of the target virtual object or according to operating habits.
- the scope control is usually provided in the environment interface.
- as shown in FIG. 1, a personal-perspective picture 100 for observing the virtual environment from the first-person perspective of the virtual object is displayed in the environment interface.
- the virtual object holds a virtual prop (such as a virtual firearm), the virtual prop is equipped with a scope accessory, and a scope control 110 is superimposed on the personal-perspective picture 100.
- when the user clicks the scope control 110, the scope picture 120 is displayed.
- the scope picture 120 is a picture for observing the virtual environment through the scope accessory.
- the scope control 110 is superimposed on the scope picture 120.
- after the scope accessory is turned on, the scope control 110 is displayed in the cross shape shown in FIG. 1; when the user clicks the scope control 110 again, the scope accessory is closed, and the first-person perspective is restored to display the personal-perspective picture 100.
- FIG. 2 shows a schematic diagram of a user interface of the method for applying a scope in a virtual environment provided by an embodiment of the present application, as shown in FIG. 2:
- a personal-perspective picture 200 in which a virtual object observes the virtual environment from the first-person perspective is displayed.
- the virtual object holds a virtual prop (such as a virtual firearm), and the virtual prop is equipped with a scope accessory.
- a scope control 210 is superimposed on the personal-perspective picture 200.
- when the user touches the scope control 210, the scope picture 220 is displayed.
- the scope picture 220 is a picture for observing the virtual environment through the scope accessory.
- while the user keeps touching the scope control 210, the terminal continues to display the scope picture 220; when the user ends the touch on the scope control 210, the terminal closes the scope accessory and resumes displaying the personal-perspective picture 200.
- the content of the personal-perspective picture 200 may change, for example due to movement of the virtual object or displacement of objects in the virtual scene, or it may remain unchanged.
- the terminal in this application can be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, or the like.
- An application program supporting a virtual environment is installed and running in the terminal, for example, an application program supporting a three-dimensional virtual environment.
- the application program can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, and MOBA games.
- the application program may be a stand-alone version application program, such as a stand-alone version of a 3D game program, or a network online version application program.
- Fig. 3 shows a structural block diagram of an electronic device provided by an exemplary embodiment of the present application.
- the electronic device 300 includes an operating system 320 and an application program 322.
- the operating system 320 is basic software that provides the application 322 with secure access to computer hardware.
- the application program 322 is an application program supporting a virtual environment.
- the application program 322 is an application program supporting a three-dimensional virtual environment.
- the application program 322 may be any of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooting (TPS) game, an FPS game, a MOBA game, and a multiplayer gun-battle survival game.
- the application program 322 may be a stand-alone application program, such as a stand-alone 3D game program.
- Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
- the computer system 400 includes: a first device 420, a server 440, and a second device 460.
- the first device 420 installs and runs an application program supporting the virtual environment.
- the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, MOBA games, and multiplayer gun battle survival games.
- the first device 420 is a device used by the first user.
- the first user uses the first device 420 to control a first virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
- the first virtual object is a first virtual character, such as a simulated character or an animation character.
- the first device 420 is connected to the server 440 through a wireless network or a wired network.
- the server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
- the server 440 is used to provide background services for applications supporting the three-dimensional virtual environment.
- the server 440 is responsible for the main calculation work, and the first device 420 and the second device 460 are responsible for the secondary calculation work; or, the server 440 is responsible for the secondary calculation work, and the first device 420 and the second device 460 are responsible for the main calculation work;
- the server 440, the first device 420, and the second device 460 adopt a distributed computing architecture to perform collaborative computing.
- the second device 460 installs and runs an application program supporting the virtual environment.
- the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, and multiplayer gun battle survival games.
- the second device 460 is a device used by the second user.
- the second user uses the second device 460 to control a second virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
- the second virtual object is a second virtual character, such as a simulated character or an animation character.
- the first virtual character and the second virtual character are in the same virtual environment.
- the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permissions.
- the first virtual character and the second virtual character may also belong to different teams, different organizations, or two mutually hostile groups.
- the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of applications on different control system platforms.
- the first device 420 may generally refer to one of multiple devices
- the second device 460 may generally refer to one of multiple devices
- the first device 420 and the second device 460 may be the aforementioned terminals.
- This embodiment only uses the first device 420 and the second device 460 as examples.
- the device types of the first device 420 and the second device 460 are the same or different.
- the device types include at least one of: a game console, a desktop computer, a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop computer.
- the following embodiments take the device being a desktop computer as an example.
- the number of the above devices may be larger or smaller; for example, there may be only one device, or there may be dozens or hundreds of devices, or more.
- the embodiment of the present application does not limit the quantity and type of equipment.
- as shown in FIG. 5, taking the application of the method in a terminal as an example, the method includes:
- Step 501 Display a first environment interface.
- the first environment interface includes a personal-perspective picture for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, and the virtual prop being equipped with a scope accessory.
- the first environment interface also includes a scope control.
- the personal-perspective picture may be a picture displayed from the first-person perspective of the virtual object, or a picture displayed from the third-person perspective of the virtual object.
- the first-person perspective is the perspective corresponding to the picture that the virtual object can observe in the virtual environment; the picture corresponding to the first-person perspective does not include the virtual object itself, for example, only the arms of the virtual object and the virtual firearm can be seen;
- the third-person perspective refers to the perspective of observing the virtual object through a camera model in the virtual environment.
- the picture corresponding to the third-person perspective includes the virtual object itself, and usually the camera model is located behind the virtual object to observe it; for example, the three-dimensional model of the virtual object and the virtual prop (such as a virtual firearm) held by the virtual object can be seen.
- FIG. 6 is a schematic diagram of an interface in which the personal-perspective picture 61 is displayed from the first-person perspective of the virtual object.
- the personal-perspective picture 61 includes an item status bar 62, and the item status bar 62 shows "red dot" for the scope accessory on the virtual prop, that is, a red dot scope is assembled.
- when the personal-perspective picture is displayed from the third-person perspective of the virtual object, the personal-perspective picture further includes the virtual prop, and the assembled scope accessory is displayed on the virtual prop.
- Step 502 Receive a touch operation on the scope control, and generate a touch signal.
- the method is applied to a terminal with a touch display screen; a touch operation on the scope control on the touch display screen is received, and a touch signal is generated; the touch signal is generated when the touch operation starts and is used to indicate the start of the touch operation.
- when a touch operation on the touch display screen is received, the first coordinate corresponding to the touch operation on the touch display screen is determined, and the second coordinate corresponding to the center point of the scope control on the touch display screen is determined; when the distance between the first coordinate and the second coordinate is less than a required distance, it is determined that the touch operation is a touch operation acting on the scope control.
- the touch display screen corresponds to a target coordinate system, in which the first coordinate corresponding to the touch operation and the second coordinate corresponding to the center point of the scope control are determined.
- taking point A as the touch operation point and point B as the center point of the scope control as an example, where the coordinates of point A are (x1, y1) and the coordinates of point B are (x2, y2), the distance between point A and point B is calculated by the following Formula 1:
- Formula 1: distance = √((x2 − x1)² + (y2 − y1)²)
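- as a concrete illustration of this hit test, the following minimal Kotlin sketch evaluates Formula 1 and compares the result against the required distance; the function and parameter names are assumptions for illustration, not identifiers from the application:

```kotlin
import kotlin.math.hypot

// Hypothetical hit test for the scope control: returns true when the touch
// point A (x1, y1) lies within requiredDistance of the control's center
// point B (x2, y2), per Formula 1 above.
fun isTouchOnScopeControl(
    x1: Float, y1: Float,       // first coordinate: the touch operation point
    x2: Float, y2: Float,       // second coordinate: the control center point
    requiredDistance: Float     // threshold, e.g. the control's radius
): Boolean {
    val distance = hypot(x2 - x1, y2 - y1)  // sqrt((x2-x1)^2 + (y2-y1)^2)
    return distance < requiredDistance
}
```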
- Step 503 Display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory.
- the scope accessory assembled on the virtual prop can be at least one of a mechanical scope, a basic scope, a low-magnification scope, and a high-magnification scope.
- the mechanical scope refers to the scope provided on the virtual prop itself; that is, when no other scope is assembled in the scope slot of the virtual prop, the mechanical scope can be used to observe the virtual environment;
- the basic scope refers to a scope accessory for observing the virtual environment without magnification, or in other words, with a 1x magnification effect.
- the basic scope includes at least one of a red dot scope and a holographic scope;
- a low-magnification scope refers to a scope accessory with a lower magnification for observing the virtual environment, such as a 2x lens or a 3x lens;
- a high-magnification scope refers to a scope accessory with a higher magnification for observing the virtual environment, such as a 4x lens, a 6x lens, an 8x lens, or a 15x lens.
- Step 504 When the touch end signal of the touch operation is received, resume displaying the personal-perspective picture.
- the user touching the scope control on the touch display screen is regarded as a touch operation on the scope control.
- when the user releases the touch on the scope control, the terminal generates the touch end signal.
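- the press-and-hold behavior of steps 502 to 504 can be sketched with a standard Android touch listener; this is a minimal sketch under assumed helper names (openScope and closeScope stand in for whatever switches between the personal-perspective picture and the scope picture), not the application's actual implementation:

```kotlin
import android.view.MotionEvent
import android.view.View

fun openScope() { /* assumed: display the scope picture (step 503) */ }
fun closeScope() { /* assumed: resume the personal-perspective picture (step 504) */ }

// Open the scope while the scope control is touched; close it when the
// touch ends, mirroring the touch signal and the touch end signal above.
val scopeTouchListener = View.OnTouchListener { view, event ->
    when (event.actionMasked) {
        MotionEvent.ACTION_DOWN -> {        // touch signal
            openScope()
            true
        }
        MotionEvent.ACTION_UP,
        MotionEvent.ACTION_CANCEL -> {      // touch end signal
            closeScope()
            view.performClick()
            true
        }
        else -> false
    }
}
```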
- in the method for applying a scope in a virtual environment provided by the embodiments of the present application, a scope control is provided, and the switch between the personal-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory is made according to whether the touch operation on the scope control continues; only a single touch on the scope control is required to switch between the personal-perspective picture and the scope picture, which improves the opening efficiency of the scope accessory and thereby improves the efficiency and accuracy of observing the virtual environment.
- FIG. 8 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application, illustrated by taking the application of the method in a terminal as an example; as shown in FIG. 8, the method includes:
- Step 801 Display a first environment interface.
- the first environment interface includes a personal-perspective picture for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, and the virtual prop being equipped with a scope accessory.
- the first environment interface also includes a scope control.
- the personal-perspective picture is a picture displayed from the first-person perspective of the virtual object, or a picture displayed from the third-person perspective of the virtual object.
- Step 802 Receive a touch operation on the scope control, and generate a touch signal.
- the method is applied to a terminal with a touch display screen, and the touch operation on the scope control on the touch display screen is received.
- the scope control mode can also be set in the function setting interface, where the scope control mode includes any one of a click control mode, a long-press control mode, and a hybrid control mode.
- the click control mode refers to opening the scope accessory by clicking the scope control, and closing the scope accessory by clicking the scope control again;
- the long-press control mode refers to turning on the scope accessory by long-pressing the scope control, and turning off the scope accessory when the long-press operation ends;
- the hybrid control mode refers to a control mode that selects between the click control mode and the long-press control mode with a required duration as the critical duration.
- the first environment interface also includes a function setting control.
- the function setting control is used to open the function setting interface; a trigger operation on the function setting control is received, and the function setting interface is displayed according to the trigger operation.
- the function setting interface includes scope setting options.
- the scope setting options include a click control option, a long-press control option, and a hybrid control option, where the click control option corresponds to the above click control mode, and the long-press control option corresponds to the above long-press control mode.
- the hybrid control option corresponds to the above hybrid control mode; optionally, an opening operation on the hybrid control option is received, and the hybrid control option is used to indicate that the closing manner of the scope accessory is determined with the required duration as the critical duration.
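- a minimal sketch of how such a setting might be represented, assuming hypothetical names (ScopeControlMode, ScopeSettings) and using the 1-second required duration given later as an example:

```kotlin
// Hypothetical representation of the scope setting options in the function
// setting interface; names and the default values are illustrative only.
enum class ScopeControlMode { CLICK, LONG_PRESS, HYBRID }

data class ScopeSettings(
    var controlMode: ScopeControlMode = ScopeControlMode.HYBRID,
    var requiredDurationMs: Long = 1_000  // critical duration for the hybrid mode
)
```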
- Step 803 Display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory.
- the scope accessory assembled on the virtual prop may be at least one of a mechanical scope, a basic scope, a low-magnification scope, and a high-magnification scope.
- Step 804 When the touch end signal of the touch operation is received, determine the touch duration of the touch operation.
- Step 805 When the touch duration reaches the required duration, turn off the scope accessory according to the touch end signal, and resume displaying the personal-perspective picture.
- the long-press control mode is determined as the control mode of the scope accessory; that is, when the touch operation on the scope control ends, the scope accessory is closed.
- the required duration is a preset duration.
- for example, the required duration is 1 second.
- the touch end signal is determined to be the signal for controlling the scope accessory to close.
- Step 806 When the touch duration does not reach the required duration, the click signal on the scope control is determined as the signal for controlling the closing of the scope accessory.
- that is, the touch operation may be regarded as a click operation.
- Step 807 When the click signal on the scope control is received, close the scope accessory and resume displaying the personal-perspective picture.
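- steps 804 to 807 can be summarized in a short sketch; the class and callback names are assumptions, and the logic simply applies the required duration as the critical duration between the long-press behavior and the click behavior:

```kotlin
import android.os.SystemClock

// Hypothetical hybrid-mode controller. The scope opens on touch start; on
// touch end, the touch duration decides whether the touch behaves as a
// long press (close immediately, step 805) or as a click (stay open and
// close on the next click, steps 806-807).
class HybridScopeController(
    private val openScope: () -> Unit,    // assumed game-side callback
    private val closeScope: () -> Unit,   // assumed game-side callback
    private val requiredDurationMs: Long = 1_000
) {
    private var touchStartMs = 0L
    private var scopeOpen = false

    fun onTouchStart() {
        touchStartMs = SystemClock.elapsedRealtime()
        if (scopeOpen) {
            scopeOpen = false
            closeScope()   // a further click while open closes the scope (step 807)
        } else {
            scopeOpen = true
            openScope()
        }
    }

    fun onTouchEnd() {
        if (!scopeOpen) return  // this touch was the closing click
        val touchDuration = SystemClock.elapsedRealtime() - touchStartMs
        if (touchDuration >= requiredDurationMs) {
            scopeOpen = false
            closeScope()   // long-press behavior: the touch end signal closes (step 805)
        }
        // otherwise click behavior: the scope stays open until the next click (step 806)
    }
}
```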
- as shown in FIG. 9, a personal-perspective picture 900 for observing the virtual environment from the first-person perspective of the virtual object is displayed, and a scope control 910 is superimposed on the personal-perspective picture 900.
- in the first branch, the user performs a touch operation on the scope control 910, the scope picture 920 is displayed according to the touch operation, and the touch duration of the touch operation reaches the required duration, so that when the touch operation ends, display of the personal-perspective picture 900 is resumed; in the second branch, the user performs a touch operation on the scope control 910, the scope picture 920 is displayed according to the touch operation, and the touch operation ends before the touch duration reaches the required duration.
- when a click operation on the scope control 910 is received again, display of the personal-perspective picture 900 is resumed.
- the content of the personal-perspective picture 900 may be changed or unchanged.
- in FIG. 10, time point S is the moment when the user touches the scope control;
- time point A is the moment when the touch ends in the first case;
- time point B is the moment when the touch ends in the second case;
- time point O is the critical moment corresponding to the required duration.
- in the method for applying a scope in a virtual environment provided by this embodiment, a scope control is provided, and the touch signal and the touch end signal are determined according to whether the touch operation on the scope control continues, so that for the personal-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory, only a single touch on the scope control is required to switch between the two pictures, which improves the opening efficiency of the scope accessory and thereby improves the efficiency and accuracy of observing the virtual environment.
- the required duration is used as the critical duration of the click control mode and the long-press control mode.
- the user can determine the control mode of the scope accessory through a single touch operation, which improves the control efficiency.
- Step 1101 The scope accessory is in a closed state.
- the environment interface displayed by the terminal includes a picture for observing the virtual environment from the first-person perspective or the third-person perspective of the virtual object.
- Step 1102 Determine whether the user touches the scope control.
- Step 1103 When the user touches the scope control, turn on the scope accessory.
- Step 1104 The user ends the touch on the scope control.
- Step 1105 Keep the scope in the opened state.
- Step 1106 Determine whether the user clicks the scope control again.
- Step 1107 When the user clicks the scope control again, close the scope accessory.
- Step 1201 The scope accessory is in a closed state.
- the environment interface displayed by the terminal includes a picture for observing the virtual environment from the first-person perspective or the third-person perspective of the virtual object.
- Step 1202 Determine whether the user touches the scope control.
- Step 1203 When the user touches the scope control, turn on the scope accessory.
- Step 1204 Determine whether the user ends the touch operation on the scope control.
- Step 1205 When the user ends the touch operation on the scope control, close the scope accessory.
- Step 1206 When the user does not end the touch operation on the scope control, keep the scope accessory turned on.
- Step 1301 Select the hybrid control mode.
- Step 1302 Determine whether the user touches the scope control.
- Step 1304 Record the touch duration of the touch operation.
- Step 1305 Determine whether the user has ended the touch on the scope control.
- Step 1306 When the user ends the touch on the scope control, determine whether the long-press control mode is triggered.
- Step 1307 When the long-press control mode is triggered, close the scope accessory.
- Step 1309 Determine whether a click operation on the scope control is received again; optionally, when a click operation on the scope control is received again, close the scope accessory.
- FIG. 14 is a flowchart of a method for applying a scope in a virtual environment provided by another exemplary embodiment of the present application, described by taking the application of the method in a terminal as an example; as shown in FIG. 14, the method includes:
- Step 1401 Display a first environment interface.
- the first environment interface includes a personal-perspective picture for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, and the virtual prop being equipped with a scope accessory.
- the first environment interface also includes a scope control.
- the personal-perspective picture is a picture displayed from the first-person perspective of the virtual object, or a picture displayed from the third-person perspective of the virtual object.
- Step 1402 Receive a touch operation on the scope control, and generate a touch signal.
- the method is applied to a terminal with a touch display screen, and the touch operation on the scope control on the touch display screen is received.
- the scope control mode can also be set in the function setting interface, where the scope control mode includes any one of a click control mode, a long-press control mode, and a hybrid control mode.
- the click control mode refers to opening the scope accessory by clicking the scope control, and closing the scope accessory by clicking the scope control again;
- the long-press control mode refers to turning on the scope accessory by long-pressing the scope control, and turning off the scope accessory when the long-press operation ends;
- the hybrid control mode refers to a control mode that selects between the click control mode and the long-press control mode with the required duration as the critical duration.
- the first environment interface also includes a function setting control.
- the function setting control is used to open the function setting interface; a trigger operation on the function setting control is received, and the function setting interface is displayed according to the trigger operation.
- the function setting interface includes scope setting options.
- the scope setting options include a click control option, a long-press control option, and a hybrid control option, where the click control option corresponds to the above click control mode, and the long-press control option corresponds to the above long-press control mode.
- the hybrid control option corresponds to the hybrid control mode described above.
- an opening operation on the hybrid control option is received, and the hybrid control option is used to indicate that the closing manner of the scope accessory is determined with the required duration as the critical duration.
- Step 1403 Display a second environment interface according to the touch signal, and the second environment interface includes a scope picture for observing the virtual environment through the scope accessory.
- the scope accessory assembled on the virtual prop may be at least one of a mechanical scope, a basic scope, a low-magnification scope, and a high-magnification scope.
- the scope picture is a picture obtained by observing the virtual environment in a first viewing angle direction through the scope accessory.
- Step 1404 Receive a drag operation determined according to the touch operation.
- a drag operation connected with the touch operation is received while the touch operation has not ended.
- Step 1405 Adjust the first viewing angle direction according to the drag operation to obtain the second viewing angle direction.
- the first viewing angle direction is the viewing angle direction in which the camera model observes the virtual environment in the form of the scope accessory;
- the camera model is adjusted in the virtual environment according to the drag direction and the drag distance of the drag operation.
- the second viewing angle direction is likewise a viewing angle direction in which the camera model observes the virtual environment in the form of the scope accessory.
- the speed at which the viewing angle direction is adjusted is a first adjustment speed, and the first viewing angle direction is adjusted at a second adjustment speed according to the above drag operation to obtain the second viewing angle direction, where the second adjustment speed is slower than the first adjustment speed.
- a third environment interface is displayed, the third environment interface including a picture for observing the virtual environment in the second viewing angle direction through the scope accessory.
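- a minimal sketch of the drag-to-adjust step 1405 above, assuming hypothetical yaw/pitch fields on the camera model and illustrative sensitivity values; the only constraint taken from the text is that the second adjustment speed is slower than the first:

```kotlin
// Hypothetical camera-model state: the viewing angle direction expressed
// as yaw/pitch in degrees. All names and values are illustrative.
class CameraModel(var yawDeg: Float = 0f, var pitchDeg: Float = 0f)

const val FIRST_ADJUSTMENT_SPEED = 0.20f   // deg per pixel, without the scope
const val SECOND_ADJUSTMENT_SPEED = 0.05f  // deg per pixel, through the scope (slower)

// Adjust the first viewing angle direction according to the drag direction
// and drag distance (dx, dy in screen pixels) to obtain the second viewing
// angle direction (step 1405).
fun adjustViewAngle(camera: CameraModel, dx: Float, dy: Float, scoped: Boolean) {
    val speed = if (scoped) SECOND_ADJUSTMENT_SPEED else FIRST_ADJUSTMENT_SPEED
    camera.yawDeg += dx * speed
    camera.pitchDeg = (camera.pitchDeg - dy * speed).coerceIn(-89f, 89f)
}
```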
- Step 1407 When the touch end signal of the touch operation is received, turn off the scope accessory and resume displaying the personal-perspective picture.
- the user touching the scope control on the touch display screen is regarded as a touch operation on the scope control.
- when the user releases the touch on the scope control, the terminal generates the touch end signal.
- in the method for applying a scope in a virtual environment provided by the embodiments of the present application, a scope control is provided, and the switch between the personal-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory is made according to whether the touch operation on the scope control continues; only a single touch on the scope control is required to switch between the personal-perspective picture and the scope picture, which improves the opening efficiency of the scope accessory and thereby improves the efficiency and accuracy of observing the virtual environment.
- in addition, the viewing angle direction for observing the virtual environment is adjusted by receiving the drag operation connected with the touch operation; that is, within a single touch event, the operations of opening the scope accessory, adjusting the viewing angle, and closing the scope accessory are all realized, which improves the application efficiency of the scope accessory.
- FIG. 15 is a structural block diagram of an apparatus for applying a scope in a virtual environment provided by an exemplary embodiment of the present application; the apparatus being configured in a terminal is taken as an example for description.
- the apparatus includes: a display module 1510 and a receiving module 1520;
- the display module 1510 is configured to display a first environment interface.
- the first environment interface includes a personal-perspective picture for observing the virtual environment from the personal perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further including a scope control;
- the receiving module 1520 is configured to receive a touch operation on the scope control and generate a touch signal;
- the display module 1510 is further configured to display a second environment interface according to the touch signal, the second environment interface including a scope picture for observing the virtual environment through the scope accessory;
- the receiving module 1520 is further configured to restore display of the personal-perspective picture when the touch end signal of the touch operation is received.
- the receiving module 1520 is further configured to: determine the touch duration of the touch operation when the touch end signal of the touch operation is received; and when the touch duration reaches the required duration, turn off the scope accessory according to the touch end signal and restore display of the personal-perspective picture.
- the device further includes:
- the determining module 1530 is configured to determine the click signal on the scope control as the signal for controlling the closing of the scope accessory when the touch duration does not reach the required duration;
- the receiving module 1520 is further configured to turn off the scope accessory when the click signal on the scope control is received;
- the display module 1510 is further configured to restore display of the personal-perspective picture.
- the receiving module 1520 is further configured to receive a trigger operation on the function setting control
- the display module 1510 is further configured to display a function setting interface according to the trigger operation, the function setting interface includes scope setting options, and the scope setting options include hybrid control options;
- the receiving module 1520 is further configured to receive an opening operation on the hybrid control option, the hybrid control option being used to indicate that the closing manner of the scope accessory is determined with the required duration as the critical duration.
- the scope picture is a picture obtained by observing the virtual environment in a first angle of view through the scope accessory;
- the receiving module 1520 is further configured to receive a drag operation determined according to the touch operation
- the device also includes:
- the adjustment module 1540 is configured to adjust the first viewing angle direction according to the drag operation to obtain a second viewing angle direction;
- the display module 1510 is also configured to display a third environment interface, the third environment interface including a picture for observing the virtual environment in the second viewing angle direction through the scope accessory.
- the speed at which the viewing angle direction is adjusted is the first adjustment speed
- the adjustment module 1540 is further configured to adjust the first viewing angle direction at a second adjustment speed according to the drag operation, wherein the second adjustment speed is slower than the first adjustment speed.
- the receiving module 1520 is further configured to receive the touch operation on the touch display screen;
- the device also includes:
- the determining module 1530, configured to: determine the first coordinate corresponding to the touch operation on the touch display screen; determine the second coordinate corresponding to the center point of the scope control on the touch display screen; and when the distance between the first coordinate and the second coordinate is less than the required distance, determine that the touch operation is a touch operation acting on the scope control.
- the apparatus for applying a scope in a virtual environment provided by this embodiment provides a scope control and determines the touch signal and the touch end signal according to whether the touch operation on the scope control continues, so that for the personal-perspective picture observed without the scope accessory and the scope picture observed through the scope accessory, only a single touch on the scope control is required to switch between the personal-perspective picture and the scope picture; this improves the opening efficiency of the scope accessory and thereby improves the efficiency and accuracy of observing the virtual environment.
- the apparatus for applying a scope in a virtual environment provided in the above embodiment is illustrated only by the division of the above functional modules; in practical applications, the above functions can be allocated to different functional modules as needed, that is, the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above.
- the apparatus for applying a scope in a virtual environment provided in the above embodiment belongs to the same concept as the method for applying a scope in a virtual environment; the specific implementation process is detailed in the method embodiments and is not repeated here.
- FIG. 17 shows a structural block diagram of a terminal 1700 provided by an exemplary embodiment of the present application.
- the terminal 1700 can be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
- the terminal 1700 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
- the terminal 1700 includes a processor 1701 and a memory 1702.
- the processor 1701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
- the processor 1701 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
- the processor 1701 may also include a main processor and a coprocessor.
- the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
- the processor 1701 may be integrated with a GPU (Graphics Processing Unit), and the GPU is used to render and draw the content that needs to be displayed on the display screen.
- the processor 1701 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
- the memory 1702 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 1702 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 1702 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1701 to implement the method for applying a scope in a virtual environment provided by the method embodiments of the present application.
- the terminal 1700 may optionally further include: a peripheral device interface 1703 and at least one peripheral device.
- the processor 1701, the memory 1702, and the peripheral device interface 1703 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 1703 through a bus, a signal line, or a circuit board.
- the peripheral device includes: at least one of a radio frequency circuit 1704, a touch display screen 1705, a camera 1706, an audio circuit 1707, a positioning component 1708, and a power supply 1709.
- the peripheral device interface 1703 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1701 and the memory 1702.
- in some embodiments, the processor 1701, the memory 1702, and the peripheral device interface 1703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1701, the memory 1702, and the peripheral device interface 1703 can be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the radio frequency circuit 1704 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals.
- the radio frequency circuit 1704 communicates with a communication network and other communication devices through electromagnetic signals.
- the radio frequency circuit 1704 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
- the radio frequency circuit 1704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
- the radio frequency circuit 1704 can communicate with other terminals through at least one wireless communication protocol.
- the wireless communication protocol includes but is not limited to: the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
- the radio frequency circuit 1704 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
- the display screen 1705 is used to display UI (User Interface).
- the UI can include graphics, text, icons, videos, and any combination thereof.
- the display screen 1705 also has the ability to collect touch signals on or above the surface of the display screen 1705.
- the touch signal may be input to the processor 1701 as a control signal for processing.
- the display screen 1705 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
- in some embodiments, there may be one display screen 1705, which is provided on the front panel of the terminal 1700; in other embodiments, there may be at least two display screens 1705, which are respectively arranged on different surfaces of the terminal 1700 or adopt a folded design; in still other embodiments, the display screen 1705 may be a flexible display screen, which is arranged on a curved surface or a folding surface of the terminal 1700; furthermore, the display screen 1705 can also be set in a non-rectangular irregular pattern, that is, a special-shaped screen.
- the display screen 1705 may be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
- the camera assembly 1706 is used to capture images or videos.
- the camera assembly 1706 includes a front camera and a rear camera.
- the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
- the camera assembly 1706 may also include a flash.
- the flash can be a single-color-temperature flash or a dual-color-temperature flash; a dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, which can be used for light compensation under different color temperatures.
- the audio circuit 1707 may include a microphone and a speaker.
- the microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1701 for processing, or input them to the radio frequency circuit 1704 to implement voice communication; for the purpose of stereo collection or noise reduction, there may be multiple microphones, which are respectively arranged in different parts of the terminal 1700.
- the microphone can also be an array microphone or an omnidirectional acquisition microphone.
- the speaker is used to convert the electrical signal from the processor 1701 or the radio frequency circuit 1704 into sound waves.
- the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
- when the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for purposes such as distance measurement.
- the audio circuit 1707 may also include a headphone jack.
- the positioning component 1708 is used to locate the current geographic position of the terminal 1700 to implement navigation or LBS (Location Based Service).
- the positioning component 1708 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
- the power supply 1709 is used to supply power to various components in the terminal 1700.
- the power supply 1709 may be an alternating current power supply, a direct current power supply, a disposable battery, or a rechargeable battery.
- the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
- a wired rechargeable battery is a battery charged through a wired line, and a wireless rechargeable battery is a battery charged through a wireless coil.
- the rechargeable battery can also be used to support fast charging technology.
- the terminal 1700 further includes one or more sensors 1710.
- the one or more sensors 1710 include, but are not limited to: an acceleration sensor 1711, a gyroscope sensor 1712, a pressure sensor 1713, a fingerprint sensor 1714, an optical sensor 1715, and a proximity sensor 1716.
- the acceleration sensor 1711 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1700.
- the acceleration sensor 1711 can be used to detect the components of the gravitational acceleration on three coordinate axes.
- the processor 1701 may control the touch screen 1705 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1711.
- the acceleration sensor 1711 can also be used for game or user motion data collection.
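As an informal illustration of the two preceding paragraphs, the following is a minimal Python sketch (not part of this application; the function name, axis convention, and thresholding rule are assumptions) of how gravity components collected by an acceleration sensor could drive the horizontal/vertical view choice:

```python
def choose_orientation(gx: float, gy: float, gz: float) -> str:
    """Choose a UI view from gravity components (m/s^2) measured on the
    device's x (right), y (up), and z (out-of-screen) axes."""
    if abs(gz) > max(abs(gx), abs(gy)):
        return "flat"        # device lying face up/down; keep the current view
    if abs(gy) >= abs(gx):
        return "vertical"    # gravity mostly along y: device held upright
    return "horizontal"      # gravity mostly along x: device on its side

print(choose_orientation(0.3, 9.7, 0.5))   # -> vertical (portrait)
print(choose_orientation(9.6, 0.4, 0.8))   # -> horizontal (landscape)
```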
- the gyroscope sensor 1712 can detect the body direction and rotation angle of the terminal 1700, and the gyroscope sensor 1712 can cooperate with the acceleration sensor 1711 to collect the user's 3D actions on the terminal 1700. Based on the data collected by the gyroscope sensor 1712, the processor 1701 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
- the pressure sensor 1713 may be arranged on the side frame of the terminal 1700 and/or the lower layer of the touch display screen 1705.
- when the pressure sensor 1713 is arranged on the side frame, the processor 1701 performs left/right-hand recognition or a quick operation according to the holding signal collected by the pressure sensor 1713.
- when the pressure sensor 1713 is arranged on the lower layer of the touch display screen 1705, the processor 1701 controls the operability controls on the UI according to the user's pressure operation on the touch display screen 1705.
- the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
- the fingerprint sensor 1714 is used to collect the user's fingerprint.
- the processor 1701 can identify the user's identity according to the fingerprint collected by the fingerprint sensor 1714, or the fingerprint sensor 1714 itself can identify the user's identity according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 1701 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
- the fingerprint sensor 1714 may be provided on the front, back or side of the terminal 1700. When a physical button or a manufacturer logo is provided on the terminal 1700, the fingerprint sensor 1714 can be integrated with the physical button or the manufacturer logo.
- the optical sensor 1715 is used to collect the ambient light intensity.
- the processor 1701 may control the display brightness of the touch screen 1705 according to the intensity of the ambient light collected by the optical sensor 1715. Specifically, when the ambient light intensity is high, the display brightness of the touch screen 1705 is increased; when the ambient light intensity is low, the display brightness of the touch screen 1705 is decreased.
- the processor 1701 may also dynamically adjust the shooting parameters of the camera assembly 1706 according to the ambient light intensity collected by the optical sensor 1715.
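A minimal sketch of the brightness adjustment described above, assuming a hypothetical linear mapping and a 1000-lux saturation point (neither value is specified by this application):

```python
def display_brightness(ambient_lux: float, min_level: float = 0.1,
                       max_level: float = 1.0, max_lux: float = 1000.0) -> float:
    """Map ambient light intensity (lux) to a display brightness level in
    [min_level, max_level]: the brighter the surroundings, the brighter
    the screen."""
    ratio = min(max(ambient_lux / max_lux, 0.0), 1.0)   # clamp to [0, 1]
    return min_level + ratio * (max_level - min_level)

print(display_brightness(50.0))    # dim room  -> low display brightness
print(display_brightness(800.0))   # daylight  -> high display brightness
```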
- the proximity sensor 1716, also called a distance sensor, is usually arranged on the front panel of the terminal 1700.
- the proximity sensor 1716 is used to collect the distance between the user and the front of the terminal 1700.
- when the proximity sensor 1716 detects that the distance between the user and the front of the terminal 1700 gradually decreases, the processor 1701 controls the touch display screen 1705 to switch from the bright-screen state to the rest-screen state; when the proximity sensor 1716 detects that the distance between the user and the front of the terminal 1700 gradually increases, the processor 1701 controls the touch display screen 1705 to switch from the rest-screen state to the bright-screen state.
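A minimal sketch of this proximity-driven screen switching, assuming successive distance samples (the sampling model and threshold-free comparison are illustrative assumptions, not part of this application):

```python
def next_screen_on(prev_distance_cm: float, curr_distance_cm: float,
                   screen_on: bool) -> bool:
    """Return the new screen state from two successive proximity readings:
    off as the user approaches the front panel, back on as the user leaves."""
    if curr_distance_cm < prev_distance_cm:
        return False          # distance decreasing -> rest (off) screen
    if curr_distance_cm > prev_distance_cm:
        return True           # distance increasing -> bright (on) screen
    return screen_on          # unchanged distance -> keep current state

print(next_screen_on(10.0, 3.0, True))    # -> False (screen turns off)
print(next_screen_on(3.0, 12.0, False))   # -> True  (screen turns on)
```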
- the structure shown in FIG. 17 does not constitute a limitation on the terminal 1700; the terminal may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
- An embodiment of the present application also provides a computer device including a memory and a processor.
- the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the scope application method in the virtual environment provided in the above embodiments of the present application.
- the embodiments of the present application also provide a storage medium, the storage medium being used to store a computer program, and the computer program being used to implement the scope application method in the virtual environment provided in the above embodiments of the present application.
- This application also provides a computer program product which, when run on a computer, causes the computer to execute the scope application method in the virtual environment provided by the foregoing method embodiments.
- the program can be stored in a computer-readable storage medium.
- the medium may be a computer-readable storage medium included in the memory in the foregoing embodiment; or may be a computer-readable storage medium that exists alone and is not assembled into the terminal.
- the computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the scope application method in the virtual environment provided in the above embodiments of the present application.
- the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disc, or the like.
- the random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM).
Claims (16)
- A method for applying a scope in a virtual environment, the method being executed by a terminal and comprising: displaying a first environment interface, the first environment interface comprising a person-perspective picture of observing the virtual environment from the person perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further comprising a scope control; receiving a touch operation on the scope control, and generating a touch signal; displaying a second environment interface according to the touch signal, the second environment interface comprising a scope picture of observing the virtual environment through the scope accessory; and when a touch end signal of the touch operation is received, resuming display of the person-perspective picture.
- The method according to claim 1, wherein resuming display of the person-perspective picture when the touch end signal of the touch operation is received comprises: when the touch end signal of the touch operation is received, determining a touch duration of the touch operation; and when the touch duration reaches a required duration, disabling the scope accessory according to the touch end signal and resuming display of the person-perspective picture.
- The method according to claim 2, further comprising: when the touch duration does not reach the required duration, determining a tap signal on the scope control as a signal for controlling the scope accessory to be disabled; and when the tap signal on the scope control is received, disabling the scope accessory and resuming display of the person-perspective picture.
- The method according to claim 3, wherein before receiving the touch operation on the scope control, the method further comprises: receiving a trigger operation on a function setting control; displaying a function setting interface according to the trigger operation, the function setting interface comprising a scope setting option, and the scope setting option comprising a hybrid control option; and receiving an enabling operation on the hybrid control option, the hybrid control option being used to indicate that the manner of disabling the scope accessory is determined by taking the required duration as a critical duration.
- The method according to any one of claims 1 to 4, wherein the scope picture is a picture of observing the virtual environment through the scope accessory in a first viewing-angle direction; and after the second environment interface is displayed according to the touch operation, the method further comprises: receiving a drag operation determined according to the touch operation; adjusting the first viewing-angle direction according to the drag operation to obtain a second viewing-angle direction; and displaying a third environment interface, the third environment interface comprising a picture of observing the virtual environment through the scope accessory in the second viewing-angle direction.
- The method according to claim 5, wherein when the virtual environment is not observed through the scope accessory, a speed at which the viewing-angle direction is adjusted is a first adjustment speed; and adjusting the first viewing-angle direction according to the drag operation comprises: adjusting the first viewing-angle direction at a second adjustment speed according to the drag operation, the second adjustment speed being slower than the first adjustment speed.
- The method according to any one of claims 1 to 4, wherein receiving the touch operation on the scope control comprises: receiving a touch operation on a touch display screen; determining a first coordinate corresponding to the touch operation on the touch display screen; determining a second coordinate corresponding to the center point of the scope control on the touch display screen; and when the distance between the first coordinate and the second coordinate is less than a required distance, determining the touch operation as the touch operation acting on the scope control.
- An apparatus for applying a scope in a virtual environment, the apparatus comprising: a display module, configured to display a first environment interface, the first environment interface comprising a person-perspective picture of observing the virtual environment from the person perspective of a virtual object, the virtual object holding a virtual prop, the virtual prop being equipped with a scope accessory, and the first environment interface further comprising a scope control; and a receiving module, configured to receive a touch operation on the scope control and generate a touch signal; the display module being further configured to display a second environment interface according to the touch signal, the second environment interface comprising a scope picture of observing the virtual environment through the scope accessory; and the receiving module being further configured to resume display of the person-perspective picture when a touch end signal of the touch operation is received.
- The apparatus according to claim 8, wherein the receiving module is further configured to: when the touch end signal of the touch operation is received, determine a touch duration of the touch operation; and when the touch duration reaches a required duration, disable the scope accessory according to the touch end signal and resume display of the person-perspective picture.
- The apparatus according to claim 9, further comprising: a determining module, configured to determine, when the touch duration does not reach the required duration, a tap signal on the scope control as a signal for controlling the scope accessory to be disabled; the receiving module being further configured to disable the scope accessory when the tap signal on the scope control is received; and the display module being further configured to resume display of the person-perspective picture.
- The apparatus according to claim 10, wherein the receiving module is further configured to receive a trigger operation on a function setting control; the display module is further configured to display a function setting interface according to the trigger operation, the function setting interface comprising a scope setting option, and the scope setting option comprising a hybrid control option; and the receiving module is further configured to receive an enabling operation on the hybrid control option, the hybrid control option being used to indicate that the manner of disabling the scope accessory is determined by taking the required duration as a critical duration.
- The apparatus according to any one of claims 8 to 11, wherein the scope picture is a picture of observing the virtual environment through the scope accessory in a first viewing-angle direction; the receiving module is further configured to receive a drag operation determined according to the touch operation; the apparatus further comprises an adjustment module, configured to adjust the first viewing-angle direction according to the drag operation to obtain a second viewing-angle direction; and the display module is further configured to display a third environment interface, the third environment interface comprising a picture of observing the virtual environment through the scope accessory in the second viewing-angle direction.
- The apparatus according to claim 12, wherein when the virtual environment is not observed through the scope accessory, a speed at which the viewing-angle direction is adjusted is a first adjustment speed; and the adjustment module is further configured to adjust the first viewing-angle direction at a second adjustment speed according to the drag operation, the second adjustment speed being slower than the first adjustment speed.
- A computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for applying a scope in a virtual environment according to any one of claims 1 to 7.
- A storage medium, used to store a computer program, the computer program being used to execute the method for applying a scope in a virtual environment according to any one of claims 1 to 7.
- A computer program product comprising instructions which, when run on a computer, cause the computer to execute the method for applying a scope in a virtual environment according to any one of claims 1 to 7.
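To make the claimed control flow concrete, the following is a minimal, self-contained Python sketch of the hybrid scope control of claims 1 to 3, the hit test of claim 7, and the slower scoped adjustment speed of claims 5 and 6. All names, threshold values, and the touch-event model are hypothetical illustrations, not part of the claimed subject matter:

```python
from dataclasses import dataclass

REQUIRED_DURATION = 0.5    # hypothetical critical duration, in seconds
REQUIRED_DISTANCE = 40.0   # hypothetical hit radius around the control, in px
SCOPE_SENSITIVITY = 0.5    # hypothetical factor: scoped view turns slower

@dataclass
class ScopeState:
    scope_open: bool = False
    toggle_mode: bool = False   # set when a press was shorter than required

def hits_control(touch_xy, center_xy) -> bool:
    """Claim 7: the touch acts on the scope control when the distance between
    the touch coordinate and the control's center is below the required one."""
    dx, dy = touch_xy[0] - center_xy[0], touch_xy[1] - center_xy[1]
    return (dx * dx + dy * dy) ** 0.5 < REQUIRED_DISTANCE

def on_touch_down(state: ScopeState) -> None:
    if state.toggle_mode and state.scope_open:
        state.scope_open = False    # claim 3: a tap closes the scope
        state.toggle_mode = False
    else:
        state.scope_open = True     # claim 1: touching opens the scope picture

def on_touch_up(state: ScopeState, touch_duration: float) -> None:
    if not state.scope_open:
        return
    if touch_duration >= REQUIRED_DURATION:
        state.scope_open = False    # claim 2: a long press closes on release
    else:
        state.toggle_mode = True    # short press: stay open, await a tap

def adjust_view_angle(angle_deg: float, drag_dx_px: float,
                      scoped: bool, base_speed: float = 0.1) -> float:
    """Claims 5-6: a drag rotates the view; while observing through the scope
    the (second) adjustment speed is slower than the normal (first) speed."""
    speed = base_speed * (SCOPE_SENSITIVITY if scoped else 1.0)
    return angle_deg + drag_dx_px * speed

state = ScopeState()
if hits_control((105.0, 210.0), (100.0, 200.0)):
    on_touch_down(state)
    on_touch_up(state, touch_duration=0.2)         # short tap -> scope stays open
print(state)                                       # scope_open=True, toggle_mode=True
print(adjust_view_angle(0.0, 120.0, scoped=True))  # slower scoped rotation: 6.0
```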
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217032917A KR20210135594A (ko) | 2019-08-22 | 2020-07-20 | 가상 환경에서 망원경 관찰의 적용 방법 및 관련 장치 |
SG11202109672U SG11202109672UA (en) | 2019-08-22 | 2020-07-20 | Application method and related apparatus of sighting telescope in virtual environment |
JP2021554634A JP2022524802A (ja) | 2019-08-22 | 2020-07-20 | 仮想環境におけるスコープの適用方法及び装置並びにコンピュータ装置及びプログラム |
US17/504,081 US12121798B2 (en) | 2019-08-22 | 2021-10-18 | Method and related apparatus for application of scope in virtual environment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910780269.9A CN110448908B (zh) | 2019-08-22 | 2019-08-22 | 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质 |
CN201910780269.9 | 2019-08-22 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/504,081 Continuation US12121798B2 (en) | 2019-08-22 | 2021-10-18 | Method and related apparatus for application of scope in virtual environment |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021031765A1 (zh) | 2021-02-25 |
Family
ID=68488566
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/103000 WO2021031765A1 (zh) | 2019-08-22 | 2020-07-20 | 虚拟环境中瞄准镜的应用方法和相关装置 |
Country Status (5)
Country | Link |
---|---|
JP (1) | JP2022524802A (zh) |
KR (1) | KR20210135594A (zh) |
CN (1) | CN110448908B (zh) |
SG (1) | SG11202109672UA (zh) |
WO (1) | WO2021031765A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024183469A1 (zh) * | 2023-03-07 | 2024-09-12 | 腾讯科技(深圳)有限公司 | 游戏画面的显示方法、装置、设备及计算机可读存储介质 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110448908B (zh) * | 2019-08-22 | 2020-12-01 | 腾讯科技(深圳)有限公司 | 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质 |
CN112330823B (zh) * | 2020-11-05 | 2023-06-16 | 腾讯科技(深圳)有限公司 | 虚拟道具的显示方法、装置、设备及可读存储介质 |
CN112791398B (zh) * | 2021-02-03 | 2024-07-16 | 网易(杭州)网络有限公司 | 游戏中虚拟倍镜的控制方法、装置、电子设备及存储介质 |
CN112774188B (zh) * | 2021-02-03 | 2024-07-23 | 网易(杭州)网络有限公司 | 游戏中虚拟倍镜的调节方法、装置、电子设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107678647A (zh) * | 2017-09-26 | 2018-02-09 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
CN108339272A (zh) * | 2018-02-12 | 2018-07-31 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法及装置、电子设备、存储介质 |
CN110141869A (zh) * | 2019-04-11 | 2019-08-20 | 腾讯科技(深圳)有限公司 | 操作控制方法、装置、电子设备及存储介质 |
CN110448908A (zh) * | 2019-08-22 | 2019-11-15 | 腾讯科技(深圳)有限公司 | 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5238640B2 (ja) * | 2009-08-18 | 2013-07-17 | 株式会社コナミデジタルエンタテインメント | ゲーム装置、ゲーム制御プログラム、及びゲーム制御方法 |
CN103631408B (zh) * | 2012-08-21 | 2019-01-04 | 腾讯科技(深圳)有限公司 | 触控指令的处理方法和装置 |
KR101777630B1 (ko) * | 2017-02-01 | 2017-09-12 | 에스명심 주식회사 | 사격 모바일 게임 시스템 |
CN108939540B (zh) * | 2018-07-04 | 2021-08-24 | 网易(杭州)网络有限公司 | 射击游戏辅助瞄准方法、装置、存储介质、处理器及终端 |
CN109847370A (zh) * | 2019-03-26 | 2019-06-07 | 网易(杭州)网络有限公司 | 射击游戏的控制方法、装置、设备和存储介质 |
CN110090442B (zh) * | 2019-05-16 | 2023-03-24 | 网易(杭州)网络有限公司 | 游戏的开镜方法及装置、电子设备、存储介质 |
- 2019-08-22 CN CN201910780269.9A patent/CN110448908B/zh active Active
- 2020-07-20 JP JP2021554634A patent/JP2022524802A/ja active Pending
- 2020-07-20 SG SG11202109672U patent/SG11202109672UA/en unknown
- 2020-07-20 KR KR1020217032917A patent/KR20210135594A/ko not_active Application Discontinuation
- 2020-07-20 WO PCT/CN2020/103000 patent/WO2021031765A1/zh active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107678647A (zh) * | 2017-09-26 | 2018-02-09 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
CN108339272A (zh) * | 2018-02-12 | 2018-07-31 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法及装置、电子设备、存储介质 |
CN110141869A (zh) * | 2019-04-11 | 2019-08-20 | 腾讯科技(深圳)有限公司 | 操作控制方法、装置、电子设备及存储介质 |
CN110448908A (zh) * | 2019-08-22 | 2019-11-15 | 腾讯科技(深圳)有限公司 | 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024183469A1 (zh) * | 2023-03-07 | 2024-09-12 | 腾讯科技(深圳)有限公司 | 游戏画面的显示方法、装置、设备及计算机可读存储介质 |
Also Published As
Publication number | Publication date |
---|---|
JP2022524802A (ja) | 2022-05-10 |
CN110448908B (zh) | 2020-12-01 |
KR20210135594A (ko) | 2021-11-15 |
SG11202109672UA (en) | 2021-10-28 |
CN110448908A (zh) | 2019-11-15 |
US20220032176A1 (en) | 2022-02-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108434736B (zh) | 虚拟环境对战中的装备显示方法、装置、设备及存储介质 | |
JP7471369B2 (ja) | 仮想環境におけるアクセサリー選択方法及びその装置、機器並びにコンピュータプログラム | |
WO2020253832A1 (zh) | 控制虚拟对象对虚拟物品进行标记的方法、装置及介质 | |
CN108815851B (zh) | 在虚拟环境中射击时的界面显示方法、设备及存储介质 | |
CN110413171B (zh) | 控制虚拟对象进行快捷操作的方法、装置、设备及介质 | |
CN109529319B (zh) | 界面控件的显示方法、设备及存储介质 | |
US11656755B2 (en) | Method and apparatus for controlling virtual object to drop virtual item and medium | |
US12036474B2 (en) | Virtual object control method and apparatus, device, and readable storage medium | |
CN110755841B (zh) | 虚拟环境中道具的切换方法、装置、设备及可读存储介质 | |
WO2021036577A1 (zh) | 控制虚拟对象的方法和相关装置 | |
WO2021031765A1 (zh) | 虚拟环境中瞄准镜的应用方法和相关装置 | |
CN113398571B (zh) | 虚拟道具的切换方法、装置、终端及存储介质 | |
CN108786110B (zh) | 虚拟环境中的瞄准镜显示方法、设备及存储介质 | |
TWI802978B (zh) | 應用程式內的控制項位置調整方法及裝置、設備及存儲介質 | |
CN111672106B (zh) | 虚拟场景显示方法、装置、计算机设备及存储介质 | |
CN112330823B (zh) | 虚拟道具的显示方法、装置、设备及可读存储介质 | |
WO2021143253A1 (zh) | 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质 | |
CN112451969A (zh) | 虚拟对象控制方法、装置、计算机设备及存储介质 | |
CN112402969B (zh) | 虚拟场景中虚拟对象控制方法、装置、设备及存储介质 | |
CN112044066A (zh) | 界面显示方法、装置、设备及可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20854510; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2021554634; Country of ref document: JP; Kind code of ref document: A |
 | ENP | Entry into the national phase | Ref document number: 20217032917; Country of ref document: KR; Kind code of ref document: A |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 20854510; Country of ref document: EP; Kind code of ref document: A1 |