CN108744511B - Method, device and storage medium for displaying sighting telescope in virtual environment


Info

Publication number
CN108744511B
CN108744511B
Authority
CN
China
Prior art keywords
lens
virtual environment
picture
sighting telescope
sighting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810539548.1A
Other languages
Chinese (zh)
Other versions
CN108744511A (en)
Inventor
邓杨 (Deng Yang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810539548.1A priority Critical patent/CN108744511B/en
Publication of CN108744511A publication Critical patent/CN108744511A/en
Application granted granted Critical
Publication of CN108744511B publication Critical patent/CN108744511B/en

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game
    • A63F2300/8076: Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a method, a device, and a storage medium for displaying a sighting telescope in a virtual environment, and relates to the field of virtual environments. The method comprises the following steps: displaying a first virtual environment picture, the first virtual environment picture comprising a first sighting telescope picture in which a virtual object observes the virtual environment using a reflex sighting telescope, an aiming point in the first sighting telescope picture being located at a first position relative to a sighting telescope barrel of the reflex sighting telescope; controlling the virtual object to move in the virtual environment; and displaying a second virtual environment picture, in which the aiming point in the second sighting telescope picture is located at a second position, different from the first position, relative to the sighting telescope barrel. The application simulates the reflex sighting telescope of the real environment more faithfully, thereby achieving a more realistic display effect and enhancing the user's sense of immersion in the virtual environment.

Description

Method, device and storage medium for displaying sighting telescope in virtual environment
Technical Field
The embodiments of the application relate to the field of virtual environments, and in particular to a method, a device, and a storage medium for displaying a sighting telescope in a virtual environment.
Background
Many applications based on three-dimensional virtual environments provide shooting elements. A virtual object in the three-dimensional virtual environment fights using a virtual firearm, and when a sighting telescope is mounted on the virtual firearm, the virtual object uses the sighting telescope to aim at a target object.
A reflex sight differs from an optical (telescopic) sight: the reflex sight generates a virtual image (for example, a red dot) at infinity on the sight, and that virtual image always points at the target object being aimed at. In the related art, in the three-dimensional virtual environment, the positions of the camera model corresponding to the virtual object, the lens barrel of the reflex sight, and the lens of the reflex sight are bound together; that is, in the scoped (aiming) mode, the lens barrel and the lens of the reflex sight are displayed at the center of the screen and remain fixed no matter how the view direction of the virtual object changes.
Because this simulation process does not match the characteristics of a reflex sight in the real environment, the sighting telescope display method in the related art is insufficiently realistic and its display effect is poor.
Disclosure of Invention
The embodiments of the application provide a method, a device, and a storage medium for displaying a sighting telescope in a virtual environment, which can solve the problem in the related art that the simulated reflex sight does not match the characteristics of a reflex sight in the real environment, making the display effect insufficiently realistic and poor. The technical solution is as follows:
in one aspect, a method for displaying a sighting telescope in a virtual environment is provided, the method comprising:
displaying a first virtual environment picture comprising a first sighting telescope picture of a virtual object viewing the virtual environment using a reflex sighting telescope, an aiming point in the first sighting telescope picture being at a first position relative to a sighting telescope barrel of the reflex sighting telescope;
controlling the virtual object to perform activities in the virtual environment;
displaying a second virtual environment picture including a second scope picture of the virtual object viewing the virtual environment using the reflex scope, an aiming point in the second scope picture being located at a second position relative to the scope barrel, the second position being different from the first position.
In another aspect, there is provided a scope display apparatus in a virtual environment, the apparatus including:
the display module is used for displaying a first virtual environment picture, wherein the first virtual environment picture comprises a first sighting telescope picture of a virtual object observing the virtual environment by using a reflective sighting telescope, and a sighting point in the first sighting telescope picture is located at a first position relative to a sighting telescope lens barrel of the reflective sighting telescope;
the control module is used for controlling the virtual object to move in the virtual environment;
the display module is configured to display a second virtual environment picture, where the second virtual environment picture includes a second sighting telescope picture in which the virtual object observes the virtual environment using the reflective sighting telescope, and a sighting point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel, where the second position is different from the first position.
In another aspect, an electronic device is provided, the device comprising a processor and a memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions that is loaded and executed by the processor to implement a method of scope display in a virtual environment as described above.
In another aspect, there is provided a computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of sighting telescope display in a virtual environment as described above.
In another aspect, a computer program product is provided which, when run on a computer, causes the computer to perform the method of sighting telescope display in a virtual environment as described above.
The beneficial effects brought by the technical solutions provided in the embodiments of the application include at least the following:
After the virtual object moves in the virtual environment, a second virtual environment picture different from the first virtual environment picture is displayed, and the position of the aiming point relative to the sighting telescope barrel differs between the two pictures. This simulates a reflex sight in the real environment more faithfully, thereby achieving a more realistic display effect and enhancing the user's sense of immersion in the virtual environment.
Drawings
To illustrate the technical solutions in the embodiments of the application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the application, and those of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a reflex sight in the real world;
FIG. 2 is a diagram of the positions of the aiming point of a reflex sight in the real world when viewed from different angles;
FIG. 3 is a schematic diagram of a reflective scope when simulated in a virtual environment;
FIG. 4 is a diagram illustrating a view of a virtual environment observed by a reflective sighting telescope in the related art;
FIG. 5 is a block diagram of an electronic device provided by an exemplary embodiment of the present application;
FIG. 6 is a block diagram of a computer system provided in another exemplary embodiment of the present application;
FIG. 7 is a flow chart of a method for scope display in a virtual environment provided by an exemplary embodiment of the present application;
FIG. 8 is an interface schematic diagram of a method for scope display in a virtual environment provided by another exemplary embodiment of the present application;
FIG. 9 is a flow chart of a method for scope display in a virtual environment provided by another exemplary embodiment of the present application;
FIG. 10 is an interface schematic diagram of a method for scope display in a virtual environment provided by another exemplary embodiment of the present application;
FIG. 11 is a schematic diagram illustrating a method for displaying a scope in a virtual environment according to another exemplary embodiment of the present application;
FIG. 12 is an interface schematic of a method for sighting telescope display in a virtual environment according to another exemplary embodiment of the present application;
FIG. 13 is an interface schematic diagram of a method for scope display in a virtual environment provided by another exemplary embodiment of the present application;
FIG. 14 is a block diagram of a scope display device in a virtual environment as provided by an exemplary embodiment of the present application;
fig. 15 is a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, the following detailed description of the embodiments of the present application will be made with reference to the accompanying drawings.
First, a number of terms referred to in the embodiments of the present application will be described:
Virtual environment: a virtual environment displayed (or provided) by an application when it runs on a device. The virtual environment may be a simulation of the real world, a semi-simulated, semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment. The following embodiments describe the virtual environment as a three-dimensional virtual environment, but are not limited thereto. Optionally, the virtual environment is also used for a battle between at least two virtual characters. For example, the virtual environment is used for at least two virtual characters to fight using virtual firearms; for another example, the virtual environment is used for at least two virtual characters to fight using virtual firearms within a designated geographic area that continually shrinks as virtual time passes.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, an animated character, and the like, such as characters, animals, plants, oil drums, walls, and stones displayed in a three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional volumetric model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
Viewing angle direction: the observation direction when the virtual environment is observed from the first-person perspective or the third-person perspective of the virtual object. Optionally, when the first-person perspective is used, the viewing angle direction is the direction in which the camera model observes from in front of the virtual object in the virtual environment; when the third-person perspective is used, the viewing angle direction is the direction in which a camera model located around the virtual object (for example, behind the head) observes the virtual environment.
Optionally, the camera model automatically follows the virtual object in the virtual environment, that is, when the position of the virtual object in the virtual environment changes, the camera model changes while following the position of the virtual object in the virtual environment, and the camera model is always within the preset distance range of the virtual object in the virtual environment. Optionally, the relative positions of the camera model and the virtual object do not change during the automatic following process.
Camera model: a three-dimensional model located around the virtual object in the three-dimensional virtual environment. When the first-person perspective is used, the camera model is located near or at the head of the virtual object. When the third-person perspective is used, the camera model may be located behind the virtual object and bound to it, or located at any position a preset distance away from the virtual object; through the camera model, the virtual object in the three-dimensional virtual environment can be observed from different angles. Optionally, the third-person perspective is also called the "over-the-shoulder perspective", in which the camera model is located behind the virtual object (for example, behind the head and shoulders of a virtual character). Optionally, the camera model is not actually displayed in the three-dimensional virtual environment; that is, the camera model cannot be seen in the three-dimensional virtual environment displayed in the user interface.
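As an illustration of the fixed-offset follow behavior described above (a minimal sketch under assumed names and coordinates; the patent does not give code), a third-person camera model can simply hold a constant offset from the virtual object:

    // Minimal C++ sketch: the camera model follows the virtual object at a
    // fixed relative offset, so when the object's position changes, the
    // camera's position changes with it. Names and values are illustrative.
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    Vec3 add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

    int main() {
        Vec3 objectPos{10.0f, 5.0f, 0.0f};
        const Vec3 followOffset{-2.0f, 0.0f, 1.6f};  // behind and above the head
        for (int step = 0; step < 3; ++step) {
            objectPos.x += 1.0f;                           // the object moves...
            Vec3 cameraPos = add(objectPos, followOffset); // ...and the camera follows
            std::printf("camera at (%.1f, %.1f, %.1f)\n",
                        cameraPos.x, cameraPos.y, cameraPos.z);
        }
        return 0;
    }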
Fig. 1 shows a schematic diagram of a reflex sight 100 in the real world. The reflex sight 100 may be a red dot sight or a holographic sight. Taking the reflex sight 100 being a holographic sight as an example, the reflex sight 100 includes a laser 11, a plane mirror 12, a collimating mirror 13, a holographic grating 14, and a holographic film 15. The laser 11 generates a laser beam corresponding to the aiming point; the beam emitted by the laser 11 is reflected in turn by the plane mirror 12, the collimating mirror 13, and the holographic grating 14, and is then refracted by the holographic film 15 before exiting. The exiting light is superimposed, parallel, on the ambient light seen through the reflex sight 100, so that the aiming point appears far away along the viewer's line of sight.
The reflex sight 100 has the following feature: when the direction of the viewer's line of sight is parallel to the optical axis of the reflex sight 100, the display position of the aiming point in the lens is centered, as shown in (1) of fig. 2; when the direction of the viewer's line of sight is not parallel to the optical axis of the reflex sight 100, the aiming point in the reflex sight 100 shifts slightly, that is, the display position of the aiming point in the lens is no longer centered, as shown in (2), (3), and (4) of fig. 2.
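For an illustrative sense of scale (a rough calculation, not taken from the patent): if the lens plane sits about L = 10 cm from the viewer's eye and the line of sight deviates from the optical axis by θ = 5°, the aiming point appears displaced from the barrel center by approximately L × tan θ ≈ 10 cm × 0.087 ≈ 0.9 cm, consistent with the slight shifts shown in (2), (3), and (4) of fig. 2.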
Fig. 3 is a schematic diagram illustrating the principle of simulating a reflex sight in the related art. A three-dimensional virtual object 320 and a reflex sight 340 are provided in the three-dimensional virtual environment 300. The virtual object 320 may hold a virtual firearm equipped with the reflex sight 340. The reflex sight 340 includes a sight barrel 342 and an aiming lens 344. A lens material 345 is attached to the aiming lens 344 to present the aiming point. Illustratively, the lens material 345 is an image with the aiming point drawn on it; the part of the image other than the aiming point may be transparent or semi-transparent.
Generally, the aiming lens 344 is mounted in a groove 343 of the sight barrel 342, and the lens material 345 on which the aiming point is drawn is attached to the aiming lens 344.
In the related art, the relative positions of the sight barrel 342, the aiming lens 344, and the lens material 345 are fixed. That is, in the scoped (aiming) mode, no matter how the virtual object moves in the virtual environment, the sight barrel 342 and the aiming point in the lens material 345 are located at the center of the virtual environment picture, where the virtual environment picture is the scope picture of the virtual environment observed by the virtual object 320 using the reflex sight 340. As shown in fig. 4, in the related art the sight barrel 342 and the aiming point 40 stay at the center of the screen and remain unchanged no matter how the virtual object moves.
In the real environment, however, the arms of a person holding a gun move slightly more slowly than the body. That is, when the person turns left or right, the firearm held by the arms turns slightly more slowly, and the aiming point the person sees is slightly offset from the position of the gun sight barrel. The embodiments of the application provide a method, a device, and a storage medium for displaying a sighting telescope in a virtual environment that can provide a more realistic simulation of a reflex sight.
Referring to fig. 5, a block diagram of an electronic device according to an exemplary embodiment of the present application is shown. The electronic device includes: an operating system 520 and application programs 522.
Operating system 520 is the base software that provides applications 522 with secure access to computer hardware.
Application 522 is an application that supports a virtual environment. Optionally, application 522 is an application that supports a three-dimensional virtual environment. The application 522 may be any one of a virtual reality application, a three-dimensional map program, a Third-Person Shooting game (TPS), a First-Person Shooting game (FPS), a Multiplayer Online Battle Arena (MOBA) game, and a multiplayer gunfight survival game. The application 522 may be a stand-alone application, such as a stand-alone 3D game program.
Referring to FIG. 6, a block diagram of a computer system provided in an exemplary embodiment of the present application is shown. The computer system 600 includes: a first device 620, a server 640, and a second device 660.
The first device 620 has installed and runs an application that supports a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first device 620 is a device used by a first user, who uses the first device 620 to control a first virtual object located in the virtual environment to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated human character or an animated character.
The first device 620 is connected to the server 640 through a wireless network or a wired network.
The server 640 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 640 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 640 undertakes primary computing work and the first device 620 and the second device 660 undertake secondary computing work; alternatively, the server 640 undertakes secondary computing work and the first device 620 and the second device 660 undertake primary computing work; alternatively, the server 640, the first device 620, and the second device 660 perform cooperative computing by using a distributed computing architecture.
The second device 660 has installed and runs an application that supports a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second device 660 is a device used by a second user, who uses the second device 660 to control a second virtual object located in the virtual environment to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated human character or an animated character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Optionally, the first virtual character and the second virtual character may belong to different teams, different organizations, or two groups that are hostile to each other.
Optionally, the applications installed on the first device 620 and the second device 660 are the same, or the applications installed on the two devices are the same type of application on different operating system platforms. The first device 620 may generally refer to one of a plurality of devices, and the second device 660 may generally refer to one of a plurality of devices; this embodiment is illustrated with only the first device 620 and the second device 660. The first device 620 and the second device 660 may be of the same or different device types, the device types including: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number of devices and the types of devices are not limited in the embodiments of the present application.
Referring to fig. 7, a flowchart of a method for displaying a sighting telescope in a virtual environment according to an exemplary embodiment of the present application is shown. The embodiment is exemplified by applying the method to the electronic device shown in fig. 5, the first device shown in fig. 6, and/or the second device shown in fig. 6, where the method includes:
step 702, displaying a first virtual environment picture, where the first virtual environment picture includes a first sighting telescope picture in which a virtual object observes a virtual environment using a reflective sighting telescope, and an aiming point in the first sighting telescope picture is located at a first position relative to a sighting telescope barrel of the reflective sighting telescope;
optionally, the virtual object is in three-dimensional form. The virtual object is any one of a simulated character, a cartoon character, an animation character, and a game character.
Optionally, various controls may also be superimposed on the first virtual environment picture, including at least one of: a movement direction control, a firing control, a projectile throwing control, a main weapon slot control, an auxiliary weapon slot control, a minimap control, a map azimuth indicator control, and teammate/enemy unit indicators.
The first sighting telescope picture is the picture of the virtual environment observed through the reflex sight. If the lens of the reflex sight is circular, the first sighting telescope picture is circular. Before the virtual object moves, the aiming point in the first sighting telescope picture is located at a first position relative to the sighting telescope barrel; the first position may be the midpoint of the barrel.
Step 704, controlling the virtual object to perform activities in the virtual environment;
the device controls the activity of the virtual object in the activity space of the virtual environment. For example, in the virtual environment, the virtual object may be set to move in a plain surrounded by a hill, the plain is the moving space of the virtual object, and the hill surrounded by the plain is an area where the virtual object cannot move.
The activity in the embodiments of the application refers to movement in which the torso of the virtual object drives the arms (and the virtual firearm and reflex sight held by the arms) to change position. The activity includes at least one of: controlling the virtual object to translate, controlling the virtual object to turn left or right, and controlling the virtual object to switch between a looking-up state and a looking-down state.
Optionally, the device controls the virtual object to move at a first speed in the virtual environment and controls a scope barrel of the reflex scope to move at a second speed in the virtual environment, the first speed being greater than the second speed.
Step 706, displaying a second virtual environment picture, where the second virtual environment picture includes a second sighting telescope picture in which the virtual object observes the virtual environment using a reflective sighting telescope, and a sighting point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel, where the second position is different from the first position.
Optionally, various controls may also be superimposed on the second virtual environment picture, including at least one of: a movement direction control, a firing control, a projectile throwing control, a main weapon slot control, an auxiliary weapon slot control, a minimap control, a map azimuth indicator control, and teammate/enemy unit indicators.
The second sighting telescope picture is the picture of the virtual environment observed through the reflex sight. If the lens of the reflex sight is circular, the second sighting telescope picture is circular. After the virtual object's activity, the aiming point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel; the second position may be off the barrel's midpoint. Optionally, the offset between the second position and the first position is determined by the activity of the virtual object: the offset direction between the second position and the first position matches the movement direction of the virtual object, and the offset distance between the second position and the first position is positively correlated with the movement distance of the virtual object.
Schematically, the upper diagram in fig. 8 shows the first virtual environment picture 81; the sight barrel 342 and the aiming point 40 of the reflex sight are both located at the midpoint of the first virtual environment picture 81. The sight barrel 342 frames the circular first sighting telescope picture 42, and the aiming point 40 is at the center of the first sighting telescope picture 42. The lower diagram in fig. 8 shows the second virtual environment picture 82. Affected by the virtual object turning to the right in the virtual environment, the sight barrel 342 of the reflex sight lags slightly behind the torso of the virtual object, so the relative position between the sight barrel 342 and the aiming point 40 shifts: the aiming point 40 is still at the midpoint of the second virtual environment picture 82, but relative to the second sighting telescope picture 44 framed by the sight barrel 342 it is no longer at the midpoint, sitting instead to the right of center.
In summary, in the sighting telescope display method provided by this embodiment, after the virtual object moves in the virtual environment, a second virtual environment picture different from the first virtual environment picture is displayed, and the position of the aiming point relative to the sighting telescope barrel differs between the two pictures. This simulates the reflex sight of the real environment more faithfully, thereby achieving a more realistic display effect and enhancing the user's sense of immersion in the virtual environment.
Referring to fig. 9, a flowchart of a method for displaying a sighting telescope in a virtual environment according to an exemplary embodiment of the present application is shown. The present embodiment is exemplified by applying the method to the electronic device shown in fig. 5, the first device shown in fig. 6, and/or the second device shown in fig. 6, and the present embodiment can simulate a scene observed in a virtual environment by using a reflective scope. The method comprises the following steps:
step 901, displaying a first virtual environment picture, where the first virtual environment picture includes a first sighting telescope picture in which a virtual object observes a virtual environment using a reflective sighting telescope, and an aiming point in the first sighting telescope picture is located at a first position relative to a sighting telescope lens barrel of the reflective sighting telescope;
optionally, the virtual object is in three-dimensional form. The virtual object is any one of a simulated character, a cartoon character, an animation character, and a game character.
The device has the ability to control one or more virtual objects in the virtual environment. Taking the case where the device controls one virtual object, a camera model corresponding to the virtual object exists in the virtual environment, and the camera model captures the picture when the virtual environment is observed from the first-person or third-person perspective of the virtual object. Optionally, when the virtual object uses a reflex sight, the camera model may observe the virtual environment through the reflex sight to obtain the first virtual environment picture, which includes the first sighting telescope picture of the virtual environment observed through the reflex sight.
Optionally, various controls may also be superimposed on the first virtual environment picture, including at least one of: a movement direction control, a firing control, a projectile throwing control, a main weapon slot control, an auxiliary weapon slot control, a minimap control, a map azimuth indicator control, and teammate/enemy unit indicators.
The first sighting telescope picture is the picture of the virtual environment observed through the reflex sight. If the lens of the reflex sight is circular, the first sighting telescope picture is circular. Before the virtual object moves, the aiming point in the first sighting telescope picture is located at a first position relative to the sighting telescope barrel; the first position may be the midpoint of the barrel.
As shown in fig. 10, the first virtual environment picture 50 displays the picture captured when the virtual environment is observed using the camera model of the virtual object, and the picture 50 further includes the first sighting telescope picture 52 obtained when the virtual environment is observed using the reflex sight 51. The first sighting telescope picture 52 may be circular, and the virtual environment observed in it may be unmagnified or magnified at a preset scale (1.0, 2.0, 4.0, 8.0, or 15.0). The aiming point in the first sighting telescope picture 52 is located at a first position relative to the sight barrel 51 of the reflex sight, such as the midpoint of the first sighting telescope picture 52.
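One common way to realize such a preset magnification (an assumption for illustration; the patent does not specify how the scaling is implemented) is to narrow the field of view of the camera that renders the scope picture. A minimal C++ sketch:

    // Narrowing the FOV by the magnification factor makes objects appear
    // N times larger in the scope picture. Names and values are illustrative.
    #include <cmath>
    #include <cstdio>

    // Zoomed FOV (radians) for a given base FOV and magnification factor.
    float zoomedFov(float baseFovRad, float magnification) {
        return 2.0f * std::atan(std::tan(baseFovRad / 2.0f) / magnification);
    }

    int main() {
        const float baseFov = 1.5708f;  // ~90 degrees
        const float scales[] = {1.0f, 2.0f, 4.0f, 8.0f, 15.0f};
        for (float m : scales)
            std::printf("%.1fx -> %.1f degrees\n",
                        m, zoomedFov(baseFov, m) * 180.0f / 3.14159f);
        return 0;
    }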
Step 902, controlling a virtual object to move at a first speed in a virtual environment, and controlling a sighting telescope lens barrel of a reflective sighting telescope to move at a second speed in the virtual environment, wherein the first speed is greater than the second speed;
Optionally, the device controls the virtual object to move at the first speed in the virtual environment according to a control signal triggered by the user. When the device is a mobile phone or tablet with a touch display screen, the control signal may be generated by the user touching and dragging an interactive control, the interactive control including at least one of: a control for translating the virtual object, a control for rotating the virtual object, a control for making the virtual object look up or down, a control for making the virtual object jump, and a control for making the virtual object crawl. When the terminal is a desktop or laptop computer, the control signal may be generated by the user operating the interactive control through an external device, for example, clicking and dragging the interactive control with a mouse.
The activity in the embodiments of the application refers to movement in which the torso of the virtual object drives the arms (and the virtual firearm and reflex sight held by the arms) to change position. The activity includes at least one of: controlling the virtual object to translate, controlling the virtual object to turn left or right, and controlling the virtual object to switch between a looking-up state and a looking-down state.
For a more realistic simulation, the movement speed of the torso should be greater than that of the arms when the virtual object is active; that is, the arm movement is slightly delayed and then follows the torso with the same or a corresponding movement. Therefore, the device controls the virtual object to move at a first speed in the virtual environment and controls the sight barrel of the reflex sight to move at a second speed in the virtual environment, the first speed being greater than the second speed. Optionally, the difference between the first speed and the second speed is less than a threshold.
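A minimal sketch of this two-speed control (illustrative names and rates; the patent does not give code): the torso/camera direction applies the full turn each frame, while the sight barrel chases it at a capped, slightly lower speed, producing the lag and catch-up described above.

    // C++17: the torso turns at the first speed, the barrel at the second.
    #include <algorithm>
    #include <cstdio>

    struct AimState {
        float torsoYawDeg = 0.0f;   // torso/camera direction (first speed)
        float barrelYawDeg = 0.0f;  // sight barrel direction (second speed)
    };

    // turnInput in [-1, 1]; dt is the frame time in seconds.
    void update(AimState& s, float turnInput, float dt) {
        const float firstSpeed = 120.0f;   // deg/s, torso (first speed)
        const float secondSpeed = 100.0f;  // deg/s, barrel (second, slightly lower)
        s.torsoYawDeg += turnInput * firstSpeed * dt;
        float diff = s.torsoYawDeg - s.barrelYawDeg;
        float maxStep = secondSpeed * dt;
        s.barrelYawDeg += std::clamp(diff, -maxStep, maxStep);  // lag, then catch up
    }

    int main() {
        AimState s;
        for (int frame = 0; frame < 6; ++frame) {
            update(s, 1.0f, 1.0f / 60.0f);  // turn right for six frames
            std::printf("torso %.2f deg, barrel %.2f deg\n", s.torsoYawDeg, s.barrelYawDeg);
        }
        return 0;
    }

While the turn continues, the gap between the two angles grows; once the turn stops, the barrel catches up, which is exactly the offset-then-recenter behavior shown in fig. 8.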
Step 903, acquiring the lens orientation of the camera model corresponding to the virtual object;
the camera model has its own three-dimensional coordinates and lens orientation in the virtual environment. Optionally, the lens orientation is characterized by a lens orientation vector. The lens orientation vector may be a unit vector having an azimuth angle, a pitch angle, and a roll angle.
Step 904, moving the aiming point position of the aiming lens of the reflex sight onto the extension line of the lens orientation;
In the virtual environment, the structure of the reflex sight can be seen with reference to fig. 3.
The device moves the aiming point position of the aiming lens onto the extension line of the lens orientation of the camera model, so that the aiming point is always at the midpoint of the first/second virtual environment picture. The process is as follows:
1. Multiply the lens orientation vector by a distance parameter to obtain a lens displacement value, the distance parameter indicating the distance of the aiming lens relative to the camera model;
2. Add the coordinate of the camera model in the virtual environment to the lens displacement value to obtain the position coordinate of the aiming lens;
With reference to the left diagram of fig. 11: before moving, the virtual object 61 observes the virtual environment using the reflex sight; at this time, the lens orientation of the camera model of the virtual object and the sight orientation of the sight barrel 62 of the reflex sight both point in the first direction 63, and the center of the aiming lens lies on the extension line of the sight orientation. After the virtual object moves, with the lens of the camera model pointing in the second direction 64, the center point of the aiming lens (that is, the aiming point) must be controlled to lie on the extension line of the lens orientation. If the lens orientation vector of the camera model is V, the distance between the lens of the camera model and the aiming lens is L, and the coordinate of the camera model in the three-dimensional virtual environment is S1, then the position coordinate of the aiming lens is:
S2 = V × L + S1.
3. The position of the sighting lens in the virtual environment is set according to the position coordinates.
The device moves the aiming lens, and the lens material on it, in the virtual environment according to the position coordinate. That is, the lens material moves together with the aiming lens, and the relative position of the lens material and the aiming lens does not change.
Optionally, the device further sets the lens orientation of the aiming lens to be opposite to the lens orientation of the camera model, so that the second sighting telescope picture corresponding to the aiming lens can be captured.
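Putting steps 1 to 3 together with the orientation flip, a minimal C++ sketch (illustrative names; S2 = V × L + S1 as above):

    // Place the aiming lens at distance L along the camera's lens
    // orientation, then face it back toward the camera so its picture
    // can be captured.
    struct Vec3 { float x, y, z; };
    struct LensPose { Vec3 position; Vec3 forward; };

    LensPose placeAimingLens(const Vec3& s1 /*camera coordinate*/,
                             const Vec3& v  /*unit lens orientation*/,
                             float l        /*distance parameter*/) {
        LensPose lens;
        // Steps 1-2: lens displacement value V * L, then S2 = V * L + S1.
        lens.position = { v.x * l + s1.x, v.y * l + s1.y, v.z * l + s1.z };
        // The aiming lens faces opposite the camera's lens orientation.
        lens.forward = { -v.x, -v.y, -v.z };
        return lens;
    }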
Step 905, generating a second virtual environment picture according to the lens orientation, the sighting telescope barrel, and the aiming lens;
The device performs camera framing according to the lens orientation of the camera model and generates the second virtual environment picture from the picture formed in the viewfinder after the virtual environment passes through the sighting telescope barrel and the aiming lens.
Step 906, displaying a second virtual environment picture, where the second virtual environment picture includes a second sighting telescope picture in which the virtual object observes the virtual environment using a reflective sighting telescope, and a sighting point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel, and the second position is different from the first position.
The device displays the generated second virtual environment picture. The aiming point is located at the center of both the first virtual environment picture and the second virtual environment picture.
In one illustrative example, when the virtual object turns to the right, the second virtual environment picture is as shown in fig. 12: the turning of the sight barrel lags slightly behind the virtual character's rightward turn, so the position of the sight barrel is shifted to the left. In another illustrative example, when the virtual object turns to the left, the second virtual environment picture is as shown in fig. 13: the sight barrel turns slightly later than the virtual character's leftward turn, so the sight barrel is positioned to the right.
Since the activity of the virtual object in the virtual environment may be continuous, the above steps 903 to 906 may be performed repeatedly, for example 24 times per second or 60 times per second.
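Schematically, the repetition amounts to a per-frame loop (a sketch; the functions below are placeholder stubs for steps 903 to 906, not engine APIs):

    // C++ sketch of the per-frame repetition of steps 903 to 906.
    struct Vec3 { float x, y, z; };

    Vec3 getLensOrientation() { return {1.0f, 0.0f, 0.0f}; }  // step 903 (stub)
    void moveAimingLens(const Vec3&) {}                       // step 904 (stub)
    void renderScopePicture() {}                              // step 905 (stub)
    void displayPicture() {}                                  // step 906 (stub)

    int main() {
        const int framesPerSecond = 60;  // may also be 24, etc.
        for (int frame = 0; frame < framesPerSecond; ++frame) {  // one second of frames
            Vec3 v = getLensOrientation();
            moveAimingLens(v);
            renderScopePicture();
            displayPicture();
        }
        return 0;
    }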
In summary, in the sighting telescope display method provided by this embodiment, after the virtual object moves in the virtual environment, a second virtual environment picture different from the first virtual environment picture is displayed, and the position of the aiming point relative to the sighting telescope barrel differs between the two pictures. This simulates the reflex sight of the real environment more faithfully, thereby achieving a more realistic display effect and enhancing the user's sense of immersion in the virtual environment.
In the method provided by this embodiment, the virtual object is controlled to turn at the first speed in the virtual environment while the reflex sight is controlled to turn at the second speed, so that the torso of the virtual object moves slightly faster than the arms. This better matches the behavior of objects in the real environment and provides more realistic character animation.
In the method provided by this embodiment, the position of the aiming lens is generated from the lens orientation vector, the distance parameter, and the coordinate of the camera model, so that even if the sight barrel of the reflex sight moves within the virtual environment picture, the aiming point on the aiming lens always remains at the center of the picture, reducing the dizziness users may feel when viewing the three-dimensional virtual environment picture.
The following describes embodiments of the apparatus of the application; for details not described in the apparatus embodiments, reference may be made to the corresponding method embodiments above.
FIG. 14 is a block diagram of a scope display device in a virtual environment, as provided by an exemplary embodiment of the present application. The scope display device may be implemented as all or part of an electronic device by software, hardware, or a combination of both. The device includes: a display module 620 and a control module 640.
A display module 620, configured to display a first virtual environment picture, where the first virtual environment picture includes a first sighting telescope picture in which a virtual object views the virtual environment using a reflective sighting telescope, and a sighting point in the first sighting telescope picture is located at a first position relative to a sighting telescope barrel of the reflective sighting telescope;
a control module 640, configured to control the virtual object to perform activities in the virtual environment;
the display module 620 is configured to display a second virtual environment picture, where the second virtual environment picture includes a second sighting telescope picture in which the virtual object observes the virtual environment using the reflective sighting telescope, and a sighting point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel, where the second position is different from the first position.
In an optional embodiment, the display module 620 is configured to obtain the lens orientation of the camera model corresponding to the virtual object; move the aiming point position of the aiming lens of the reflex sight onto the extension line of the lens orientation; and generate and display the second sighting telescope picture according to the lens orientation, the sighting telescope barrel, and the aiming lens.
In an alternative embodiment, the lens orientation is represented by a lens orientation vector;
the display module 620 is configured to multiply the lens orientation vector by a distance parameter to obtain a lens displacement value, where the distance parameter is used to indicate a distance between the sighting lens and the camera model; adding the coordinate of the camera model in the virtual environment to the lens displacement value to obtain the position coordinate of the aiming lens; setting the position of the sighting lens in the virtual environment according to the position coordinates.
In an alternative embodiment, the display module 620 is further configured to set the lens orientation of the aiming lens to be opposite to the lens orientation of the camera model.
In an alternative embodiment, the control module 640 is further configured to control the virtual object to move at a first speed in the virtual environment, and control the scope barrel of the reflex scope to move at a second speed in the virtual environment, where the first speed is greater than the second speed.
In an optional embodiment, the aiming point is located at a picture center position in both the first virtual environment picture and the second virtual environment picture.
Fig. 15 shows a block diagram of an electronic device 1500 according to an exemplary embodiment of the present invention. The electronic device 1500 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The electronic device 1500 may also be referred to by other names, such as user equipment, portable electronic device, laptop electronic device, or desktop electronic device.
In general, electronic device 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor used to process data in a wake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1502 is used to store at least one instruction, the at least one instruction being executed by the processor 1501 to implement the method of sighting telescope display in a virtual environment provided by the method embodiments of the application.
In some embodiments, the electronic device 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, touch screen display 1505, camera 1506, audio circuitry 1507, positioning assembly 1508, and power supply 1509.
The peripheral device interface 1503 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuitry 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1504 can communicate with other electronic devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1505 is a touch display, the display 1505 also has the ability to capture touch signals on or above its surface. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 1505, disposed on the front panel of the electronic device 1500; in other embodiments, there may be at least two displays 1505, disposed on different surfaces of the electronic device 1500 or in a folding design; in still other embodiments, the display 1505 may be a flexible display disposed on a curved or folded surface of the electronic device 1500. The display 1505 may even be configured in a non-rectangular irregular pattern, that is, a shaped screen. The display 1505 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the electronic device, and the rear camera is disposed on the back of the electronic device. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the electronic device 1500. The microphone may also be an array microphone or an omni-directional acquisition microphone. The speaker is then used to convert electrical signals from the processor 1501 or the radio frequency circuitry 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic location of the electronic device 1500 to implement navigation or LBS (Location Based Service). The positioning component 1508 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1509 is used to supply power to the various components of the electronic device 1500. The power supply 1509 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the electronic device 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 can detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with respect to the electronic device 1500. For example, the acceleration sensor 1511 can detect the components of gravitational acceleration along the three coordinate axes. The processor 1501 can control the touch display screen 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 can also be used to collect motion data for games or for the user.
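As an illustration of how such a landscape/portrait decision could be made, the following minimal sketch (not part of the patent; the function name and the gravity readings are hypothetical) picks an orientation from the gravity components along two axes:

```python
# A minimal sketch, assuming gx and gy are accelerometer readings in m/s^2
# along the device's x (short) and y (long) axes. Not the patented method.

def pick_orientation(gx: float, gy: float) -> str:
    """Return "portrait" when gravity acts mostly along the device's long
    (y) axis, i.e. the device is held upright; otherwise "landscape"."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(pick_orientation(gx=0.5, gy=9.7))  # device upright -> portrait
print(pick_orientation(gx=9.6, gy=0.8))  # device on its side -> landscape
```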
The gyroscope sensor 1512 can detect the body orientation and rotation angle of the electronic device 1500, and can cooperate with the acceleration sensor 1511 to collect the user's 3D motion of the device. Based on the data collected by the gyroscope sensor 1512, the processor 1501 can implement functions such as motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side frame of the electronic device 1500 and/or in a lower layer of the touch display screen 1505. When the pressure sensor 1513 is disposed on the side frame, it can detect the user's holding signal on the electronic device 1500, and the processor 1501 performs left/right-hand recognition or shortcut operations according to that holding signal. When the pressure sensor 1513 is disposed in the lower layer of the touch display screen 1505, the processor 1501 controls the operability controls on the UI according to the user's pressure operation on the touch display screen 1505. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
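The patent does not specify how the holding signal is interpreted; one plausible rule, sketched below, compares the pressure on the two edges. The pressure scale, the 0.2 threshold, and the left/right decision are illustrative assumptions:

```python
# A sketch of one possible holding-signal rule; all values are assumptions.

def guess_holding_hand(left_edge: float, right_edge: float,
                       threshold: float = 0.2) -> str:
    """Guess the gripping hand from normalized side-frame pressure readings
    in [0, 1]; the edge with the stronger reading is taken as the palm side."""
    if max(left_edge, right_edge) < threshold:
        return "unknown"  # no firm grip detected
    return "left" if left_edge > right_edge else "right"

print(guess_holding_hand(0.8, 0.3))  # -> left
```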
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, the processor 1501 can control the display brightness of the touch display screen 1505 according to the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 1501 can also dynamically adjust the shooting parameters of the camera assembly 1506 according to the ambient light intensity collected by the optical sensor 1515.
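A minimal sketch of this brightness adjustment follows; the 10,000-lux ceiling and the linear mapping are assumptions made for illustration, not values from the patent:

```python
# Map ambient light intensity to display brightness; values are assumptions.

def brightness_from_lux(lux: float, max_lux: float = 10_000.0) -> float:
    """Map ambient light intensity (lux) to display brightness in [0.0, 1.0]:
    brighter surroundings raise the brightness, darker ones lower it."""
    clamped = min(max(lux, 0.0), max_lux)  # clamp to the assumed sensor range
    return clamped / max_lux

print(brightness_from_lux(2_500.0))  # -> 0.25
```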
The proximity sensor 1516, also called a distance sensor, is typically disposed on the front panel of the electronic device 1500 and is used to measure the distance between the user and the front of the device. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front of the electronic device 1500 is gradually decreasing, the processor 1501 controls the touch display screen 1505 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1516 detects that this distance is gradually increasing, the processor 1501 controls the touch display screen 1505 to switch from the screen-off state back to the bright-screen state.
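The sketch below illustrates this switching logic; the 5 cm and 8 cm thresholds, and the hysteresis they form, are illustrative assumptions:

```python
# A sketch of proximity-driven screen switching; thresholds are assumptions.

class ScreenController:
    """Switch between bright-screen and screen-off states from proximity
    samples, using two thresholds so the state does not flicker."""

    def __init__(self, near_cm: float = 5.0, far_cm: float = 8.0):
        self.near_cm = near_cm  # closer than this: face is near, turn off
        self.far_cm = far_cm    # farther than this: face has left, light up
        self.state = "bright"

    def on_distance(self, distance_cm: float) -> str:
        if self.state == "bright" and distance_cm < self.near_cm:
            self.state = "off"
        elif self.state == "off" and distance_cm > self.far_cm:
            self.state = "bright"
        return self.state

ctrl = ScreenController()
print(ctrl.on_distance(3.0))   # user approaches -> off
print(ctrl.on_distance(12.0))  # user moves away -> bright
```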
Those skilled in the art will appreciate that the structure shown in FIG. 15 does not constitute a limitation on the electronic device 1500, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
The present application further provides a computer-readable storage medium having at least one instruction, at least one program, set of codes, or set of instructions stored therein, which is loaded and executed by a processor to implement the method for sighting telescope display in a virtual environment provided by the above-mentioned method embodiments.
The present application further provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of sighting telescope display in a virtual environment of the above-described method embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be completed by hardware, or by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disc.
The present invention is not intended to be limited to the particular embodiments shown and described, but is to be accorded the widest scope consistent with the principles and novel features herein disclosed.

Claims (6)

1. A method of sighting telescope display in a virtual environment, the method comprising:
displaying a first virtual environment picture, the first virtual environment picture including a first sighting telescope picture in which a virtual object observes the virtual environment using a reflex sighting telescope, an aiming point in the first sighting telescope picture being located at a first position relative to a sighting telescope barrel of the reflex sighting telescope, and the aiming point in the first sighting telescope picture being located at the midpoint position of the first sighting telescope picture;
controlling the virtual object to move at a first speed in the virtual environment and controlling the sighting telescope barrel of the reflex sighting telescope to move at a second speed in the virtual environment, the first speed being greater than the second speed and the difference between the first speed and the second speed being less than a threshold, the moving including at least one of: controlling the virtual object to translate, controlling the virtual object to steer in a left or right direction, and controlling the virtual object to switch between an upward-looking state and a downward-looking state;
acquiring the lens orientation of a camera model corresponding to the virtual object, wherein the lens orientation is represented by a lens orientation vector;
multiplying the lens orientation vector by a distance parameter to obtain a lens displacement value, wherein the distance parameter is used to indicate the distance of the sighting lens relative to the camera model;
adding the coordinates of the camera model in the virtual environment to the lens displacement value to obtain the position coordinates of the sighting lens;
controlling the sighting lens and a lens material on the sighting lens to move in the virtual environment according to the position coordinates, wherein the position of the lens material relative to the sighting lens remains unchanged, and the lens material is attached to the sighting lens and is used to present the aiming point;
performing camera framing according to the lens orientation of the camera model, and generating and displaying a second sighting telescope picture according to the picture formed in the framing frame after the virtual environment passes through the sighting telescope barrel and the sighting lens, wherein the second sighting telescope picture is a picture in which the virtual object observes the virtual environment through the reflex sighting telescope, the aiming point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel, the second position being different from the first position, the aiming point in the second sighting telescope picture is located at the midpoint position of the second sighting telescope picture, the aiming point is located on an extension line of the lens orientation, and the aiming point is located at the picture center position in both the first virtual environment picture and the second virtual environment picture.
2. The method of claim 1, further comprising:
setting the facing direction of the sighting lens to be opposite to the lens orientation of the camera model.
3. A sighting telescope display device in a virtual environment, the device comprising:
a display module, configured to display a first virtual environment picture, the first virtual environment picture including a first sighting telescope picture in which a virtual object observes the virtual environment using a reflex sighting telescope, an aiming point in the first sighting telescope picture being located at a first position relative to a sighting telescope barrel of the reflex sighting telescope, and the aiming point in the first sighting telescope picture being located at the midpoint position of the first sighting telescope picture;
a control module, configured to control the virtual object to move at a first speed in the virtual environment and to control the sighting telescope barrel of the reflex sighting telescope to move at a second speed in the virtual environment, the first speed being greater than the second speed and the difference between the first speed and the second speed being less than a threshold, the moving including at least one of: controlling the virtual object to translate, controlling the virtual object to steer in a left or right direction, and controlling the virtual object to switch between an upward-looking state and a downward-looking state;
the display module being further configured to: acquire the lens orientation of a camera model corresponding to the virtual object, the lens orientation being represented by a lens orientation vector; multiply the lens orientation vector by a distance parameter to obtain a lens displacement value, wherein the distance parameter is used to indicate the distance of the sighting lens relative to the camera model; add the coordinates of the camera model in the virtual environment to the lens displacement value to obtain the position coordinates of the sighting lens; control the sighting lens and a lens material on the sighting lens to move in the virtual environment according to the position coordinates, wherein the position of the lens material relative to the sighting lens remains unchanged, and the lens material is attached to the sighting lens and is used to present the aiming point; and perform camera framing according to the lens orientation of the camera model, and generate and display a second sighting telescope picture according to the picture formed in the framing frame after the virtual environment passes through the sighting telescope barrel and the sighting lens, wherein the second sighting telescope picture is a picture in which the virtual object observes the virtual environment through the reflex sighting telescope, the aiming point in the second sighting telescope picture is located at a second position relative to the sighting telescope barrel, the second position being different from the first position, the aiming point in the second sighting telescope picture is located at the midpoint position of the second sighting telescope picture, the aiming point is located on an extension line of the lens orientation, and the aiming point is located at the picture center position in both the first virtual environment picture and the second virtual environment picture.
4. The apparatus of claim 3,
the display module is further configured to set the facing direction of the sighting lens to be opposite to the lens orientation of the camera model.
5. An electronic device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the method of sighting telescope display in a virtual environment according to claim 1 or 2.
6. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement a method of sighting telescope display in a virtual environment according to claim 1 or 2.
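To make the geometry recited in claim 1 concrete, the following minimal sketch (Python, not part of the patent; the Vec3 helper and the sample values are illustrative assumptions) reproduces the claimed position computation: the lens displacement value is the lens orientation vector multiplied by the distance parameter, and the sighting-lens position coordinates are the camera coordinates plus that displacement.

```python
# A minimal sketch of the sighting-lens position step in claim 1.
# The Vec3 type and the sample numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def scaled(self, k: float) -> "Vec3":
        return Vec3(self.x * k, self.y * k, self.z * k)

    def plus(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

def sighting_lens_position(camera_pos: Vec3, lens_dir: Vec3,
                           distance: float) -> Vec3:
    """Lens displacement value = lens orientation vector * distance parameter;
    position coordinates = camera coordinates + displacement (claim 1)."""
    displacement = lens_dir.scaled(distance)  # vector * distance parameter
    return camera_pos.plus(displacement)      # camera coords + displacement

# Example: camera at the origin looking down +z, sighting lens 0.5 units ahead.
print(sighting_lens_position(Vec3(0, 0, 0), Vec3(0, 0, 1), 0.5))
# -> Vec3(x=0.0, y=0.0, z=0.5)
```

Because the lens material is attached to the sighting lens, moving the lens to this position carries the aiming point with it, keeping the aiming point on the extension line of the lens orientation as the claims require.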
CN201810539548.1A 2018-05-30 2018-05-30 Method, device and storage medium for displaying sighting telescope in virtual environment Active CN108744511B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810539548.1A CN108744511B (en) 2018-05-30 2018-05-30 Method, device and storage medium for displaying sighting telescope in virtual environment

Publications (2)

Publication Number Publication Date
CN108744511A CN108744511A (en) 2018-11-06
CN108744511B (en) 2022-11-29

Family

Family ID: 64004276

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810539548.1A Active CN108744511B (en) 2018-05-30 2018-05-30 Method, device and storage medium for displaying sighting telescope in virtual environment

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110559647B (en) * 2019-09-25 2023-07-14 网易(杭州)网络有限公司 Control method and device for sight display in virtual shooting game, medium and equipment
CN113256710B (en) * 2021-05-21 2022-08-02 深圳市慧鲤科技有限公司 Method and device for displaying foresight in game, computer equipment and storage medium
CN113730909B (en) * 2021-09-14 2023-06-20 腾讯科技(深圳)有限公司 Aiming position display method and device, electronic equipment and storage medium
CN113769398A (en) * 2021-09-14 2021-12-10 网易(杭州)网络有限公司 Processing method, device and equipment of virtual sighting telescope and storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US7325354B2 (en) * 2004-05-06 2008-02-05 Insight Technology, Inc. Weapon aiming device

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN105993167A (en) * 2014-02-13 2016-10-05 株式会社万代南梦宫娱乐 Image generation system, image generation method, and information storage medium
CN107029425A (en) * 2016-02-04 2017-08-11 网易(杭州)网络有限公司 A kind of control system of shooting game, method and terminal
CN107029428A (en) * 2016-02-04 2017-08-11 网易(杭州)网络有限公司 A kind of control system of shooting game, method and terminal

Non-Patent Citations (2)

Title
Teodor Stimec; "Parallax free red dot sight"; https://optics-info.com/red-dot-sights/image2/; 2014-07-14; full text *
代号画中仙; "What are the respective advantages and disadvantages of triangular and holographic sights?" (三角和全息各自有什么优点和缺点?); https://tieba.baidu.com/p/5659946828?pn=3; 2018-04-22; full text *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant