CN113926194A - Method, apparatus, device, medium, and program product for displaying picture of virtual scene - Google Patents
- Publication number
- CN113926194A (application number CN202111226478.2A)
- Authority
- CN
- China
- Prior art keywords
- pupil
- sighting telescope
- state
- frame
- virtual object
- Prior art date
- Legal status: Withdrawn (the listed status is an assumption and is not a legal conclusion)
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/303—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
- A63F2300/307—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for displaying an additional window with a view from the top of the game field, e.g. radar screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8076—Shooting
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application provides a picture display method, apparatus, and device for a virtual scene, together with a computer-readable storage medium and a computer program product. The method includes: presenting, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop; presenting in the interface a frame of the eyepiece of the sighting telescope, and displaying, in the inner area of the frame, a field-of-view picture of the virtual scene seen by the virtual object through the sighting telescope; displaying, between the frame and the field-of-view picture, a pupil formed by optical imaging of the outer frame of the sighting telescope; and, when the state of the sighting telescope or the state of the virtual object changes, controlling the display effect of the pupil to change synchronously so as to adapt to the changed state. By the method and apparatus, a real and vivid scene picture can be displayed, improving the sense of presence in the virtual scene.
Description
Technical Field
The present application relates to image processing technologies, and in particular to a method, an apparatus, a device, a computer-readable storage medium, and a computer program product for displaying a picture of a virtual scene.
Background
In virtual-scene applications such as games, the sense of presence is a key factor in whether a player can be drawn into the virtual scene. For shooting games in particular, a key factor in immersing the player in the shooting scene is whether the game scene observed by the player through the sighting telescope of a shooting prop looks lifelike, and the related art still lacks a technical means of achieving this sense of presence.
Disclosure of Invention
The embodiments of the present application provide a picture display method, apparatus, and device for a virtual scene, as well as a computer-readable storage medium and a computer program product, which can display real and vivid scene pictures to improve the sense of presence in the virtual scene.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a picture display method of a virtual scene, which comprises the following steps:
presenting, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop;

presenting in the interface a frame of the eyepiece of the sighting telescope, and displaying, in the inner area of the frame, a field-of-view picture of the virtual scene seen by the virtual object through the sighting telescope;

displaying, between the frame and the field-of-view picture, a pupil formed by optical imaging of the outer frame of the sighting telescope;

when the state of the sighting telescope or the state of the virtual object changes, controlling the display effect of the pupil to change synchronously so as to adapt to the changed state.
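Purely as an illustrative sketch (not the patent's reference implementation), the four steps might map onto a client-side update loop as follows; every type and method name here is an assumption:

```typescript
// Illustrative sketch of the four method steps; all names are assumptions.
interface AimState {
  scopeMoving: boolean;  // state of the sighting telescope
  objectMoving: boolean; // state of the virtual object
}

class AimingInterface {
  presentFrameAndView(): void {
    // Steps 1-2: draw the eyepiece frame and, inside it, the field-of-view
    // picture of the virtual scene seen through the sighting telescope.
  }

  drawPupil(effect: "static" | "dynamic"): void {
    // Step 3: draw the pupil ring between the frame and the view picture.
  }

  update(state: AimState): void {
    // Step 4: keep the pupil's display effect in sync with state changes.
    this.presentFrameAndView();
    this.drawPupil(state.scopeMoving || state.objectMoving ? "dynamic" : "static");
  }
}
```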
An embodiment of the present application provides a picture display device of a virtual scene, including:
a first presentation module configured to present, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop;

a second presentation module configured to present in the interface a frame of the eyepiece of the sighting telescope, a field-of-view picture of the virtual scene seen by the virtual object through the sighting telescope being displayed in the inner area of the frame;

a third presentation module configured to display, between the frame and the field-of-view picture, a pupil formed by optical imaging of the outer frame of the sighting telescope;

and a presentation control module configured to control the display effect of the pupil to change synchronously when the state of the sighting telescope or the state of the virtual object changes, so as to adapt to the changed state.
In the above scheme, the first presentation module is further configured to execute a scope-opening operation on the sighting telescope in response to a scope-opening instruction for the shooting prop, and, while the scope-opening operation is executed, to display from a first-person perspective the process of the aiming interface corresponding to the sighting telescope gradually opening;

and the third presentation module is further configured to display, between the frame and the field-of-view picture while the scope-opening operation of the sighting telescope is executed, the process of the pupil formed by optical imaging of the outer frame of the sighting telescope changing from partial to full as the scope-opening operation proceeds.
In the above scheme, the sighting telescope further includes an objective lens and a lens barrel, the objective lens and the eyepiece being located at the two ends of the lens barrel;

the first presentation module is further configured to display, between the frame and the field-of-view picture, a pupil formed by optical imaging of the lens barrel of the sighting telescope, the color of the pupil being consistent with the color of the lens barrel.
In the above scheme, the apparatus further comprises:
and a color switching module configured to, in response to a color switching instruction for the lens barrel, switch the current color of the frame of the eyepiece in the interface to a target color and switch the color of the pupil to the target color.
In the above scheme, the third presentation module is further configured to obtain the material of the objective lens included in the sighting telescope and the material of the eyepiece; to determine an adapted transparency based on the material of the objective lens and the material of the eyepiece; and to display, between the frame and the field-of-view picture and at that transparency, a pupil formed by optical imaging of the outer frame of the sighting telescope.
In the foregoing solution, the third presentation module is further configured to display, between the frame and the field-of-view picture, a pupil in a first state, formed by optical imaging of the outer frame of the sighting telescope, while the sighting telescope or the virtual object is in a static state; the first state is static, and in the first state the pupil comprises two concentric rings, where the color of the outer ring is uniform and the color of the inner ring becomes gradually lighter along the radial direction from the edge of the inner ring toward the center of the eyepiece.
In the foregoing solution, the third presentation module is further configured to display a pupil in a second state, formed by optical imaging of the outer frame of the sighting telescope, while the virtual object is in a traveling state; the second state is dynamic: while the pupil is in the second state it partially masks the field-of-view picture, and the size of the masked area changes.
In the above scheme, the apparatus further comprises:
a fourth presentation module configured to dynamically display the proportion of the field-of-view picture masked by the pupil while the pupil in the second state is displayed.
In the above scheme, the apparatus further comprises:
a state maintaining module configured to control the shooting prop, in response to a continuous shooting instruction for the shooting prop, to shoot continuously at the target aimed at through the sighting telescope, and to keep the pupil in a default state during the continuous shooting, so that the display effect of the pupil is unchanged while the target is shot continuously.
In the above scheme, the apparatus further comprises:
a state control module configured to, in response to a shooting instruction for the shooting prop triggered at a first moment, control the shooting prop to shoot the target aimed at through the sighting telescope, and to control the pupil to be in a default state within a target period starting at the first moment, so that the display effect of the pupil is unchanged during the target period.
In the above scheme, the apparatus further comprises:
a display update module configured to, when a second moment after the first moment arrives and the interval between the second moment and the first moment equals the target period, release the default state of the pupil, acquire the state of the sighting telescope and the state of the virtual object, and update the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
In the above scheme, the shooting prop is further equipped with a side sighting telescope, and the apparatus further includes:
a sighting telescope switching module configured to switch the sighting telescope to the side sighting telescope in response to a sighting telescope switching instruction for the shooting prop, the magnification of the side sighting telescope differing from that of the sighting telescope; and to present an aiming interface corresponding to the side sighting telescope and display in it a first pupil formed by optical imaging of the outer frame of the side sighting telescope, the width of the first pupil corresponding to the magnification of the side sighting telescope.
In the above scheme, the apparatus further comprises:
a magnification selection module configured to present at least two magnification options in the interface; to cancel the displayed interface and display the interface of the virtual scene in response to a switching instruction for the sighting telescope based on the at least two magnification options, and control the shooting prop to mount the target sighting telescope indicated by the switching instruction; to present an aiming interface corresponding to the target sighting telescope in response to a scope-opening instruction triggered from the interface of the virtual scene; and to display, in the aiming interface, a second pupil formed by optical imaging of the outer frame of the target sighting telescope, the width of the second pupil corresponding to the magnification of the target sighting telescope.
In the above scheme, the apparatus further comprises:
a definition adjusting module configured to acquire the light environment of the virtual scene in which the virtual object is located, the light intensity differing between light environments; and, when the light environment in which the virtual object is located changes, to change the definition of the pupil synchronously so as to adapt to the changed light environment.
In the above solution, before the field-of-view picture of the virtual scene seen by the virtual object through the sighting telescope is displayed, the apparatus further includes:

a patch determining module configured to create a patch matching the eyepiece of the sighting telescope and draw a map corresponding to the patch, the map comprising a first part corresponding to the field-of-view picture and a second part corresponding to the pupil, the first part being transparent; and to combine the map with the patch to obtain a target patch;

the third presentation module is further configured to obtain a scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the sighting telescope, and to render the scene patch, the frame patch, and the target patch in a preset rendering order, so as to display, between the frame and the field-of-view picture, a pupil formed by optical imaging of the outer frame of the sighting telescope.
In the foregoing solution, the patch determining module is further configured to obtain an initial map corresponding to the patch, the initial map comprising the first part and an initial part corresponding to the pupil, the color value of each pixel in the initial part corresponding to the default state of the pupil; to determine the state of the sighting telescope and the state of the virtual object, and determine a center point for pixel sampling from those states; and, based on the center point, to perform pixel sampling in the initial map to obtain the map corresponding to the patch.
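As a rough illustration of the sampling step above (not the patent's implementation; the names and the clamping behavior are assumptions), shifting the sampling center according to the scope and object state displaces the pupil ring over the view area:

```typescript
// Sketch of center-offset pixel sampling from the initial map. The initial
// map holds a transparent first part (view area) plus the pupil ring;
// sampling it about a shifted center produces the displaced pupil.
type RGBA = [number, number, number, number];

function sampleMap(
  initial: (u: number, v: number) => RGBA, // initial-map lookup over [0,1]^2
  centerU: number, // sampling center derived from scope/object state
  centerV: number,
  size: number,    // output map resolution (square, assumed)
): RGBA[][] {
  const out: RGBA[][] = [];
  for (let y = 0; y < size; y++) {
    const row: RGBA[] = [];
    for (let x = 0; x < size; x++) {
      // Re-center the UVs so the pupil ring shifts with the aim offset.
      const u = Math.min(1, Math.max(0, x / size + (centerU - 0.5)));
      const v = Math.min(1, Math.max(0, y / size + (centerV - 0.5)));
      row.push(initial(u, v));
    }
    out.push(row);
  }
  return out;
}
```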
An embodiment of the present application provides a terminal device, including:
a memory for storing executable instructions;
and a processor configured to implement the picture display method of a virtual scene provided by the embodiments of the present application when executing the executable instructions stored in the memory.

The embodiments of the present application provide a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the picture display method of a virtual scene provided by the embodiments of the present application.
The embodiment of the application has the following beneficial effects:
In the process of aiming through the sighting telescope included in a shooting prop, in addition to the field-of-view picture of the virtual scene presented in the aiming interface, a pupil formed by optical imaging of the outer frame of the sighting telescope is displayed between the frame of the sighting telescope and the field-of-view picture, and when the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously. In this way a real and vivid scene picture is displayed, improving the sense of presence in the virtual scene.
Drawings
Fig. 1A is a schematic view of an application mode of a picture displaying method of a virtual scene according to an embodiment of the present application;
fig. 1B is a schematic view of an application mode of a picture displaying method of a virtual scene according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device 400 provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a picture displaying method for a virtual scene according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a component structure of a sighting telescope provided by an embodiment of the application;
FIG. 5 is a schematic view of a targeting interface provided by embodiments of the present application;
FIG. 6 is a schematic view of an aiming interface of a process for opening a mirror provided by an embodiment of the present application;
FIG. 7 is a schematic view of a targeting interface of an embodiment of the present application;
FIG. 8 is a schematic view of a display of a scope provided by an embodiment of the present application;
FIG. 9 is a schematic illustration of a map provided by an embodiment of the present application;
FIG. 10 is a schematic view of a map display provided by an embodiment of the present application;
FIG. 11 is a flowchart of a method for generating a map according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a map update provided by an embodiment of the present application;
fig. 13 is a flowchart of a map updating method according to an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the description that follows, the terms "first", "second", and so on are used merely to distinguish between similar objects and do not denote a particular ordering of the objects; it is understood that "first" and "second" may be interchanged in a particular order or sequence where permitted, so that the embodiments of the application described herein can be practiced in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) Client: an application program running in a terminal to provide various services, such as a video playing client or a game client.

2) In response to: indicates the condition or state on which a performed operation depends; when the condition or state it depends on is satisfied, the one or more operations performed may be in real time or may have a set delay. Unless otherwise specified, there is no restriction on the order in which the operations are performed.
3) The virtual scene is a virtual scene displayed (or provided) when an application program runs on a terminal, and the virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, the virtual scene may include sky, land, sea, and the like, the land may include environmental elements such as desert, city, and the like, and the user may control the virtual object to move in the virtual scene.
4) Virtual objects: the images of the various people and things in the virtual scene that can interact, or movable objects in the virtual scene. A movable object may be a virtual character, a virtual animal, an animation character, and so on, such as a character or animal displayed in the virtual scene. A virtual object may be a virtual avatar representing the user in the virtual scene. The virtual scene may include several virtual objects, each of which has its own shape and volume in the virtual scene and occupies part of the space of the virtual scene.
5) The scene data, which represents feature data in the virtual scene, may include, for example, the position of the virtual object in the virtual scene, the time required to wait for various functions provided in the virtual scene (depending on the number of times the same function can be used within a certain time), and attribute values of various states of the game virtual object, such as a life value and a magic value.
6) The stop refers to an entity that limits the light beam in an optical system, such as an aperture that limits the imaging beam, or an aperture or frame that limits the imaging range. It may be the edge of a lens, a frame, or a specially provided screen with a hole. The aperture stop, or its corresponding image, must lie outside the optical system, so that the pupil of the eye can coincide with the aperture stop and a good viewing effect can be achieved.
7) The pupil is the image of the aperture stop and is divided into the entrance pupil and the exit pupil: the entrance pupil is the image of the aperture stop formed by the part of the optical system in front of it, and the exit pupil is the image formed by the part of the optical system behind it.
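As a point of orientation, a standard optics relation (not language from the patent itself) ties the exit pupil of a telescopic sight to its magnification, which is consistent with the later statements that the width of the displayed pupil corresponds to the scope's magnification:

$$ d_{\text{exit}} = \frac{D_{\text{objective}}}{M} $$

where $D_{\text{objective}}$ is the diameter of the objective lens and $M$ is the magnification; the higher the magnification, the smaller the exit pupil.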
The embodiment of the application provides a picture display method and device of a virtual scene, terminal equipment, a computer readable storage medium and a computer program product, which can display real and vivid scene pictures to improve the presence in the virtual scene. In order to facilitate easier understanding of the image display method of the virtual scene provided in the embodiment of the present application, an exemplary implementation scenario of the image display method of the virtual scene provided in the embodiment of the present application is first described.
In some embodiments, the virtual scene may be a picture presented in a military exercise simulation, and a user may simulate a tactic, a strategy or a tactics through virtual objects belonging to different teams in the virtual scene, so that the virtual scene has a great guiding effect on the command of military operations.
In other embodiments, the virtual scene may also be an environment for game characters to interact with, for example, game characters to play against in the virtual scene, and the two parties may interact with each other in the virtual scene by controlling actions of the game characters, so that the user may relieve life stress during the game.
In one implementation scenario, referring to fig. 1A, fig. 1A is a schematic diagram of an application mode of the picture display method for a virtual scene provided in the embodiments of the present application. It applies to application modes in which computation of the data related to the virtual scene 100 can be completed entirely by the graphics processing hardware of the terminal device 400, such as a game in single-player/offline mode, with output of the virtual scene completed through various types of terminal device 400, such as a smartphone, a tablet computer, or a virtual reality/augmented reality device. As an example, the types of graphics processing hardware include the Central Processing Unit (CPU) and the Graphics Processing Unit (GPU).
To form the visual perception of the virtual scene 100, the terminal device 400 computes the data required for display through the graphics computing hardware, completes the loading, parsing, and rendering of the display data, and outputs, at the graphics output hardware, video frames capable of forming visual perception of the virtual scene; for example, two-dimensional video frames are displayed on the screen of a smartphone, or video frames achieving a three-dimensional display effect are projected onto the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perception effect, the terminal device 400 may also form one or more of auditory, tactile, motion, and taste perception by means of different hardware.
As an example, the terminal device 400 runs a client 410 (e.g. a standalone version of a game application), and outputs a virtual scene including role playing during the running process of the client 410, where the virtual scene may be an environment for game role interaction, such as a plain, a street, a valley, and the like for game role battle; taking the example of displaying the virtual scene 100 at the first-person viewing angle, an interface 101 in which the virtual object is aimed by using a sighting telescope included in the shooting prop is presented in the virtual scene 100; the virtual object may be a game character controlled by a user, that is, the virtual object is controlled by a real user, and will move in the virtual scene 100 in response to an operation of the real user on a controller (e.g., a touch screen, a voice-operated switch, a keyboard, a mouse, a joystick, etc.), for example, when the real user moves the joystick to the right, the virtual object will move to the right in the virtual scene 100, and may also remain stationary in place, jump, and control the virtual object to perform a shooting operation, etc.
For example, an interface 101 in which a virtual object aims using a sighting telescope included in a shooting prop is presented in the virtual scene 100; a frame 102 of the eyepiece of the sighting telescope is presented in the aiming interface 101, the field-of-view picture 103 of the virtual scene seen by the virtual object through the sighting telescope is shown in the inner area of the frame, and a pupil 104 formed by optical imaging of the outer frame of the sighting telescope is displayed between the frame 102 and the field-of-view picture 103. When the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously to adapt to the changed state; for example, when the lens barrel of the sighting telescope is black, a black pupil is displayed between the frame 102 and the field-of-view picture 103, and when the barrel color is switched from black to red, the color of the pupil displayed between the frame 102 and the field-of-view picture 103 switches from black to red. Thus, compared with the related art, in which the field-of-view picture fills the whole of the frame 102, the pupil of the telescope is also displayed between the frame 102 and the field-of-view picture 103, shaping a realistic stereoscopic impression and improving the user's sense of presence in the virtual scene.
In another implementation scenario, referring to fig. 1B, fig. 1B is a schematic view of an application mode of the screen presentation method for a virtual scene provided in this embodiment, which is applied to a terminal device 400 and a server 200, and is adapted to complete virtual scene calculation depending on the calculation capability of the server 200 and output the application mode of the virtual scene at the terminal device 400.
Taking the formation of visual perception of the virtual scene 100 as an example, the server 200 computes the display data (e.g., scene data) related to the virtual scene and sends it to the terminal device 400 over the network 300; the terminal device 400 relies on graphics computing hardware to load, parse, and render the computed display data, and on graphics output hardware to output the virtual scene and form the visual perception; for example, two-dimensional video frames may be presented on the display screen of a smartphone, or video frames achieving a three-dimensional display effect may be projected onto the lenses of augmented reality/virtual reality glasses. As for other forms of perception of the virtual scene, it is understood that they can be formed by means of corresponding hardware outputs of the terminal device 400, for example auditory perception using a speaker and tactile perception using a vibrator.
As an example, the terminal device 400 runs a client 410 (e.g., a network-based game application) thereon, and performs game interaction with other users through the connection server 200 (e.g., a game server), the terminal device 400 outputs the virtual scene 100 of the client 410, displays the virtual scene 100 in a first-person perspective, for example, and presents an interface 101 in the virtual scene 100, in which a virtual object is aimed by using a sighting telescope included in the shooting prop; the virtual object may be a game character controlled by a user, that is, the virtual object is controlled by a real user, and will move in the virtual scene 100 in response to an operation of the real user on a controller (e.g., a touch screen, a voice-operated switch, a keyboard, a mouse, a joystick, etc.), for example, when the real user moves the joystick to the right, the virtual object will move to the right in the virtual scene 100, and may also remain stationary in place, jump, and control the virtual object to perform a shooting operation, etc.
For example, an interface 101 in which a virtual object aims using a sighting telescope included in a shooting prop is presented in the virtual scene 100; a frame 102 of the eyepiece of the sighting telescope is presented in the aiming interface 101, the field-of-view picture 103 of the virtual scene seen by the virtual object through the sighting telescope is shown in the inner area of the frame, and a pupil 104 formed by optical imaging of the outer frame of the sighting telescope is displayed between the frame 102 and the field-of-view picture 103. When the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously to adapt to the changed state; for example, when the lens barrel of the sighting telescope is black, a black pupil is displayed between the frame 102 and the field-of-view picture 103, and when the barrel color is switched from black to red, the color of the pupil displayed between the frame 102 and the field-of-view picture 103 switches from black to red. Thus, compared with the related art, in which the field-of-view picture fills the whole of the frame 102, the pupil of the telescope is also displayed between the frame 102 and the field-of-view picture 103, shaping a realistic stereoscopic impression and improving the user's sense of presence in the virtual scene.
In some embodiments, the terminal device 400 may implement the picture display method of a virtual scene provided in the embodiments of the present application by running a computer program. The computer program may be, for example, a native program or software module in an operating system; a native application (APP), i.e., a program that must be installed in the operating system to run, such as a shooting game APP (the client 410 described above); an applet, i.e., a program that only needs to be downloaded into a browser environment to run; or a game applet that can be embedded in any APP. In general, the computer program may be any form of application, module, or plug-in.
Taking a computer program as an application program as an example, in actual implementation, the terminal device 400 is installed and runs with an application program supporting a virtual scene. The application program may be any one of a First-Person Shooting game (FPS), a third-Person Shooting game, a virtual reality application program, a three-dimensional map program, a military simulation program, or a multi-player gun-battle type survival game. The user uses the terminal device 400 to operate virtual objects located in the virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing, building a virtual building. Illustratively, the virtual object may be a virtual character, such as a simulated character or an animated character, among others.
In other embodiments, the embodiments of the present application may also be implemented by Cloud Technology (Cloud Technology), which refers to a hosting Technology for unifying resources of hardware, software, network, and the like in a wide area network or a local area network to implement calculation, storage, processing, and sharing of data.
Cloud technology is a general term for the network, information, integration, management platform, application, and other technologies based on the cloud computing business model; it can form a pool of resources that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support, since the background services of a technical network system require large amounts of computing and storage resources.
For example, the server 200 in fig. 1B may be an independent physical server, may also be a server cluster or a distributed system formed by a plurality of physical servers, and may also be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a CDN, and a big data and artificial intelligence platform. The terminal device 400 may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal device 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, and the embodiment of the present application is not limited thereto.
The structure of the terminal apparatus 400 shown in fig. 1A is explained below. Referring to fig. 2, fig. 2 is a schematic structural diagram of a terminal device 400 provided in an embodiment of the present application, where the terminal device 400 shown in fig. 2 includes: at least one processor 420, memory 460, at least one network interface 430, and a user interface 440. The various components in the terminal device 400 are coupled together by a bus system 450. It is understood that the bus system 450 is used to enable connected communication between these components. The bus system 450 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 450 in fig. 2.
The processor 420 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components; the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 440 includes one or more output devices 441, including one or more speakers and/or one or more visual display screens, that enable the presentation of media content. The user interface 440 also includes one or more input devices 442 including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display screen, camera, other input buttons and controls.
The memory 460 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 460 may optionally include one or more storage devices physically located remote from processor 420.
The memory 460 may include volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 460 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 460 may be capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 461 comprising system programs for handling various basic system services and performing hardware related tasks, such as framework layer, core library layer, driver layer, etc., for implementing various basic services and handling hardware based tasks;
a network communication module 462 for reaching other computing devices via one or more (wired or wireless) network interfaces 430, exemplary network interfaces 430 including: bluetooth, wireless compatibility authentication (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 463 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices 441 (e.g., display screens, speakers, etc.) associated with user interface 440;
an input processing module 464 for detecting one or more user inputs or interactions from one of the one or more input devices 442 and translating the detected inputs or interactions.
In some embodiments, the picture display apparatus of the virtual scene provided in the embodiments of the present application may be implemented in software, and fig. 2 illustrates the picture display apparatus 465 of the virtual scene stored in the memory 460, which may be software in the form of programs and plug-ins, and includes the following software modules: a first presentation module 4651, a second presentation module 4652, a third presentation module 4653 and a presentation control module 4654, which are logical and thus may be arbitrarily combined or further split according to the implemented functions, the functions of which will be explained below.
In other embodiments, the picture displaying apparatus of the virtual scene provided in this embodiment may be implemented in hardware, for example, the picture displaying apparatus of the virtual scene provided in this embodiment may be a processor in the form of a hardware decoding processor, which is programmed to execute the picture displaying method of the virtual scene provided in this embodiment, for example, the processor in the form of the hardware decoding processor may employ one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
The following describes a method for displaying a virtual scene in accordance with an embodiment of the present application with reference to the accompanying drawings. The method for displaying the virtual scene picture provided by the embodiment of the present application may be executed by the terminal device 400 in fig. 1A alone, or may be executed by the terminal device 400 and the server 200 in fig. 1B in a cooperation manner.
Next, a description will be given taking as an example a case where the terminal device 400 in fig. 1A alone executes the screen display method of the virtual scene provided in the embodiment of the present application. Referring to fig. 3, fig. 3 is a flowchart illustrating a picture displaying method of a virtual scene according to an embodiment of the present application, and will be described with reference to the steps illustrated in fig. 3.
It should be noted that the method shown in fig. 3 can be executed by various forms of computer programs running on the terminal device 400, and is not limited to the client 410 described above, but may also be the operating system 461, software modules and scripts described above, so that the client should not be considered as limiting the embodiments of the present application.
Step 101: the terminal presents, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop.

The sighting telescope is used to improve the accuracy with which the shooting prop hits a target, and the virtual object can observe the scene picture of the virtual scene through the sighting telescope.
Referring to fig. 4, fig. 4 is a schematic view of the structure of a sighting telescope provided by an embodiment of the present application. As shown in fig. 4, the sighting telescope includes an objective lens, an image-inverting group, adjusting hand wheels, a lens barrel, an eyepiece, and other components. The objective lens sits at the front end of the sighting telescope and is the component that receives external light: an objective lens of larger diameter receives more light, so at the same distance a virtual object using the sighting telescope sees a clearer image. In general, to gather more light, the surface of the objective lens may be coated with a layer of fluoride, which increases transmission and reduces reflection; a purple or yellow reflection from an objective lens is the reflection caused by such a coating. The adjusting hand wheels sit in the middle of the lens barrel and comprise the direction hand wheel of the sighting telescope and the focusing hand wheel of the objective lens: the direction hand wheel adjusts the horizontal direction to correct for windage and the lead of a moving target, while the focusing hand wheel makes the image clearer and reduces error. The image-inverting group rectifies the image: the convex objective lens forms an inverted, magnified real image beyond twice its focal length, and because the distance from the eyepiece to the pupil is too small, a magnified, inverted virtual image would otherwise be seen; the inverting group therefore re-erects the magnified image seen through the eyepiece. The lens barrel carries the image-inverting group, the adjusting hand wheels, and other parts, and serves to protect the assembly and transmit light; the larger the diameter of the lens barrel, the brighter the light and the smaller the refraction angle, so the image is clearer. The eyepiece sits at the rearmost end of the sighting telescope and further magnifies the image already magnified by the objective lens before transmitting it to the human eye.
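For illustration only, the components above could be modeled client-side along these lines (field names and units are assumptions):

```typescript
// Hypothetical data model for the scope components described above.
interface SightingTelescope {
  objectiveDiameterMm: number; // larger objective gathers more light
  magnification: number;       // overall magnifying power
  barrelColor: string;         // the pupil's color follows the barrel
  barrelDiameterMm: number;    // wider barrel: brighter, clearer image
  hasInvertingGroup: boolean;  // re-erects the inverted real image
  eyepieceFrameColor: string;  // frame presented in the aiming interface
}
```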
Step 102: present in the interface a frame of the eyepiece of the sighting telescope, and display, in the inner area of the frame, the field-of-view picture of the virtual scene seen by the virtual object through the sighting telescope.

Step 103: display, between the frame and the field-of-view picture, a pupil formed by optical imaging of the outer frame of the sighting telescope.
Referring to fig. 5, fig. 5 is a schematic view of an aiming interface provided in an embodiment of the present application. An interface 501 in which the virtual object aims with the sighting telescope is presented from a first-person perspective; a frame 502 of the eyepiece is presented in the interface 501, and the field-of-view picture 503 of the virtual scene seen by the virtual object through the sighting telescope is presented in the inner area of the frame 502; that is, the scene elements of the virtual scene at the positions visible through the sighting telescope, such as virtual trees, stones, and grass, are displayed in the field-of-view picture 503. A pupil, here a black ring 504 of a certain width, is displayed between the frame 502 and the field-of-view picture 503 to shape a realistic stereoscopic impression.
In some embodiments, the terminal may present, from a first-person perspective, the interface in which the virtual object aims using the sighting telescope included in the shooting prop as follows: in response to a scope-opening instruction for the shooting prop, a scope-opening operation is executed on the sighting telescope; while the scope-opening operation is executed, the process of the aiming interface corresponding to the sighting telescope gradually opening is displayed from the first-person perspective. Accordingly, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the field-of-view picture as follows: while the scope-opening operation of the sighting telescope is executed, the process of the pupil changing from partial to full is displayed between the frame and the field-of-view picture as the operation proceeds.

The scope-opening operation is the process of moving the sighting telescope included in the shooting prop to the sight-line position of the virtual object; it is also called an aiming or focusing operation. For example, referring to fig. 6, a schematic view of the aiming interface during the scope-opening process provided by an embodiment of the present application: while the scope-opening operation is executed, the aiming interface for viewing the virtual scene through the sighting telescope opens gradually, i.e., the field-of-view picture in the aiming interface goes from absent to full; correspondingly, the pupil changes from partial to full as the operation is executed, and when the scope-opening operation completes, the pupil is evenly distributed between the frame and the field-of-view picture.
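A minimal sketch of the partial-to-full transition during the scope-opening animation (duration, linear easing, and names are assumptions):

```typescript
// Sketch: interpolating the pupil from partial (0) to full (1) while the
// scope-opening operation plays out. Linear easing is an assumption.
function pupilFill(elapsedMs: number, openDurationMs: number): number {
  // 0 = pupil absent, 1 = pupil evenly distributed between frame and view.
  return Math.min(1, Math.max(0, elapsedMs / openDurationMs));
}

// Example: 300 ms into a 500 ms scope-opening animation -> fill 0.6.
const fill = pupilFill(300, 500);
```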
In some embodiments, if the user triggers a scope-closing operation while the scope-opening operation is in progress, the display of the aiming interface is cancelled and the perspective is switched, for example from the first-person perspective to a third-person perspective, presenting a picture of the virtual scene from the third-person perspective.
In some embodiments, the sighting telescope further includes an objective lens and a lens barrel, the objective lens and the eyepiece being located at the two ends of the lens barrel. The terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the field-of-view picture as follows: a pupil formed by optical imaging of the lens barrel of the sighting telescope is displayed between the frame and the field-of-view picture, the color of the pupil being consistent with the color of the lens barrel.

In general, the lens barrel connecting the eyepiece and the objective lens determines the color of the pupil; to give the user a realistic impression, the pupil may therefore be displayed in the color of the lens barrel. For example, if the lens barrel is black, a black pupil is displayed between the frame and the field-of-view picture.
In some embodiments, the terminal may switch the pupil's color as follows: in response to a color switching instruction for the lens barrel, the current color of the frame of the eyepiece in the interface is switched to a target color, and the color of the pupil is switched to the target color.

The color switching instruction may be triggered from a color switching control presented in the aiming interface, or by pressing a color switching key on the shooting prop. In some embodiments, several colors can be preset for the lens barrel of the sighting telescope and arranged in a fixed order, and color switching proceeds through that order. For example, if the current lens barrel is black, the frame of the eyepiece in the aiming interface is black and the pupil is also black; if red immediately follows black in the order, then on receiving a color switching instruction the terminal automatically switches the frame of the eyepiece in the aiming interface from black to red and the color of the pupil from black to red. In other embodiments, on receiving the color switching instruction the terminal may present at least two color options for the user to choose from; when the user selects the option for a target color, the terminal, in response to the selection, switches the current color of the frame of the eyepiece in the aiming interface to the selected target color and switches the color of the pupil from its current color to the target color.
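As a loose illustration of the preset-order variant described above (the palette, its order, and the field names are assumptions, not the patent's data):

```typescript
// Sketch of cycling the barrel color in a preset order on each color
// switching instruction; the frame and pupil follow the barrel's color.
const BARREL_COLORS = ["black", "red", "gold"];

function applyColorSwitch(scope: {
  barrelColor: string;
  frameColor: string;
  pupilColor: string;
}): void {
  const i = BARREL_COLORS.indexOf(scope.barrelColor);
  const target = BARREL_COLORS[(i + 1) % BARREL_COLORS.length];
  scope.barrelColor = target;
  scope.frameColor = target; // the eyepiece frame switches to the target color
  scope.pupilColor = target; // and the pupil switches with it
}
```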
In some embodiments, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the field-of-view picture as follows: the material of the objective lens included in the sighting telescope and the material of the eyepiece are obtained; an adapted transparency is determined based on the material of the objective lens and the material of the eyepiece; and a pupil at that transparency, formed by optical imaging of the outer frame of the sighting telescope, is displayed between the frame and the field-of-view picture.

Here, the transparency effect of the pupil is related to the materials of the eyepiece and the objective lens in the sighting telescope: different materials have different absorbance and transmittance. To make the pupil observed through the sighting telescope more realistic, the corresponding transparency can be determined from the materials of the objective lens and the eyepiece, and the pupil displayed at that transparency.
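A minimal sketch of the material-to-transparency mapping, assuming a simple transmittance table; the material names and numeric values are invented for illustration:

```typescript
// Sketch: deriving the pupil's alpha from the two lens materials.
// Material names and transmittance values are illustrative assumptions.
const TRANSMITTANCE: Record<string, number> = {
  "coated-glass": 0.95,
  "plain-glass": 0.9,
  "acrylic": 0.85,
};

function pupilAlpha(objectiveMaterial: string, eyepieceMaterial: string): number {
  const t =
    (TRANSMITTANCE[objectiveMaterial] ?? 0.9) *
    (TRANSMITTANCE[eyepieceMaterial] ?? 0.9);
  // The less light the pair of lenses transmits, the more opaque the pupil.
  return 1 - t;
}
```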
In some embodiments, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the field-of-view picture as follows: while the sighting telescope or the virtual object is in a static state, a pupil in a first state, formed by optical imaging of the outer frame of the sighting telescope, is displayed between the frame and the field-of-view picture. The first state is static: in it, the pupil comprises two concentric rings, where the color of the outer ring is uniform and the color of the inner ring becomes gradually lighter along the radial direction from the edge of the inner ring toward the center of the eyepiece.

For example, in fig. 5, when the virtual object using the sighting telescope, or the sighting telescope itself, is static, light in the virtual scene enters the sighting telescope parallel to its axis, the objective lens and the eyepiece lie on the same axis, and the light is not blocked; the pupil seen through the sighting telescope therefore shows a fading effect, with the displayed pupil divided into two concentric rings: the color of the outer ring is constant, while the color of the inner ring fades along the radial direction toward the center of the eyepiece.
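A sketch of how the two-ring, fade-toward-center look might be computed per radius; the normalized thresholds are assumptions:

```typescript
// Sketch of the first-state (static) pupil as two concentric rings.
// Radii are normalized to the eyepiece radius; thresholds are assumed.
function staticPupilAlpha(r: number): number {
  const innerEdge = 0.75;    // where the inner, fading ring begins
  const ringBoundary = 0.85; // boundary between inner and outer ring
  if (r >= ringBoundary) return 1; // outer ring: uniform color
  if (r >= innerEdge) {
    // inner ring: gradually lighter toward the eyepiece center
    return (r - innerEdge) / (ringBoundary - innerEdge);
  }
  return 0; // inside the rings: the view picture shows through
}
```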
In some embodiments, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the field-of-view picture as follows: while the virtual object is in a traveling state, a pupil in a second state, formed by optical imaging of the outer frame of the sighting telescope, is displayed. The second state is dynamic: while the pupil is in the second state it partially masks the field-of-view picture, and the size of the masked area changes.

Here, when the virtual object is traveling, the shooting prop may shake, so light in the virtual scene cannot enter the sighting telescope parallel to its axis; the light enters obliquely, and part of it is blocked by the lens barrel, producing an imaging difference. For example, referring to fig. 7, a schematic view of an aiming interface of an embodiment of the present application: while the virtual object travels, the pupil seen in the aiming interface masks part of the field-of-view picture, and the size of the masked area changes dynamically as the virtual object moves.

In some embodiments, while displaying the pupil in the second state, the terminal may also dynamically show the proportion of the field-of-view picture masked by the pupil. The mask proportion indicates how far the orientation of the shooting prop deviates from the direction of the virtual object's line of sight, prompting the player to adjust the orientation of the shooting prop.
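A rough geometric sketch of the mask proportion, under the stated idea that an oblique sight line lets the barrel block part of the view; the linear model is an assumption:

```typescript
// Sketch: estimating the share of the view picture masked by the pupil
// while the virtual object travels. Quantities are normalized to the
// eyepiece radius; the linear growth with offset is an assumption.
function maskRatio(
  axisOffset: number,   // lateral offset of the sight line
  restingWidth: number, // pupil ring width when at rest
): number {
  // A larger offset lets the barrel block more of the view; clamp at 1.
  return Math.min(1, restingWidth + Math.abs(axisOffset));
}
```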
In some embodiments, the terminal can also control the shooting prop to continuously shoot the target aimed at by the sighting telescope in response to a continuous shooting instruction for the shooting prop, and keep the pupil in a default state during the continuous shooting, so that the display effect of the pupil remains unchanged while the aimed target is shot continuously.
Here, on the basis of displaying the pupil between the frame and the view picture: normally, when the shooting prop is used to shoot at a target, the width of the pupil may increase because of the angle of incidence, or the sighting telescope may shake while the scope is opened. In either case the presented pupil would change, which would negatively affect aiming precision.
Step 104: when the state of the scope or the state of the virtual object changes, control the display effect of the pupil to change synchronously to adapt to the changed state.
Here, after the pupil is displayed, when the state of the scope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously to adapt to the changed state of the scope or the virtual object.
In some embodiments, the terminal may control the shooting prop to shoot the target aimed at by the sighting telescope in response to a shooting instruction for the shooting prop triggered at the first time, and control the pupil to be in a default state in a target period starting at the first time, so that the display effect of the pupil in the target period is unchanged.
Each time a shooting instruction is triggered and the shooting prop is controlled to fire a bullet at the aimed target, the pupil can be controlled to be in a default state during the target period immediately following the shot. For example, if the pupil is a black edge, the width of the black edge is constrained to stay within a fixed range during the target period, so that the black edge cannot cover too much of the view picture; this meets the user's need to shoot the target several times within a short time.
In some embodiments, when a second time after the first time arrives and a time interval between the second time and the first time is the same as the target period, the default state of the pupil is released and the state of the scope and the state of the virtual object are acquired; and updating the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
Here, when the duration for which the pupil has been controlled to be in the default state reaches the length of the target period, the default state of the pupil is released, and the display effect of the pupil is updated in real time according to the state of the scope and the state of the virtual object, so that the updated display effect adapts to their current states.
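A minimal sketch of this lock-and-release timing, with hypothetical names and an assumed period length, might look like:

```python
import time

TARGET_PERIOD = 0.4  # seconds; illustrative value, not from this application

def compute_pupil_state(scope_state: str, object_state: str) -> str:
    # Placeholder state logic: stationary -> static ring state, else dynamic.
    if scope_state == "static" and object_state == "static":
        return "static"
    return "dynamic"

class PupilController:
    def __init__(self) -> None:
        self._locked_until = 0.0

    def on_shot_fired(self) -> None:
        # First moment: lock the pupil into its default state for the period.
        self._locked_until = time.monotonic() + TARGET_PERIOD

    def display_state(self, scope_state: str, object_state: str) -> str:
        if time.monotonic() < self._locked_until:
            return "default"   # display effect unchanged within the period
        # Second moment reached: release the lock and update the display
        # effect from the current scope and virtual-object states.
        return compute_pupil_state(scope_state, object_state)
```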
In some embodiments, the shooting prop is further equipped with a side scope, and the terminal is further capable of switching the scope to the side scope in response to a scope switching instruction for the shooting prop, wherein the magnification of the side scope is different from that of the scope; presenting an aiming interface corresponding to the side sighting telescope, and displaying a first pupil formed by optically imaging an outer frame of the side sighting telescope in the aiming interface, wherein the width of the first pupil corresponds to the magnification of the side sighting telescope.
In practical applications, scopes with different magnifications differ in the diameter and length of their objective or eyepiece lenses, and the widths of the pupils they form differ accordingly. As shown in fig. 8, a display schematic diagram of the sighting telescope provided by the embodiment of the application, after the current scope is switched to a side scope with a different magnification, the pupil corresponding to the side scope's magnification is displayed in the aiming interface corresponding to the side scope.
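As a rough illustration, pupil width can be tied to magnification via the real optical relation exit-pupil diameter = objective diameter / magnification; mapping that diameter to a rendered black-edge width, as below, is an assumption of this sketch:

```python
def exit_pupil_diameter(objective_diameter_mm: float, magnification: float) -> float:
    """Real optical relation for a telescopic sight."""
    return objective_diameter_mm / magnification

def black_edge_width(objective_diameter_mm: float, magnification: float,
                     eyepiece_radius: float = 1.0) -> float:
    """Normalized width of the pupil ring: a smaller exit pupil leaves a
    wider dark edge inside the fixed eyepiece frame (illustrative mapping)."""
    d = exit_pupil_diameter(objective_diameter_mm, magnification)
    d_max = exit_pupil_diameter(objective_diameter_mm, 1.0)
    return eyepiece_radius * (1.0 - d / d_max)

print(black_edge_width(40.0, 4.0))  # 0.75 -> higher magnification, wider edge
print(black_edge_width(40.0, 2.0))  # 0.50 -> lower-power side scope, narrower edge
```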
In some embodiments, in the aiming interface, the terminal may also present at least two magnification options; in response to a switching instruction for the sighting telescope based on the at least two magnification options, cancel display of the aiming interface so as to display the interface of the virtual scene, and control the shooting prop to assemble the target sighting telescope indicated by the switching instruction; present an aiming interface corresponding to the target sighting telescope in response to an open-mirror instruction triggered based on the interface of the virtual scene; and display, in the aiming interface corresponding to the target sighting telescope, a second pupil formed by optically imaging the outer frame of the target sighting telescope, the width of the second pupil corresponding to the magnification of the target sighting telescope.
Here, in the interface of the virtual scene, the terminal may further present options for scope selection, where the scopes corresponding to different options have different magnifications. When the user selects a target option, the terminal receives a switching instruction for the target scope corresponding to that option, cancels the current aiming interface, and controls the shooting prop to assemble the target scope in response to the switching instruction; when an open-mirror operation is performed for the target scope, the terminal presents the aiming interface corresponding to the target scope and presents the pupil associated with it.
In some embodiments, the terminal may further obtain light environments in a virtual scene in which the virtual object is located, wherein light intensities in different light environments are different; when the light environment of the virtual object is changed, the definition of the pupil is synchronously changed so as to adapt to the changed light environment.
In addition to synchronously changing the definition of the pupil, the visibility of the pupil can also be changed synchronously when the light environment in which the virtual object is located changes.
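A sketch of synchronizing the pupil's definition and visibility with the light environment follows; the environment names, intensities, and linear mappings are invented for illustration:

```python
# Hypothetical light environments and intensities in [0, 1].
LIGHT_INTENSITY = {"noon": 1.0, "dusk": 0.45, "night": 0.15}

def pupil_render_params(environment: str) -> dict:
    """Dimmer scenes blur the pupil edge more and make it less visible."""
    intensity = LIGHT_INTENSITY.get(environment, 0.5)
    return {
        "definition": 0.3 + 0.7 * intensity,  # edge sharpness of the pupil
        "visibility": 0.2 + 0.8 * intensity,  # overall opacity scaling
    }

print(pupil_render_params("night"))  # low definition and visibility at night
```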
In some embodiments, before displaying the view picture of the virtual scene seen by the virtual object through the sighting telescope, the terminal may further create a patch matching the eyepiece of the sighting telescope and draw a map corresponding to the patch, wherein the map includes a first portion corresponding to the view picture and a second portion corresponding to the pupil, and the first portion is a transparent color; and combine the map with the patch to obtain a target patch. Accordingly, the terminal can display the pupil formed by optically imaging the outer frame of the sighting telescope between the frame and the view picture as follows: acquire a scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the sighting telescope; and render the scene patch, the frame patch, and the target patch in a preset rendering order, so as to display the pupil formed by optically imaging the outer frame of the sighting telescope between the frame and the view picture.
For example, the scene patch is rendered before the frame patch, and the frame patch before the target patch; rendering the three in this order displays the pupil formed by optically imaging the outer frame of the sighting telescope between the frame and the view picture.
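The preset rendering order might be expressed, in a hypothetical engine-agnostic form, as sorting patches by an order key before drawing; the Patch type and the print-based draw call are stand-ins for a real engine API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Patch:
    name: str
    order: int  # lower values draw earlier

def render(patches: List[Patch]) -> None:
    # Draw back-to-front so the pupil overlay ends up between the frame
    # and the field-of-view picture.
    for p in sorted(patches, key=lambda p: p.order):
        print(f"drawing {p.name}")  # placeholder for the engine draw call

render([
    Patch("target patch (pupil map)", order=2),
    Patch("scene patch (virtual scene)", order=0),
    Patch("frame patch (scope outer frame)", order=1),
])
```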
In some embodiments, the terminal may draw the map corresponding to the patch as follows: acquiring an initial map corresponding to the patch, wherein the initial map comprises the first portion and an initial portion corresponding to the pupil, and the color value of each pixel in the initial portion corresponds to the default state of the pupil; determining the state of the sighting telescope and the state of the virtual object, and determining the center point of pixel sampling according to those states; and performing pixel sampling in the initial map based on the center point to obtain the map corresponding to the patch.
Here, in practical applications, the pupil has attributes such as width, color, gradient, and transparency, and the corresponding attributes can be set to achieve the desired pupil effect. For example, the color value of each pixel in the initial map corresponding to the patch corresponds to a default state of the pupil, such as the transparent first portion, the pupil width (corresponding to the width of the black edge), and the gradient size (corresponding to the gradient of the black edge). The state of the sighting telescope and the state of the virtual object are then determined, the center point of pixel sampling is determined from those states, and finally pixel sampling is performed in the initial map based on the center point to obtain the map corresponding to the patch. For example, a circle is drawn with the center point as its center and the size of the first portion as its radius; inside this circle is the pure white part, i.e., the region where the view picture is located. Concentric rings are then obtained by drawing, around the same center point, one circle with the size of the first portion as radius and another with the sum of the first-portion size and the gradient size as radius; the inner ring is a gradient area, and the remainder is a black area.
In practical application, when the virtual object or the sighting telescope is in motion, the pupil can be made to change with it by modifying the coordinate values used when sampling the map. For example, when the virtual object moves to the left, the pupil on the left should be wider, so according to the current motion state of the virtual object, the sampling center is also moved to the left before sampling. This yields an updated map for the corresponding patch, and rendering based on the updated map presents the desired pupil effect, for example the pupil changing from partial to full in the aiming interface as the open-mirror operation is performed.
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described. Taking as an example a virtual scene in which a virtual object is controlled to observe the game picture through the sighting telescope of a shooting prop, in order to show a real and vivid picture and improve the sense of presence in the game, the embodiment of the application provides a picture display method for a virtual scene: an interface in which the virtual object aims using the sighting telescope included in the shooting prop is presented from a first-person perspective; the frame of the eyepiece of the sighting telescope is presented in the aiming interface, and the view picture of the virtual scene seen by the virtual object through the sighting telescope is presented in the inner area of the frame; a pupil formed by optically imaging the outer frame of the sighting telescope is displayed between the frame and the view picture; and when the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously to adapt to the changed state, so that a vivid pupil effect is presented using real optical characteristics. The following description takes the case where the pupil is a black edge of a certain width as an example.
As shown in fig. 6, during the process of controlling the virtual object to perform the open-mirror operation, the aiming interface through which the virtual object views the virtual scene via the sighting telescope is gradually opened, i.e., the view picture in the aiming interface goes from absent to full. Correspondingly, the pupil (black edge) presented in the aiming interface changes from partial to full as the open-mirror operation proceeds, and when the operation completes, the pupil (black edge) is uniformly distributed between the frame and the view picture.
As shown in fig. 5, when the sighting telescope is in a static state, light in the virtual scene enters the sighting telescope in a direction parallel to its axis; in this case the objective lens and the eyepiece lie on the same axis, the light is not blocked, and the pupil (black edge) observed through the sighting telescope exhibits a gradient effect: the displayed black edge is divided into two concentric rings, where the color of the black edge in the outer ring is unchanged and the color of the black edge in the inner ring fades radially toward the center of the eyepiece.
As shown in fig. 7, when the virtual object is in a traveling state, the shooting prop may be shaken, so that the light in the virtual scene cannot be incident on the sighting telescope along a direction parallel to the sighting telescope, that is, the light is incident on the sighting telescope obliquely, and a part of the light is blocked by the lens barrel, resulting in an imaging difference.
In order to enrich the display styles of the pupil, the embodiment of the application also provides several adjustable options, such as the color, width, gradient, and transparency of the pupil. The color of the pupil is consistent with the color of the lens barrel; for example, when the lens barrel is black, a black pupil, referred to as a black edge for short, is displayed between the frame and the view picture. The width and gradient of the pupil are related to the incident angle of the light; the transparency of the pupil is related to the materials of the eyepiece and the objective lens in the sighting telescope, whose absorbance and transmittance differ by material, so the pupil can be displayed with a transparency determined from the materials of the objective lens and the eyepiece.
In practical application, on the basis of displaying the pupil (black edge) between the frame and the view picture: normally, when the virtual object is controlled to use the shooting prop to shoot at a target, the width of the pupil may increase because of the incidence angle; in such a case, if the presented pupil (black edge) changed synchronously, the changed pupil (black edge) would negatively affect aiming precision.
During rendering, in order to display the pupil (black edge) effect, a patch matching the eyepiece of the sighting telescope can be created and a map corresponding to the patch drawn, where the map includes a first portion corresponding to the view picture and a second portion corresponding to the pupil (black edge), and the first portion is a transparent color; the map is combined with the patch to obtain a target patch; a scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the sighting telescope are acquired; and the scene patch, the frame patch, and the target patch are rendered in a preset rendering order. The rendered interface is the aiming interface of the sighting telescope, which presents the frame of the eyepiece and a view picture in the inner area of the frame, so that the pupil (black edge) is displayed between the frame and the view picture.
Referring to fig. 9, fig. 9 is a schematic diagram of a map provided in an embodiment of the present application. In fig. 9, the white portion is the first portion of transparent color and the black portion is the second, non-transparent portion. In the normal state of the scope, color values are sampled on the map using the center of the map as the circle center and the side length of the map (its length and width are equal) as the diameter, giving a circular patch; the gray portion in the middle is the transition between the pupil (black edge) viewed through the scope and the view picture.
In practical application, since attributes such as the width, gradient, and transparency of the pupil (black edge) are required to be adjustable by design, the actual sampling result can be modified dynamically during sampling according to the configured attribute values, the position of the current sampling coordinate, and the transparency; that is, a suitable map can be generated dynamically. As shown in fig. 10, a map display diagram provided in the embodiment of the present application, the map in fig. 10 is generated from attributes such as the width, gradient, and transparency of the black edge.
Referring to fig. 11, fig. 11 is a flowchart of a method for generating a map provided in an embodiment of the present application, where the method includes:
step 201: and acquiring an initial mapping corresponding to the patch.
Wherein the color value of each pixel in the initial map corresponds to a default state of the pupil, such as the first portion of the clear color (i.e., white border in FIG. 10), the size of the black border (corresponding to the width of the black border), and the size of the gradient color (corresponding to the gradient of the black border);
step 202: and determining the state of the sighting telescope and the state of the virtual object, and determining the central point of the pixel sampling according to the state of the sighting telescope and the state of the virtual object.
Step 203: and based on the central point, carrying out pixel sampling in the initial mapping to obtain the mapping corresponding to the patch.
Here, a circle is drawn with the center point as its center and the white-edge size as its radius; inside this circle is the pure white part, i.e., the region where the view picture is located. Then, around the same center point, circles are drawn with the white-edge size as radius and with the sum of the white-edge size and the gradient size as radius, giving concentric rings; the inner ring is a gradient area, and the rest is a black area.
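Steps 201-203 can be sketched as per-pixel map generation around the sampling center; the resolution, radii, and RGBA encoding below are assumptions of this sketch:

```python
import math

def generate_map(size: int, center: tuple, white_r: float, grad_w: float):
    """Return a size x size grid of RGBA tuples (components in 0..255)."""
    cx, cy = center
    pixels = []
    for y in range(size):
        row = []
        for x in range(size):
            r = math.hypot(x - cx, y - cy)
            if r < white_r:
                row.append((255, 255, 255, 0))       # transparent: view picture
            elif r < white_r + grad_w:
                t = (r - white_r) / grad_w           # 0 at inner edge, 1 at outer
                row.append((0, 0, 0, int(255 * t)))  # gradient ring toward black
            else:
                row.append((0, 0, 0, 255))           # solid black edge
        pixels.append(row)
    return pixels

# Default state: the sampling center coincides with the map center.
tex = generate_map(256, center=(128.0, 128.0), white_r=90.0, grad_w=20.0)
```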
In practical application, when the virtual object or the sighting telescope is in a moving state, the pupil can be made to change with it by modifying the coordinate values used when sampling the map, i.e., sampling is no longer centered on the original center point. As shown in fig. 12, a schematic diagram of map updating provided in the embodiment of the present application: for example, when the virtual object moves to the left, the black edge on the left of the pupil should widen, so according to the current motion state of the virtual object, the center of the sampling circle is also shifted to the left before sampling, yielding an updated map for the corresponding patch. Rendering based on the updated map presents the desired pupil effect, for example the pupil (black edge) changing from partial to full in the aiming interface as the open-mirror operation is performed.
Referring to fig. 13, fig. 13 is a flowchart of a method for updating a map according to an embodiment of the present application. The method includes: in step 301, acquiring the moving speed of the virtual object; in step 302, determining a deviation magnitude for the center point based on the moving speed of the virtual object, the deviation magnitude being proportional to the moving speed; in step 303, determining an updated center point based on the deviation magnitude and the original pixel-sampling center point; in step 304, performing pixel sampling in the initial map based on the updated center point to obtain an updated map corresponding to the patch. After the updated map is obtained, rendering the picture based on it presents the desired pupil effect.
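A sketch of steps 301-304, with an assumed proportionality constant and reusing generate_map from the sketch above, follows; the sign convention for the offset direction is also an assumption of this sketch:

```python
OFFSET_PER_SPEED = 4.0  # assumed tuning constant: pixels of offset per unit speed

def updated_center(original_center, velocity):
    """Steps 302-303: the deviation magnitude is proportional to the movement
    speed and is applied to the original sampling center along the movement
    direction."""
    ox, oy = original_center
    vx, vy = velocity
    return (ox + OFFSET_PER_SPEED * vx, oy + OFFSET_PER_SPEED * vy)

# Moving left at speed 1.5: the sampling center shifts 6 px to the left, so
# the transparent disc sits off-center and the black edge widens on one side.
center = updated_center((128.0, 128.0), velocity=(-1.5, 0.0))
print(center)  # (122.0, 128.0)

# Step 304: resample with the shifted center, e.g. using generate_map from
# the sketch above:
#   tex = generate_map(256, center=center, white_r=90.0, grad_w=20.0)
```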
In this way, during aiming through the sighting telescope, a real optical phenomenon from everyday life is introduced into the aiming interface, so that a realistic and vivid pupil effect can be displayed, shaping a distinctive sense of in-game presence.
Continuing with the exemplary structure of the screen presentation apparatus 465 of the virtual scene provided in the embodiment of the present application implemented as a software module, in some embodiments, the software module stored in the screen presentation apparatus 465 of the virtual scene in the memory 460 in fig. 2 may include:
a first presenting module 4651, configured to present, using a first person perspective, an interface at which a virtual object is aimed using a sighting telescope included in a shooting prop;
a second rendering module 4652, configured to render, in the interface, a frame of an eyepiece of the sighting telescope, where in an inner region of the frame, a view frame of a virtual scene viewed by the virtual object through the sighting telescope is displayed;
a third rendering module 4653, configured to display a pupil formed by optically imaging an outer frame of the scope between the frame and the view frame;
a presentation control module 4654 configured to control the display effect of the pupil to change synchronously to adapt to the changed state when the state of the scope or the state of the virtual object changes.
In some embodiments, the first presentation module is further configured to perform an open mirror operation for the scope in response to an open mirror instruction for the shooting prop;
in the process of executing the open mirror operation, adopting a first-person visual angle to display the process of gradually opening the aiming interface corresponding to the sighting telescope;
and the third rendering module is further configured to display, between the frame and the visual field picture, a change process of the pupil formed by optically imaging the outer frame of the sighting telescope from a partial state to a full state as the open-mirror operation is performed, in the process of executing the open-mirror operation of the sighting telescope.
In some embodiments, the sighting telescope further comprises an objective lens and a lens barrel, wherein the objective lens and the eyepiece lens are respectively positioned at two ends of the lens barrel;
the first presenting module is further configured to display a pupil formed by optically imaging the lens barrel of the sighting telescope between the frame and the view field picture, and a color of the pupil is consistent with a color of the lens barrel.
In some embodiments, the apparatus further comprises:
and the color switching module is used for responding to a color switching instruction aiming at the lens barrel, switching the current color of the frame of the eyepiece in the interface into a target color, and switching the color of the pupil into the target color.
In some embodiments, the third rendering module is further configured to acquire a material of an objective lens included in the scope and a material of the eyepiece;
determining an adapted transparency based on the material of the objective lens and the material of the eyepiece;
and displaying a pupil formed by optically imaging the outer frame of the sighting telescope between the frame and the visual field picture based on the transparency.
In some embodiments, the third rendering module is further configured to display a pupil in the first state formed by optically imaging an outer frame of the scope between the frame and the field of view during the time when the scope is in a static state or the virtual object is in a static state;
the first state is static, and the pupil comprises two concentric rings when in the first state, wherein the color of the pupil in the outer ring is the same, and the color of the pupil in the inner ring becomes gradually lighter along the radial direction from the edge of the inner ring to the center of the eyepiece.
In some embodiments, the third rendering module is further configured to display a pupil in a second state formed by optically imaging an outer frame of the scope while the virtual object is in the travel state;
the second state is dynamic, the pupil partially covers the view field picture in the process that the pupil is in the second state, and the size of the covered view field picture is changed.
In some embodiments, the apparatus further comprises:
and the fourth rendering module is used for dynamically displaying the mask proportion of the pupil to the visual field picture in the process of displaying the pupil in the second state.
In some embodiments, the apparatus further comprises:
the state maintaining module is used for responding to a continuous shooting instruction aiming at the shooting prop and controlling the shooting prop to continuously shoot the target aimed at by the sighting telescope;
and keeping the pupil in a default state during continuous shooting of the object, so that the display effect of the pupil is unchanged during continuous shooting of the object.
In some embodiments, the apparatus further comprises:
a state control module, configured to, in response to a shooting instruction for the shooting prop triggered at a first time, control the shooting prop to shoot the target aimed at by the sighting telescope, and control the pupil to be in a default state in a target period taking the first time as its starting moment, so that the display effect of the pupil in the target period is unchanged.
In some embodiments, the apparatus further comprises:
a display update module, configured to, when a second time after the first time arrives and the time interval between the second time and the first time is the same as the target period, release the default state of the pupil and acquire the state of the sighting telescope and the state of the virtual object;
and updating the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
In some embodiments, the shooting prop is further equipped with a side sighting telescope, and the apparatus further comprises:
the sighting telescope switching module is used for responding to a sighting telescope switching instruction aiming at the shooting prop and switching the sighting telescope into the side sighting telescope, wherein the magnification of the side sighting telescope is different from that of the sighting telescope;
presenting a sighting interface corresponding to the side sighting telescope, and displaying a first pupil formed by optically imaging an outer frame of the side sighting telescope in the sighting interface, wherein the width of the first pupil corresponds to the magnification of the side sighting telescope.
In some embodiments, the apparatus further comprises:
the magnification selection module is used for presenting at least two magnification options in the interface;
based on the at least two magnification options, in response to a switching instruction for the sighting telescope, canceling display of the aiming interface so as to display the interface of the virtual scene, and controlling the shooting prop to assemble a target sighting telescope indicated by the switching instruction;
presenting a sighting interface corresponding to the target sighting telescope in response to a lens opening instruction triggered based on the interface of the virtual scene;
and displaying a second pupil formed by optically imaging the outer frame of the target sighting telescope in the sighting interface, wherein the width of the second pupil corresponds to the magnification of the target sighting telescope.
In the above scheme, the apparatus further comprises:
the definition adjusting module is used for acquiring light environments in a virtual scene where the virtual object is located, wherein the light intensities in different light environments are different;
and when the light environment in which the virtual object is positioned is changed, synchronously changing the definition of the pupil so as to adapt to the changed light environment.
In some embodiments, before the displaying the view frame of the virtual scene viewed by the virtual object through the scope, the apparatus further comprises:
the system comprises a surface patch determining module, a pupil setting module and a control module, wherein the surface patch determining module is used for creating a surface patch which is consistent with an eyepiece of the sighting telescope and drawing a chartlet corresponding to the surface patch, the chartlet comprises a first part corresponding to the visual field picture and a second part corresponding to the pupil, and the first part is transparent color;
combining the map with the patch to obtain a target patch;
the third presentation module is further configured to obtain a scene patch corresponding to the virtual scene and a frame patch corresponding to an outer frame of the sighting telescope;
rendering the scene patch, the frame patch and the target patch according to a preset rendering sequence so as to display a pupil formed by optically imaging an outer frame of the sighting telescope between the frame and the visual field picture.
In some embodiments, the patch determination module is further configured to obtain an initial map corresponding to the patch, where the initial map includes the first portion and an initial portion corresponding to the pupil, and a color value of each pixel in the initial portion corresponds to a default state of the pupil;
determining the state of the sighting telescope and the state of the virtual object, and determining the central point of pixel sampling according to the state of the sighting telescope and the state of the virtual object;
and based on the central point, carrying out pixel sampling in the initial mapping to obtain a mapping corresponding to the patch.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the picture display method of a virtual scene described above in the embodiments of the present application.
The embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to execute the picture display method of a virtual scene provided by the embodiments of the present application, for example, the method shown in fig. 3.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.
Claims (20)
1. A picture display method of a virtual scene is characterized by comprising the following steps:
presenting an interface for aiming the virtual object by using a sighting telescope included in the shooting prop by adopting a first person visual angle;
a frame of an eyepiece in the sighting telescope is presented in the interface, and a visual field picture of a virtual scene seen by the virtual object through the sighting telescope is displayed in an internal area of the frame;
displaying a pupil formed by optically imaging an outer frame of the sighting telescope between the frame and the visual field picture;
when the state of the sighting telescope or the state of the virtual object is changed, the display effect of the pupil is controlled to be synchronously changed to adapt to the changed state.
2. The method of claim 1, wherein presenting an interface at which the virtual object is aimed using a scope included with the shooting prop using the first-person perspective comprises:
in response to an open-mirror instruction for the shooting prop, performing an open-mirror operation for the sighting telescope;
in the process of executing the open mirror operation, adopting a first-person visual angle to display the process of gradually opening the aiming interface corresponding to the sighting telescope;
the displaying a pupil formed by optically imaging an outer frame of the scope between the frame and the view field picture includes:
in the process of executing the open operation of the sighting telescope, a change process of deflecting the pupil formed by optically imaging the outer frame of the sighting telescope to the full is displayed between the frame and the visual field picture along with the execution of the open operation.
3. The method of claim 1, wherein the sighting telescope further comprises an objective lens and a barrel, the objective lens and the eyepiece lens being located at two ends of the barrel, respectively;
displaying a pupil formed by optically imaging an outer frame of the scope between the frame and the field of view picture, including:
displaying a pupil formed by optically imaging the lens barrel of the sighting telescope between the frame and the visual field picture, wherein the color of the pupil is consistent with that of the lens barrel.
4. The method of claim 3, wherein the method further comprises:
and in response to a color switching instruction for the lens barrel, switching the current color of a frame of the eyepiece in the interface to a target color, and switching the color of the pupil to the target color.
5. The method of claim 1, wherein displaying a pupil formed by optically imaging an outer frame of the scope between the frame and the field of view comprises:
acquiring the material of an objective lens and the material of an eyepiece which are included by the sighting telescope;
determining an adapted transparency based on the material of the objective lens and the material of the eyepiece;
and displaying a pupil formed by optically imaging the outer frame of the sighting telescope between the frame and the visual field picture based on the transparency.
6. The method of claim 1, wherein displaying a pupil formed by optically imaging an outer frame of the scope between the frame and the field of view comprises:
displaying a pupil in a first state formed by optically imaging an outer frame of the sighting telescope between the frame and the visual field picture in the process that the sighting telescope is in a static state or the virtual object is in a static state;
the first state is static, and the pupil comprises two concentric rings when in the first state, wherein the color of the pupil in the outer ring is the same, and the color of the pupil in the inner ring becomes gradually lighter along the radial direction from the edge of the inner ring to the center of the eyepiece.
7. The method of claim 1, wherein displaying a pupil formed by optically imaging an outer frame of the scope between the frame and the field of view comprises:
displaying a pupil in a second state formed by optically imaging an outer frame of the scope while the virtual object is in a traveling state;
the second state is dynamic, the pupil partially covers the view field picture in the process that the pupil is in the second state, and the size of the covered view field picture is changed.
8. The method of claim 7, wherein the method further comprises:
in displaying the pupil in the second state, dynamically showing a mask ratio of the pupil to the field of view picture.
9. The method of claim 1, wherein the method further comprises:
responding to a continuous shooting instruction aiming at the shooting prop, and controlling the shooting prop to continuously shoot the target aimed by the sighting telescope;
and keeping the pupil in a default state during continuous shooting of the object, so that the display effect of the pupil is unchanged during continuous shooting of the object.
10. The method of claim 1, wherein the method further comprises:
in response to a shooting instruction for the shooting prop triggered at a first time, controlling the shooting prop to shoot an object aimed at by the sighting telescope, and controlling the pupil to be in a default state in a target period taking the first time as a starting moment, so that the display effect of the pupil in the target period is unchanged.
11. The method of claim 10, wherein the method further comprises:
when a second time after the first time arrives and a time interval between the second time and the first time is the same as the target period, releasing the default state of the pupil, and
acquiring the state of the sighting telescope and the state of the virtual object;
and updating the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
12. The method of claim 1, wherein the shooting prop is further equipped with a side sighting telescope, the method further comprising:
switching the scope to the side scope in response to a scope switching instruction for the shooting prop, the side scope having a magnification different from the scope;
presenting a sighting interface corresponding to the side sighting telescope, and displaying a first pupil formed by optically imaging an outer frame of the side sighting telescope in the sighting interface, wherein the width of the first pupil corresponds to the magnification of the side sighting telescope.
13. The method of claim 1, wherein the method further comprises:
presenting at least two magnification options in the interface;
based on the at least two magnification options, in response to a switching instruction for the sighting telescope, canceling display of the aiming interface so as to display the interface of the virtual scene, and controlling the shooting prop to assemble a target sighting telescope indicated by the switching instruction;
presenting a sighting interface corresponding to the target sighting telescope in response to a lens opening instruction triggered based on the interface of the virtual scene;
and displaying a second pupil formed by optically imaging the outer frame of the target sighting telescope in the sighting interface, wherein the width of the second pupil corresponds to the magnification of the target sighting telescope.
14. The method of claim 1, wherein the method further comprises:
acquiring light environments in a virtual scene where the virtual object is located, wherein the light intensities in different light environments are different;
and when the light environment in which the virtual object is positioned is changed, synchronously changing the definition of the pupil so as to adapt to the changed light environment.
15. The method of claim 1, wherein prior to presenting a view of a virtual scene viewed by the virtual object through the scope, the method further comprises:
creating a patch matching the eyepiece of the sighting telescope, and drawing a map corresponding to the patch, wherein the map comprises a first portion corresponding to the visual field picture and a second portion corresponding to the pupil, and the first portion is a transparent color;
combining the map with the patch to obtain a target patch;
the displaying a pupil formed by optically imaging an outer frame of the scope between the frame and the view field picture includes:
acquiring a scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the sighting telescope;
rendering the scene patch, the frame patch and the target patch according to a preset rendering sequence so as to display a pupil formed by optically imaging an outer frame of the sighting telescope between the frame and the visual field picture.
16. The method of claim 15, wherein said rendering the corresponding map of the patch comprises:
acquiring an initial map corresponding to the patch, wherein the initial map comprises the first part and an initial part corresponding to the pupil, and the color value of each pixel in the initial part corresponds to the default state of the pupil;
determining the state of the sighting telescope and the state of the virtual object, and determining the central point of pixel sampling according to the state of the sighting telescope and the state of the virtual object;
and based on the central point, carrying out pixel sampling in the initial mapping to obtain a mapping corresponding to the patch.
17. An apparatus for displaying a picture of a virtual scene, the apparatus comprising:
the first presentation module is used for presenting an interface for aiming the virtual object by using a sighting telescope included in the shooting prop by adopting a first person visual angle;
the second presentation module is used for presenting a frame of an eyepiece in the sighting telescope in the interface, and a view picture of a virtual scene seen by the virtual object through the sighting telescope is displayed in the internal area of the frame;
the third presentation module is used for displaying a pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the visual field picture;
and the presentation control module is used for controlling the display effect of the pupil to synchronously change when the state of the sighting telescope or the state of the virtual object is changed so as to adapt to the changed state.
18. A terminal device, comprising:
a memory for storing executable instructions;
a processor, configured to implement the method for displaying a virtual scene according to any one of claims 1 to 16 when executing the executable instructions stored in the memory.
19. A computer-readable storage medium storing executable instructions for implementing the method for displaying a virtual scene according to any one of claims 1 to 16 when executed by a processor.
20. A computer program product comprising a computer program or instructions which, when executed by a processor, implement a method of screen presentation of a virtual scene as claimed in any one of claims 1 to 16.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111226478.2A CN113926194A (en) | 2021-10-21 | 2021-10-21 | Method, apparatus, device, medium, and program product for displaying picture of virtual scene |
CN202111672379.7A CN114100134B (en) | 2021-10-21 | 2021-12-31 | Picture display method, device, equipment, medium and program product of virtual scene |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111226478.2A CN113926194A (en) | 2021-10-21 | 2021-10-21 | Method, apparatus, device, medium, and program product for displaying picture of virtual scene |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113926194A true CN113926194A (en) | 2022-01-14 |
Family
ID=79280830
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111226478.2A Withdrawn CN113926194A (en) | 2021-10-21 | 2021-10-21 | Method, apparatus, device, medium, and program product for displaying picture of virtual scene |
CN202111672379.7A Active CN114100134B (en) | 2021-10-21 | 2021-12-31 | Picture display method, device, equipment, medium and program product of virtual scene |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111672379.7A Active CN114100134B (en) | 2021-10-21 | 2021-12-31 | Picture display method, device, equipment, medium and program product of virtual scene |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN113926194A (en) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108579083B (en) * | 2018-04-27 | 2020-04-21 | 腾讯科技(深圳)有限公司 | Virtual scene display method and device, electronic device and storage medium |
CN108671540A (en) * | 2018-05-09 | 2018-10-19 | 腾讯科技(深圳)有限公司 | Accessory switching method, equipment and storage medium in virtual environment |
CN110141869A (en) * | 2019-04-11 | 2019-08-20 | 腾讯科技(深圳)有限公司 | Method of controlling operation thereof, device, electronic equipment and storage medium |
CN112221134B (en) * | 2020-11-09 | 2022-05-31 | 腾讯科技(深圳)有限公司 | Virtual environment-based picture display method, device, equipment and medium |
2021
- 2021-10-21: CN application CN202111226478.2A published as CN113926194A (en) — not active, Withdrawn
- 2021-12-31: CN application CN202111672379.7A published as CN114100134B (en) — active, Active
Also Published As
Publication number | Publication date |
---|---|
CN114100134B (en) | 2024-09-20 |
CN114100134A (en) | 2022-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Linowes | Unity virtual reality projects | |
CN107376349B (en) | Occluded virtual image display | |
CN112090069B (en) | Information prompting method and device in virtual scene, electronic equipment and storage medium | |
US8847984B2 (en) | System and method for forming a composite image in a portable computing device having a dual screen display | |
JP2021507408A (en) | Methods and systems for generating and displaying 3D video in virtual, enhanced, or mixed reality environments | |
CN107209386A (en) | Augmented reality visual field object follower | |
CN112138385B (en) | Virtual shooting prop aiming method and device, electronic equipment and storage medium | |
CN113797536A (en) | Method, apparatus, storage medium, and program product for controlling object in virtual scene | |
CN112057860B (en) | Method, device, equipment and storage medium for activating operation control in virtual scene | |
US20190114841A1 (en) | Method, program and apparatus for providing virtual experience | |
CN113559510B (en) | Virtual skill control method, device, equipment and computer readable storage medium | |
JP2024040309A (en) | Information processing method in virtual scene, device, instrument, medium, and program program | |
US20230330534A1 (en) | Method and apparatus for controlling opening operations in virtual scene | |
Mack et al. | Unreal Engine 4 virtual reality projects: build immersive, real-world VR applications using UE4, C++, and unreal blueprints | |
CN114344896A (en) | Virtual scene-based snap-shot processing method, device, equipment and storage medium | |
CN112402946A (en) | Position acquisition method, device, equipment and storage medium in virtual scene | |
CN112156472B (en) | Control method, device and equipment of virtual prop and computer readable storage medium | |
CN114130006B (en) | Virtual prop control method, device, equipment, storage medium and program product | |
KR102549822B1 (en) | Methods and devices for presenting and manipulating conditionally dependent synthetic reality content threads | |
CN114100134B (en) | Picture display method, device, equipment, medium and program product of virtual scene | |
KR20200100797A (en) | Creation of objectives for objective launchers from synthesized reality settings | |
Seligmann | Creating a mobile VR interactive tour guide | |
Kim | Rupture of the Virtual | |
CN112800252B (en) | Method, device, equipment and storage medium for playing media files in virtual scene | |
CN112891930B (en) | Information display method, device, equipment and storage medium in virtual scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
WW01 | Invention patent application withdrawn after publication | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20220114 |