CN117765207A - Virtual interface display method, device, equipment and medium - Google Patents

Virtual interface display method, device, equipment and medium

Info

Publication number
CN117765207A
CN117765207A
Authority
CN
China
Prior art keywords
interface
virtual
display
layer
display layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211138585.4A
Other languages
Chinese (zh)
Inventor
马倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211138585.4A priority Critical patent/CN117765207A/en
Publication of CN117765207A publication Critical patent/CN117765207A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the disclosure relate to a virtual interface display method, apparatus, device, and medium. The method includes: in response to a viewing operation on a virtual interface displayed in an interface display layer, acquiring a predetermined standard reference position corresponding to the virtual interface, where the standard reference position is the position of a predetermined standard layer in the virtual reality space, the interface display layer lies parallel to and directly behind the standard display layer along the vertical coordinate axis of the virtual reality space, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis; and, according to the standard reference position, controlling the interface display layer on which the virtual interface is rendered to translate along the positive direction of the vertical coordinate axis to a target reference display position for display. In this way, the depth information of the virtual reality space is fully utilized to display the virtual interface, improving the stereoscopic quality of its visual presentation.

Description

Virtual interface display method, device, equipment and medium
Technical Field
The disclosure relates to the technical field of virtual reality, and in particular to a virtual interface display method, apparatus, device, and medium.
Background
Virtual Reality (VR) technology, also known as virtual environment or artificial environment technology, uses a computer to generate a virtual world that directly provides visual, auditory, and tactile sensations to participants and allows them to observe and operate within it interactively. Improving the VR operating experience has become a mainstream goal.
In the related art, application operations may be implemented in a virtual reality space based on virtual reality technology: a display position for a virtual interface is determined in the virtual reality space at an optimal viewing distance from the user's eyes, and the corresponding virtual interface is displayed at that position in the form of a spatial virtual screen.
However, determining the display position of the virtual interface solely by the optimal viewing distance from the user's eyes does not fully utilize the depth information of the virtual reality space, which degrades the user's visual experience.
Disclosure of Invention
To solve, or at least partially solve, the above technical problems, the present disclosure provides a virtual interface display method, apparatus, device, and medium that simulate how an interface is displayed in the real world, fully utilize the depth information of the virtual reality space, and, on the basis of displaying the virtual interface, give that display a stereoscopic quality.
An embodiment of the present disclosure provides a virtual interface display method, including: in response to a viewing operation on a virtual interface displayed in an interface display layer, acquiring a predetermined standard reference position corresponding to the virtual interface, where the standard reference position is the position of a predetermined standard layer in the virtual reality space, the interface display layer lies parallel to and directly behind the standard display layer along the vertical coordinate axis of the virtual reality space, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis; and, according to the standard reference position, controlling the interface display layer on which the virtual interface is rendered to translate along the positive direction of the vertical coordinate axis to a target reference display position for display, where the vertical coordinate value of the target reference display position is less than or equal to the vertical coordinate value of the standard reference position.
An embodiment of the present disclosure further provides a virtual interface display apparatus, including: an acquisition module configured to, in response to a viewing operation on a virtual interface displayed in an interface display layer, acquire a predetermined standard reference position corresponding to the virtual interface, where the standard reference position is the position of the predetermined standard layer in the virtual reality space, the interface display layer lies parallel to and directly behind the standard display layer along the vertical coordinate axis of the virtual reality space, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis; and a display processing module configured to control, according to the standard reference position, the interface display layer on which the virtual interface is rendered to translate along the positive direction of the vertical coordinate axis to a target reference display position for display, where the vertical coordinate value of the target reference display position is less than or equal to the vertical coordinate value of the standard reference position.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing instructions executable by the processor; the processor is configured to read the executable instructions from the memory and execute them to implement the virtual interface display method provided by the embodiments of the present disclosure.
The embodiment of the present disclosure also provides a computer-readable storage medium storing a computer program for executing the display method of the virtual interface provided by the embodiment of the present disclosure.
Compared with the prior art, the technical scheme provided by the embodiment of the disclosure has the following advantages:
With the virtual interface display scheme provided by the embodiments of the present disclosure, a predetermined standard reference position corresponding to a virtual interface is acquired in response to a viewing operation on the virtual interface displayed in an interface display layer, where the standard reference position is the position of a predetermined standard layer in the virtual reality space, the interface display layer lies parallel to and directly behind the standard display layer along the vertical coordinate axis of the virtual reality space, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis; according to the standard reference position, the interface display layer on which the virtual interface is rendered is controlled to translate along the positive direction of the vertical coordinate axis to a target reference display position for display, where the vertical coordinate value of the target reference display position is less than or equal to that of the standard reference position. In the embodiments of the present disclosure, an interface display layer with a depth-of-field relationship to the standard display layer is used to display the virtual interface, so that a depth-of-field effect is created on the basis of rendering and displaying the interface, and the interface is moved forward along the vertical axis when displayed, improving the stereoscopic quality of its presentation.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a schematic view of an application scenario of a virtual reality device provided by the present disclosure;
fig. 2 is a flow chart of a method for displaying a virtual interface according to an embodiment of the disclosure;
fig. 3 is a schematic view of a display scenario of a virtual interface according to an embodiment of the present disclosure;
fig. 4 is a schematic view of a display scenario of another virtual interface provided by an embodiment of the present disclosure;
fig. 5 is a schematic view of a display scenario of another virtual interface provided in an embodiment of the present disclosure;
fig. 6 is a schematic view of a display scenario of another virtual interface provided by an embodiment of the present disclosure;
fig. 7 is a schematic view of a display scenario of another virtual interface provided by an embodiment of the present disclosure;
fig. 8 is a schematic view of a display scenario of another virtual interface provided by an embodiment of the present disclosure;
fig. 9 is a schematic view of a display scenario of another virtual interface provided by an embodiment of the present disclosure;
Fig. 10 is a schematic view of a display scenario of another virtual interface provided by an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a display device of a virtual interface according to an embodiment of the disclosure;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a", "one", and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
VR: a technology for creating and experiencing a virtual world. A computer generates a virtual environment that conveys multi-source information (the virtual reality referred to herein includes at least visual perception, and may also include auditory, tactile, and motion perception, and even gustatory and olfactory perception), fusing an interactive three-dimensional dynamic view of that environment with simulation of physical behaviors so that the user is immersed in the simulated virtual reality environment. It is applied in fields such as maps, games, video, education, medicine, simulation, collaborative training, sales, manufacturing assistance, and maintenance and repair.
A virtual reality device is a terminal that realizes the virtual reality effect in VR. It may take the form of glasses, a head-mounted display (Head Mount Display, HMD), or contact lenses that provide visual perception and other forms of perception; the form of the device is not limited to these and may be further miniaturized or enlarged as needed.
The virtual reality devices described in embodiments of the present disclosure may include, but are not limited to, the following types:
A computer-side virtual reality (PCVR) device uses a PC to perform the computation related to the virtual reality function and to output data; the external PCVR device uses the data output by the PC to realize the virtual reality effect.
A mobile virtual reality device supports mounting a mobile terminal (such as a smartphone) in various ways (for example, a head-mounted display with a dedicated card slot). Connected to the device in a wired or wireless manner, the mobile terminal performs the computation related to the virtual reality function and outputs the data to the mobile virtual reality device, for example to watch a virtual reality video through an app on the mobile terminal.
An integrated virtual reality device has its own processor for performing the computation related to the virtual reality function, and thus has independent virtual reality input and output capabilities; it needs no connection to a PC or mobile terminal and offers a high degree of freedom in use.
Virtual objects are objects that interact in the virtual scene; controlled by a user or by a robot program (for example, an artificial-intelligence-based robot program), they can remain stationary, move, and perform various behaviors in the scene.
Taking a VR scene as an example, as shown in fig. 1, an HMD is relatively light, ergonomically comfortable, and provides high-resolution content with low latency. A sensor for detecting posture (such as a nine-axis sensor) is arranged in the virtual reality device to detect posture changes of the device in real time: when a user wears the device and the posture of the user's head changes, the real-time posture of the head is transmitted to the processor, which calculates the gaze point of the user's line of sight in the virtual environment.
In this embodiment, when the user wears the HMD device and opens a predetermined virtual application, the HMD device may run a corresponding virtual scene, which may be a simulation of the real world, a semi-simulated semi-fictional scene, or a purely fictional scene. The virtual scene may be two-dimensional, 2.5-dimensional, or three-dimensional; its dimensionality is not limited in the embodiments of the present disclosure. For example, the virtual scene may include a person, sky, land, sea, and so on, and the land may include environmental elements such as a desert or a city. An operation interface corresponding to the virtual application is also displayed in the virtual scene and may be operated interactively by means of a handle device, bare-hand gestures, a virtual wearable device, and the like.
In this embodiment, when the user wears the HMD device and opens a predetermined virtual application, the HMD device may run a corresponding virtual reality scene, and display a related virtual interface of the virtual application in the virtual reality scene. The user may interactively control the virtual interface, etc. by means of a handle device, a bare hand gesture, etc.
In some possible embodiments, for a virtual interface in a virtual reality scene, the rendering logic of the scene may differ from that of a preset rendering component in the virtual reality device. For example, the virtual reality scene may be rendered with the Unity engine, while the virtual reality device's preset rendering component is based on the Android Open Graphics Library (OpenGL); the functions of the actual application program corresponding to the virtual application are usually developed against the Android preset rendering component, so the virtual interface cannot be rendered directly with the Unity engine. Therefore, to display the virtual interface normally in the virtual reality scene, the embodiments of the present disclosure reuse the preset rendering component of the virtual reality device to render the virtual interface. The virtual interface includes, but is not limited to, any functional page of the virtual application, a video frame card in the virtual application, and the like; the specific content displayed may differ according to the scene requirements.
In one embodiment of the present disclosure, when a rendering requirement for a virtual interface is detected, interface rendering information for the virtual interface is requested from a server; the preset rendering component of the virtual reality device renders the virtual interface according to that information, and the rendered interface is then handed to the Unity engine for display.
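The rendering hand-off just described (request interface rendering information from a server, render with the device's preset component, then pass the result to the engine for display) can be sketched as below. This is a minimal illustrative sketch: every object and method name (`fetch_interface_rendering_info`, `preset_renderer`, `display`, and so on) is a hypothetical placeholder, not a real device or engine API.

```python
def display_virtual_interface(server, device, engine, interface_id):
    """Sketch of the rendering pipeline in the text above.

    All parameter objects are hypothetical stand-ins:
    - server: provides interface rendering information on request
    - device: the virtual reality device with its preset rendering component
    - engine: the scene engine that finally displays the rendered interface
    """
    # 1. Request the interface rendering information from the server.
    info = server.fetch_interface_rendering_info(interface_id)
    # 2. Render the virtual interface with the device's preset component.
    texture = device.preset_renderer.render(info)
    # 3. Hand the rendered result to the engine for display in the scene.
    engine.display(texture)
```

The point of the indirection is that rendering stays on the component the application was developed against, while the scene engine only receives a finished texture to place in the virtual reality space.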
In the prior art, a virtual interface is usually displayed at a preset display distance from the user's eyes. The present disclosure proposes a virtual interface display method that fully uses the depth information of the virtual reality space and displays the virtual interface with a stereoscopic depth-of-field effect. The method is described below with reference to specific examples.
Fig. 2 is a flow chart of a virtual interface display method according to an embodiment of the present disclosure. The method may be performed by a virtual interface display apparatus, which may be implemented in software and/or hardware and may generally be integrated in an electronic device. As shown in fig. 2, the method includes:
step 201, responding to the view operation of the virtual interface displayed in the interface display layer, obtaining the standard reference position corresponding to the predetermined virtual interface, wherein,
The standard reference position is a position of a predetermined standard image layer in the virtual reality space, the interface display image layer is positioned right behind the standard display image layer in parallel in the direction of a vertical coordinate axis in the virtual reality space, and the direction pointing to the user from the inside of the virtual reality space is the positive direction of the vertical coordinate axis.
In one embodiment of the present disclosure, a standard reference position of a standard display layer in a virtual reality space is first determined, wherein the standard reference position is located within a virtual field of view.
In some possible embodiments, a standard reference position of the standard display layer in the virtual reality space is determined. The standard display layer in this embodiment may be understood as a layer within the user-visible region of the virtual field of view at a preset viewing distance from the user's eyes; it can be regarded as the position of a virtual display screen. The preset viewing distance is a distance calibrated from experimental data as most suitable for viewing by the user's eyes, for example, 1.5 meters.
In some possible embodiments, after the opening operation of the virtual application is acquired, the user's current line-of-sight direction at the moment the operation is received is determined, and the standard reference position is determined according to that direction, that is, the standard reference position is aligned with the user's current line of sight.
When the user wears and starts the HMD device, and a triggering operation on the opening control of the virtual application on the virtual reality device is detected, the opening operation of the virtual application is considered to be acquired.
That is, in this embodiment, when the virtual application is started, the standard reference position of the standard display layer is first determined according to the user's current line-of-sight direction, so that the standard display layer lies in that direction as shown in fig. 3, ensuring that the target spatial position is consistent with the line of sight. It should be noted that after the virtual application has started, the standard reference position no longer follows the user's line of sight: as shown in fig. 4, when the user's gaze moves, the standard display layer does not move with the eyes, so the user can see the virtual interface from different directions at different angles, which preserves the spatial stereoscopic quality of the subsequent display.
It should be noted that, in different application scenarios, the manner of determining the standard reference position according to the current line of sight direction of the user is different, and examples are as follows:
In one embodiment of the present disclosure, the center point position of the virtual reality panorama space is determined; it lies at the center of the virtual reality space and depends on the shape of that space. After the center point position is determined, a preset human-eye viewing distance is obtained, which is the optimal display distance for the user's eyes, for example, 1.5 meters.
Further, starting from the center point position, the position reached by extending along the user's current line-of-sight direction by the preset human-eye viewing distance is used as the standard reference position. This ensures that the standard reference position is consistent with the user's line of sight and improves the viewing experience when the virtual application starts.
For example, as shown in fig. 5, the virtual reality space is a box-shaped cube space with center point O1, and the preset human-eye viewing distance is R1. After the user's current line-of-sight direction at application start is determined, the position reached by extending from O1 along that direction by R1 is used as the standard reference position.
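The construction above (extend from the center point along the current line-of-sight direction by the preset human-eye viewing distance) can be sketched as follows. The function name and the tuple-based vector representation are illustrative assumptions, not from the patent text.

```python
import math

def standard_reference_position(center, gaze_dir, viewing_distance=1.5):
    """Extend from the space's center point along the user's current
    gaze direction by the preset human-eye viewing distance (default
    1.5 m, the example value given in the text)."""
    # Normalize the gaze direction so the extension has exactly the
    # preset viewing distance.
    norm = math.sqrt(sum(c * c for c in gaze_dir))
    unit = tuple(c / norm for c in gaze_dir)
    return tuple(o + viewing_distance * u for o, u in zip(center, unit))
```

For a center O1 at the origin and a gaze straight along the positive Z axis, the standard reference position lands at distance R1 in front of O1 on that axis.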
The interface display layer lies parallel to and directly behind the standard display layer along the vertical coordinate axis of the virtual reality space; the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis, and the vertical coordinate value of the interface display layer's initial position is smaller than the standard vertical coordinate value of the standard reference position.
That is, in one embodiment of the present disclosure, the virtual interface is not placed directly on the standard display layer when the standard display layer is determined. Instead, the standard display layer serves as a window, an interface display layer with a depth difference from it is arranged behind that window, and the virtual interface is displayed on the interface display layer, so that the interface has a depth offset relative to the window and produces an embedded depth display effect.
As shown in fig. 6, to ensure that the virtual interface remains within the virtual field of view, the interface display layer may lie parallel to and directly behind the standard display layer along the vertical coordinate axis (the Z axis) of the virtual reality space. That is, the center points of the standard display layer and the interface display layer lie on the same vertical line parallel to the Z axis, so the horizontal coordinate value (X-axis value) and the vertical coordinate value (Y-axis value) of the two center points are the same. This prevents part of the virtual interface from falling outside the user's field of view when the interface display layer is offset from the standard display layer.
That is, in this embodiment, the standard vertical coordinate value of the standard reference position is larger than the vertical coordinate value of the interface display layer, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis.
In one embodiment of the present disclosure, to make the virtual interface easier to view, the standard display layer may be a transparent layer: the user only perceives visually how near or far the virtual interface is, without perceiving the standard display layer itself, which then acts as a transparent "virtual screen" through which the content of the interface display layer behind it is shown. Of course, to meet the display requirements of some scenes, the color and transparency of the standard display layer can be adjusted as needed, so that the user perceives a mask in front of the interface display layer, further strengthening the depth-of-field impression of the display.
When the initial position of the interface display layer is determined according to the standard reference position, the position lying in the negative direction of the vertical coordinate axis at a first preset distance threshold from the standard reference position may be taken as the initial position. The first preset distance threshold can be set according to scene requirements; the interface display layer at this initial position has a depth difference from the standard display layer along the Z axis, producing the sense of being embedded in the window.
In one embodiment of the present disclosure, a viewing operation on the virtual interface displayed in the interface display layer is acquired. The viewing operation may be understood as a call-up of the virtual interface, which may be realized through a gesture call, a voice call, or the like, or through a virtual manipulation device (such as a handle). When the viewing operation is performed with the virtual manipulation device, a trigger direction indication model may be rendered in the virtual reality space according to the control direction of the device. The trigger direction indication model indicates the trigger direction in the virtual reality space corresponding to the control direction; the trigger direction tracks the control direction in real time, and the control direction may be controlled by the rotation angle of a preset rotation button on the device. Further, as shown in fig. 7, the viewing operation is acquired in response to the interface display layer lying in the current trigger direction of the trigger direction indication model.
It should be noted that the trigger direction indication model takes different forms in different application scenarios, so the way the corresponding trigger direction is observed directly in the virtual reality space also differs. In some possible embodiments, with continued reference to fig. 7, the trigger direction indication model includes a ray track model: the starting point of the ray is the spatial position of the virtual manipulation device in the virtual reality space, the ray direction indicates the control direction of the device, and when the end point of the ray lies on the virtual interface, the viewing operation on the virtual interface is acquired.
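The ray check above (does the controller ray land on the virtual interface?) can be sketched as a ray-plane intersection test. Modeling the interface display layer as an axis-aligned rectangle at a fixed Z, and all parameter names, are illustrative assumptions rather than the patent's implementation.

```python
def ray_hits_interface(origin, direction, layer_z,
                       half_width, half_height, center_xy=(0.0, 0.0)):
    """Intersect the controller ray with the interface display layer's
    plane (z = layer_z) and test whether the hit point falls inside the
    layer's rectangular bounds."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return False  # ray is parallel to the layer plane
    t = (layer_z - oz) / dz
    if t < 0:
        return False  # the layer is behind the ray's starting point
    hx, hy = ox + t * dx, oy + t * dy
    cx, cy = center_xy
    return abs(hx - cx) <= half_width and abs(hy - cy) <= half_height
```

For example, a controller at (0, 0, 3) pointing along negative Z hits a 1 m x 1 m layer at z = 1.2, while the same controller tilted far to the side misses it.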
In one embodiment of the present disclosure, a standard reference position corresponding to a predetermined virtual interface is acquired in response to a viewing operation of the virtual interface displayed in an interface display layer.
Step 202, controlling an interface display layer on which a virtual interface is rendered according to a standard reference position, and translating to a target reference display position along the positive direction of a vertical coordinate axis for display, wherein the vertical coordinate value corresponding to the target reference display position is smaller than or equal to the vertical coordinate value corresponding to the standard reference position.
In one embodiment of the disclosure, after the standard reference position corresponding to the virtual interface is obtained, the interface display layer on which the virtual interface is rendered is controlled according to the standard reference position, and is translated along the positive direction of the vertical coordinate axis to the target reference display position for display, where the vertical coordinate value corresponding to the target reference display position is less than or equal to the vertical coordinate value corresponding to the standard reference position. That is, the vertical coordinate value corresponding to the target reference display position is greater than the vertical coordinate value of the initial position of the interface display layer and less than or equal to the vertical coordinate value corresponding to the standard reference position. In other words, referring to fig. 8, the interface display layer moves at most to the standard vertical coordinate position, and the virtual interface visually responds to the viewing operation by moving from a position farther from the user's eyes to one closer to them. This produces a visible depth-of-field change, makes full use of the depth information in the virtual reality space, and improves the stereoscopic impression of the displayed virtual interface.
In some possible embodiments, the interface display layer on which the virtual interface is rendered may be moved directly to the standard vertical coordinate position, that is, the layer center of the interface display layer is controlled to move along the positive direction of the vertical coordinate axis to the position of the vertical coordinate value corresponding to the standard reference position.
In some possible embodiments, the interface display layer rendered with the virtual interface is controlled to translate in a positive direction of the vertical coordinate axis according to a second preset distance threshold, such that the interface display layer moves to a target reference display position, wherein,
the second preset distance threshold is smaller than or equal to the distance between the standard reference position and the initial position, and the second preset distance threshold can be calibrated according to scene requirements.
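The translation step above amounts to moving the layer toward the user by the second preset distance threshold while never passing the standard layer's Z value. The sketch below is an assumption-laden illustration (function name and values are hypothetical); the clamp encodes the constraint that the second threshold never carries the layer past the standard reference position.

```python
# Sketch: translate the interface display layer along +Z (toward the
# user) by a second preset distance threshold, clamped so it never
# moves past the standard reference position's Z value.

def translate_layer(layer_z, standard_z, second_distance):
    """Return the layer's new Z after translating toward the user."""
    return min(layer_z + second_distance, standard_z)
```

With an initial layer at `Z = -2.15` and the standard layer at `Z = -2.0`, a small threshold moves the layer part of the way, while any threshold at least as large as the gap stops the layer exactly at the standard position.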
In addition, the virtual interface displayed on the interface display layer may have different areas with inconsistent depth information. For example, as shown in fig. 9, the virtual interface may include a main display area and an associated display area, where the main display area is used to display the operation interface of the currently expanded function service, and the associated display area is used to display the function entries of other function services. The associated display area forms an included angle with the main display area; for example, the associated display area is tilted toward the user's eyes. In this scenario, when the interface display layer moves in the positive direction of the vertical axis and the movement distance is large, for example, when the interface display layer moves to the standard vertical coordinate value, the distance between the main display area and the user's eyes is smaller than the distance between the associated display area and the user's eyes, and the associated display area may present the display effect of a protruding window, further enhancing the stereoscopic impression of the display.
It should be noted that, in this embodiment, the interface display layer is placed directly behind the standard display layer along the vertical coordinate axis rather than in front of it. If the interface display layer were displayed directly in front, then, to subsequently create the visual experience of depth change, the interface display layer would have to move backward, and part of the virtual interface might end up beyond the standard display layer, causing a physical conflict; once moved to a position with such a conflict, the virtual interface could not move backward any further, which would impair the visual depth effect of the virtual interface.
In addition, in an embodiment of the disclosure, in order to further improve the stereoscopic display effect of the virtual interface, a background layer may also be displayed behind the interface display layer. Background information rendered according to preset rendering parameters is drawn in the background layer; the background information may be image information or backlight information. That is, the initial reference display position of the background layer is determined according to the initial display position of the interface display layer, and a backlight of the interface display layer may be presented in the background layer to enhance the technological feel of its background display. The background layer is parallel to and directly behind the interface display layer in the vertical coordinate axis direction in the virtual reality space, and translates in the positive direction of the vertical coordinate axis along with the translation of the interface display layer; the layer distance between the background layer and the interface display layer is a third preset distance threshold. That is, as shown in fig. 10, the relative orientation and distance between the background layer and the interface display layer are fixed: when the interface display layer moves, the background layer moves with it, so that the user perceives the interface and its background as a whole, which further strengthens the visual layering when the depth of field changes.
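The fixed-offset coupling between the two layers can be sketched as follows. This is a minimal illustration under assumed names and values (the patent does not specify an implementation): the background layer's Z is derived from the interface layer's Z minus the third preset distance threshold, so every translation of the interface layer carries the background layer with it.

```python
# Sketch: the background layer stays a third preset distance directly
# behind the interface display layer and follows every translation, so
# their relative orientation and spacing never change.

class LayerPair:
    def __init__(self, interface_z, third_distance):
        self.interface_z = interface_z
        self.third_distance = third_distance  # fixed gap to the background

    @property
    def background_z(self):
        # Derived, not stored: the gap is preserved by construction.
        return self.interface_z - self.third_distance

    def translate(self, dz):
        self.interface_z += dz  # background_z follows automatically
```

Deriving `background_z` as a property (rather than storing and updating it separately) is one way to guarantee the "translates along with the interface display layer" behavior cannot drift out of sync.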
In summary, in the method for displaying a virtual interface according to the embodiments of the present disclosure, in response to a viewing operation of the virtual interface displayed in the interface display layer, a predetermined standard reference position corresponding to the virtual interface is obtained, where the standard reference position is the position of a predetermined standard layer in the virtual reality space, the interface display layer lies parallel to and directly behind the standard display layer in the vertical coordinate axis direction in the virtual reality space, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis. The interface display layer on which the virtual interface is rendered is controlled according to the standard reference position and translated along the positive direction of the vertical coordinate axis to a target reference display position for display, where the vertical coordinate value corresponding to the target reference display position is less than or equal to that corresponding to the standard reference position. In the embodiments of the present disclosure, an interface display layer with a depth-of-field relationship to the standard display layer is set to display the virtual interface, so that a depth-of-field effect is created on the basis of rendering and displaying the virtual interface, and the virtual interface is controlled to move forward along the vertical axis when displayed, improving the stereoscopic impression of the displayed virtual interface.
In order to achieve the above embodiments, the present disclosure further provides a display device for a virtual interface.
Fig. 11 is a schematic structural diagram of a display device for a virtual interface according to an embodiment of the present disclosure, where the device may be implemented by software and/or hardware, and may be generally integrated in an electronic device to display the virtual interface. As shown in fig. 11, the apparatus includes: an acquisition module 1110 and a display processing module 1120, wherein,
an obtaining module 1110, configured to obtain a standard reference position corresponding to a predetermined virtual interface in response to a viewing operation of the virtual interface displayed in the interface display layer, where,
the standard reference position is a position of a predetermined standard image layer in the virtual reality space, the interface display image layer is positioned right behind the standard display image layer in parallel in the direction of a vertical coordinate axis in the virtual reality space, and the direction pointing to the user from the inside of the virtual reality space is the positive direction of the vertical coordinate axis;
the display processing module 1120 is configured to control, according to the standard reference position, the display layer of the interface on which the virtual interface is rendered, and translate the display layer to the target reference display position along the positive direction of the vertical coordinate axis for display, where the vertical coordinate value corresponding to the target reference display position is less than or equal to the vertical coordinate value corresponding to the standard reference position.
The display device for the virtual interface provided by the embodiment of the disclosure may execute the display method for the virtual interface provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method, and the implementation principle is similar and will not be described herein.
To achieve the above embodiments, the present disclosure also proposes a computer program product comprising a computer program/instruction which, when executed by a processor, implements the method of displaying a virtual interface in the above embodiments.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Referring now in particular to fig. 12, a schematic diagram of a configuration of an electronic device 1200 suitable for use in implementing embodiments of the present disclosure is shown. The electronic device 1200 in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, as well as stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 12 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 12, the electronic device 1200 may include a processor (e.g., a central processor, a graphics processor, etc.) 1201, which may perform various appropriate actions and processes according to programs stored in a Read Only Memory (ROM) 1202 or programs loaded from a memory 1208 into a Random Access Memory (RAM) 1203. In the RAM 1203, various programs and data required for the operation of the electronic apparatus 1200 are also stored. The processor 1201, the ROM 1202, and the RAM 1203 are connected to each other through a bus 1204. An input/output (I/O) interface 1205 is also connected to the bus 1204.
In general, the following devices may be connected to the I/O interface 1205: input devices 1206 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 1207 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; memory 1208 including, for example, magnetic tape, hard disk, etc.; and a communication device 1209. The communication means 1209 may allow the electronic device 1200 to communicate wirelessly or by wire with other devices to exchange data. While fig. 12 shows an electronic device 1200 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communications device 1209, or installed from the memory 1208, or installed from the ROM 1202. When executed by the processor 1201, the computer program performs the functions defined above in the display method of the virtual interface of the embodiment of the present disclosure.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to:
respond to a viewing operation of the virtual interface displayed in the interface display layer by acquiring a predetermined standard reference position corresponding to the virtual interface, where the standard reference position is the position of a predetermined standard layer in the virtual reality space, the interface display layer lies parallel to and directly behind the standard display layer in the vertical coordinate axis direction in the virtual reality space, and the direction pointing from inside the virtual reality space toward the user is the positive direction of the vertical coordinate axis; and control the interface display layer on which the virtual interface is rendered according to the standard reference position, translating it along the positive direction of the vertical coordinate axis to a target reference display position for display, where the vertical coordinate value corresponding to the target reference display position is less than or equal to the vertical coordinate value corresponding to the standard reference position. In the embodiments of the present disclosure, an interface display layer with a depth-of-field relationship to the standard display layer is set to display the virtual interface, so that a depth-of-field effect is created on the basis of rendering and displaying the virtual interface, and the virtual interface is controlled to move forward along the vertical axis when displayed, improving the stereoscopic impression of the displayed virtual interface.
Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of the technical features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (12)

1. The display method of the virtual interface is characterized by comprising the following steps of:
responding to the view operation of the virtual interface displayed in the interface display layer, acquiring a predetermined standard reference position corresponding to the virtual interface, wherein,
the standard reference position is a position of a predetermined standard image layer in a virtual reality space, the interface display image layer is positioned right behind the standard display image layer in parallel in the direction of a vertical coordinate axis in the virtual reality space, and the direction pointing to a user from the inside of the virtual reality space is a positive direction of the vertical coordinate axis;
and controlling an interface display layer on which the virtual interface is rendered according to the standard reference position, and translating to a target reference display position along the positive direction of the vertical coordinate axis for display, wherein the vertical coordinate value corresponding to the target reference display position is smaller than or equal to the vertical coordinate value corresponding to the standard reference position.
2. The method of claim 1, comprising, prior to said obtaining a predetermined standard reference position corresponding to said virtual interface:
and determining a standard reference position of the standard display layer in the virtual reality space, wherein the standard reference position is positioned in the virtual field of view.
3. The method of claim 2, wherein determining the standard reference position of the standard display layer in virtual reality space comprises:
responding to the opening operation of the virtual application;
and determining the current sight direction of the user when the opening operation is received, and determining the standard reference position according to the current sight direction of the user.
4. A method according to claim 3, wherein said determining said standard reference position from said user's current gaze direction comprises:
determining the position of a central point of the virtual reality space and acquiring a preset human eye watching distance;
starting from the center point position, a position extending to the preset human eye viewing distance according to the current sight line direction of the user is used as the standard reference position.
5. The method of claim 1, comprising, prior to said responding to a viewing operation of a virtual interface displayed in an interface display layer:
In the negative direction of the vertical coordinate axis, determining the position with the distance from the standard reference position being a first preset distance threshold as the initial position of the interface display layer;
and rendering the virtual interface at the initial position where the display layer of the interface is positioned.
6. The method of claim 1, wherein the responding to the viewing operation of the virtual interface displayed in the interface display layer comprises:
rendering a trigger direction indication model in the virtual reality space according to a control direction corresponding to virtual control equipment, wherein the trigger direction indication model is used for indicating a trigger direction corresponding to the control direction in the virtual reality space, and the trigger direction corresponds to the control direction in real time;
and responding to the interface display layer being positioned in the current trigger direction of the trigger direction indication model.
7. The method of claim 1, wherein controlling the interface display layer rendered with the virtual interface according to the standard reference position, panning to a target reference display position along the positive direction of the vertical coordinate axis for display, comprises:
controlling the interface display layer on which the virtual interface is rendered, translating the interface display layer to the positive direction of the vertical coordinate axis according to a second preset distance threshold value so as to enable the interface display layer to move to the target reference display position, wherein,
And the second preset distance threshold is smaller than or equal to the distance between the standard reference position and the initial position of the interface display layer.
8. The method of any one of claims 1 to 7, wherein
the virtual reality space also comprises a background layer corresponding to the interface display layer, wherein the background layer is rendered with background information rendered according to preset rendering parameters,
the background layer is parallel to the right rear of the interface display layer in the direction of the vertical coordinate axis in the virtual reality space, the background layer translates along with the translation of the interface display layer in the positive direction of the vertical coordinate axis, and the layer distance between the background layer and the interface display layer is a third preset distance threshold.
9. The method of any one of claims 1 to 7, wherein
the standard display layer is a transparent layer.
10. A display device for a virtual interface, comprising:
an acquisition module for responding to the view operation of the virtual interface displayed in the interface display layer and acquiring the predetermined standard reference position corresponding to the virtual interface,
The standard reference position is a position of a predetermined standard image layer in a virtual reality space, the interface display image layer is positioned right behind the standard display image layer in parallel in the direction of a vertical coordinate axis in the virtual reality space, and the direction pointing to a user from the inside of the virtual reality space is a positive direction of the vertical coordinate axis;
and the display processing module is used for controlling the interface display layer on which the virtual interface is rendered according to the standard reference position, and translating the interface display layer to a target reference display position along the positive direction of the vertical coordinate axis for display, wherein the vertical coordinate value corresponding to the target reference display position is smaller than or equal to the vertical coordinate value corresponding to the standard reference position.
11. An electronic device, the electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the executable instructions to implement the method for displaying a virtual interface according to any one of claims 1-9.
12. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program for executing the method of displaying a virtual interface according to any one of the preceding claims 1-9.
CN117197400A (en) Information interaction method, device, electronic equipment and storage medium
CN117631904A (en) Information interaction method, device, electronic equipment and storage medium
CN117991889A (en) Information interaction method, device, electronic equipment and storage medium
CN117376591A (en) Scene switching processing method, device, equipment and medium based on virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination