WO2017113488A1 - Method and apparatus for displaying a 2D application interface in a virtual reality device - Google Patents

Method and apparatus for displaying a 2D application interface in a virtual reality device

Info

Publication number
WO2017113488A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
displayed
texture
reality scene
virtual
Prior art date
Application number
PCT/CN2016/074160
Other languages
English (en)
French (fr)
Inventor
李立纲
Original Assignee
北京小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小鸟看看科技有限公司
Priority to US15/115,092 (US10902663B2)
Publication of WO2017113488A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Definitions

  • the present invention relates to the field of virtual reality, and in particular, to a method and apparatus for displaying a 2D application interface in a virtual reality device.
  • the virtual reality system is a computer simulation system that can create and experience a virtual world. It is an interactive 3D dynamic-vision and entity-behavior system simulation based on multi-source information fusion; it uses a computer to generate a simulated environment in which the user is immersed. For example, a user interacts with the virtual reality system by wearing a virtual reality device such as a virtual reality headset.
  • due to its open-source nature and complete ecosystem, Android has gradually become the operating system of choice for many virtual reality systems.
  • a virtual reality device must render a left-eye image and a right-eye image separately to produce a stereoscopic effect, whereas existing Android applications were not developed specifically for virtual reality devices and usually present a 2D application interface that cannot meet this requirement.
  • as a result, a large number of Android applications cannot be used in virtual reality systems, leading to a shortage of applications and a poor ecosystem for such systems.
  • the present invention has been made in order to provide a method and apparatus for displaying a 2D application interface in a virtual reality device that overcomes the above problems or at least partially solves the above problems.
  • a method for displaying a 2D application interface in a virtual reality device includes: acquiring textures of one or more 2D application interfaces to be displayed; determining a virtual reality scene to be displayed, and writing the virtual reality scene into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions; drawing the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene; and drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • acquiring the textures of the one or more 2D application interfaces to be displayed includes: applying for a corresponding layer for each of the one or more 2D application interfaces to be displayed; calling the SurfaceFlinger module responsible for display composition in the Android system; marking the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module; and compositing each layer using the GLES composition mode.
  • compositing the layers using the GLES composition mode includes: determining the display relationship of each layer; creating, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associating the frame buffer object with the texture object; and drawing the 2D application interfaces to be displayed in each layer into the texture object according to the layers' display relationship.
  • drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens includes: obtaining the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and drawing them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • determining the virtual reality scene to be displayed includes: obtaining user head-state data via the sensors of the virtual reality device, and determining the virtual reality scene to be displayed according to the user head-state data.
  • an apparatus for displaying a 2D application interface in a virtual reality device including:
  • a 2D application interface processing unit configured to acquire textures of one or more 2D application interfaces to be displayed
  • a virtual reality scene processing unit configured to determine a virtual reality scene to be displayed, and write the virtual reality scene into a frame buffer of the Android system by using an OpenGL function in a left and right split screen manner;
  • a drawing unit configured to draw the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene; and to draw the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • the 2D application interface processing unit includes:
  • a layer application module configured to separately apply a corresponding layer to the one or more 2D application interfaces to be displayed
  • the composition module is configured to call the SurfaceFlinger module responsible for display composition in the Android system, mark the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module, and composite each layer using the GLES composition mode.
  • the composition module is specifically configured to determine the display relationship of each layer; create, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associate the frame buffer object with the texture object; and, according to the layers' display relationship, draw the 2D application interface to be displayed in each layer into the texture object.
  • the drawing unit is configured to obtain the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and to draw them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • the virtual reality scene processing unit is configured to obtain user head state data by using the sensor of the virtual reality device, and determine a virtual reality scene to be displayed according to the user head state data.
  • the technical solution of the present invention addresses the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system by the following technical means: first, the textures of one or more 2D application interfaces to be displayed are acquired; the virtual reality scene to be displayed is then determined and written into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions; on the technical basis that the Android system reads and draws the content of the system frame buffer, the virtual reality scene is displayed on the left and right screens of the virtual reality device, forming a virtual screen in the virtual reality scene; finally, the acquired textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively, so that the 2D application interface is rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect. A large number of existing Android applications can thus be used in the virtual reality system at low cost and with a simple method, improving the ecosystem of virtual reality systems and making the solution suitable for practical use.
  • FIG. 1 is a flow chart showing a method of displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of an apparatus for displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention.
  • Referring to FIG. 1, a flowchart of a method for displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention is shown. As shown in FIG. 1, the method includes:
  • Step S110: acquire textures of one or more 2D application interfaces to be displayed.
  • in the Android system, a texture is the specific content of an application interface that needs to be displayed on the screen of the display device, loaded into graphics memory, such as an image in pvr format.
  • as a more complex example, a user may run application A, which on startup also launches an auxiliary advertising application B that occupies only the central portion of the screen for display. In this case, the multiple 2D application interfaces to be displayed must also be composited to obtain a composited texture.
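  • As a purely illustrative aside (the helper and its names below are not part of the patent), compositing a smaller interface such as application B into the center of the screen amounts to computing a centered destination rectangle for its layer:

```c
#include <assert.h>

typedef struct { int x, y, w, h; } Rect;

/* Compute the destination rectangle that centers a layer of size (lw, lh)
 * on a screen of size (sw, sh). Hypothetical helper for illustration only. */
Rect center_layer(int sw, int sh, int lw, int lh) {
    Rect r;
    r.w = lw;
    r.h = lh;
    r.x = (sw - lw) / 2;
    r.y = (sh - lh) / 2;
    return r;
}
```

The compositor would then draw application B's texture into this rectangle on top of application A's full-screen layer.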
  • Step S120: the virtual reality scene to be displayed is determined, and the virtual reality scene is written into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions.
  • Step S130: the content of the Android system's frame buffer is drawn onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene.
  • in the Android system, writing data of a specific format into the Android frame buffer (Framebuffer) means outputting display content to the screen. Therefore, no processing is performed on the data in the Framebuffer between step S120 and step S130; the data written into the Framebuffer in step S120 is simply displayed on the left and right screens of the virtual reality device, thereby forming a virtual screen in the virtual reality scene.
  • specifically, a mesh may be established for each of the left and right screens of the virtual reality system, so that in step S120, when the virtual reality scene is written into the Android system's frame buffer in a split-screen manner using OpenGL functions, the information needed to draw the scene's mesh is included, allowing the virtual reality scene to be drawn onto the respective meshes of the left and right screens for display.
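  • The left-right split-screen write of step S120 can be sketched, under illustrative assumptions (the struct and function names below are not from the patent), as computing two side-by-side viewports over the device framebuffer, one per eye:

```c
#include <assert.h>

typedef struct { int x, y, w, h; } Viewport;

/* Split a framebuffer of size (w, h) into side-by-side left-eye and
 * right-eye viewports; the left eye gets the left half, the right eye
 * the right half. Layout and names are illustrative assumptions. */
void split_screen(int w, int h, Viewport *left, Viewport *right) {
    left->x = 0;
    left->y = 0;
    left->w = w / 2;
    left->h = h;
    right->x = w / 2;
    right->y = 0;
    right->w = w / 2;
    right->h = h;
}
```

In an actual OpenGL renderer, each viewport would be selected with glViewport() before the scene mesh for that eye is drawn.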
  • Step S140: the acquired textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • for example, if the virtual reality scene is a living room, the virtual screen in the virtual reality scene may be a rear-projection television in the living room; if the virtual reality scene is a movie theater, the virtual screen may be the movie screen, and so on.
  • it can be seen that the method shown in FIG. 1 addresses the problem that existing 2D application interfaces cannot be rendered on a virtual reality device by the following technical means: first, the textures of one or more 2D application interfaces to be displayed are acquired; the virtual reality scene to be displayed is determined and written into the Android system's frame buffer in a left-right split-screen manner using OpenGL functions; on the technical basis that the Android system reads and draws the content of the system frame buffer, the virtual reality scene is displayed on the left and right screens of the virtual reality device, forming a virtual screen in the scene; finally, the acquired textures are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively, so that the 2D application interface is rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect. A large number of existing Android applications can thus be used in the virtual reality system at low cost and with a simple method, improving the ecosystem of virtual reality systems; the method is suitable for practical use.
  • in an embodiment of the present invention, step S110 of acquiring the textures of one or more 2D application interfaces to be displayed specifically includes: applying for a corresponding layer for each of the one or more 2D application interfaces to be displayed; calling the SurfaceFlinger module responsible for display composition in the Android system; marking the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module; and compositing each layer using the GLES composition mode.
  • as described above, in the Android system, when there are multiple 2D application interfaces that need to be displayed on the screen of the display device, the multiple 2D application interfaces to be displayed must be composited.
  • compositing 2D application interfaces in software requires calling the SurfaceFlinger module; before that, a Client class must be created, and a layer must then be requested from SurfaceFlinger.
  • in the Android system, 2D application interfaces can be composited either in hardware via Overlay or in software via SurfaceFlinger; the hardware approach is relatively simple but is not suitable for implementing the technical solution of the present invention. Therefore, this embodiment provides an implementation that calls SurfaceFlinger.
  • compositing each layer using the GLES composition mode includes: determining the display relationship of each layer; creating, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associating the frame buffer object with the texture object; and, according to the layers' display relationship, drawing the 2D application interfaces to be displayed in each layer into the texture object.
  • glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
  • glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, mScreenTexture, 0);
  • with this method, the composition result of the doComposeSurfaces() function can be drawn into the created texture object, thereby converting the 2D application interface into a texture object.
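  • The display relationship of the layers mentioned above is essentially a z-order: layers are drawn back to front so that later layers appear on top. A minimal sketch of this ordering step (the Layer struct and helper are hypothetical, not taken from the patent or from AOSP):

```c
#include <assert.h>

/* A layer with a z-order; a higher z is drawn later, i.e. on top. */
typedef struct { int z; int id; } Layer;

/* Sort layers by ascending z so that drawing them in array order
 * realizes the display relationship (painter's algorithm). */
void sort_by_display_order(Layer *layers, int n) {
    for (int i = 1; i < n; i++) { /* insertion sort keeps the sketch small */
        Layer key = layers[i];
        int j = i - 1;
        while (j >= 0 && layers[j].z > key.z) {
            layers[j + 1] = layers[j];
            j--;
        }
        layers[j + 1] = key;
    }
}
```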
  • that texture object is bound to another newly created frame buffer object. Specifically, in an embodiment of the present invention, drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens includes: obtaining the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and drawing them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • in the above embodiments, OpenGL ES functions may specifically be used both to finish drawing the texture onto the virtual screen and to perform the step of creating a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER and associating the frame buffer object with the texture object.
  • determining the virtual reality scene to be displayed includes: obtaining user head-state data via the sensors of the virtual reality device, and determining the virtual reality scene to be displayed according to the user head-state data.
  • for example, the user's head-state data, such as the head's angle and facing direction, is obtained from the sensor data of the virtual reality device; specifically, the values of the rotation vector may be obtained from a sensor such as the linear acceleration sensor, and from these data the different angles at which the virtual reality scene to be displayed should be presented when rendered on the screen of the virtual reality device are determined. In this way, the user feels present in a virtual scene, which greatly enriches the user experience and provides excellent immersive enjoyment.
  • FIG. 2 is a schematic structural diagram of an apparatus for displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention.
  • the apparatus 200 for displaying a 2D application interface in a virtual reality device includes:
  • the 2D application interface processing unit 210 is configured to acquire textures of one or more 2D application interfaces to be displayed.
  • the virtual reality scene processing unit 220 is configured to determine the virtual reality scene to be displayed, and to write the virtual reality scene into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions;
  • the drawing unit 230 is configured to draw the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene; and to draw the acquired textures of the 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • it can be seen that the device shown in FIG. 2 addresses the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system by the following technical means: first, the 2D application interface processing unit 210 acquires the textures of one or more 2D application interfaces to be displayed; the virtual reality scene processing unit 220 further determines the virtual reality scene to be displayed and writes it into the Android system's frame buffer in a left-right split-screen manner using OpenGL functions; the drawing unit 230, on the technical basis that the Android system reads and draws the content of the system frame buffer, displays the virtual reality scene on the left and right screens of the virtual reality device, forming a virtual screen in the scene; finally, the acquired textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively, so that the 2D application interface is rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect. A large number of existing Android applications can thus be used in the virtual reality system at low cost and with a simple method, improving the ecosystem of virtual reality systems; the device is suitable for practical use.
  • the 2D application interface processing unit 210 includes:
  • the layer application module 211 is configured to separately apply a corresponding layer for one or more 2D application interfaces to be displayed;
  • the composition module 212 is configured to call the SurfaceFlinger module responsible for display composition in the Android system, mark the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module, and composite each layer using the GLES composition mode.
  • the composition module 212 is specifically configured to determine the display relationship of each layer; create, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associate the frame buffer object with the texture object; and, according to the layers' display relationship, draw the 2D application interface to be displayed in each layer into the texture object.
  • the drawing unit 230 is configured to obtain the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and to draw them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
  • the virtual reality scene processing unit 220 is configured to obtain user head state data through a sensor of the virtual reality device, and determine a virtual reality scene to be displayed according to the user head state data.
  • in summary, the technical solution of the present invention addresses the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system by the following technical means: first, the textures of one or more 2D application interfaces to be displayed are acquired; the virtual reality scene to be displayed is then determined and written into the Android system's frame buffer in a left-right split-screen manner using OpenGL functions; on the technical basis that the Android system reads and draws the content of the system frame buffer, the virtual reality scene is displayed on the left and right screens of the virtual reality device, forming a virtual screen in the scene; finally, the acquired textures are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively, so that the 2D application interface is rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect. A large number of existing Android applications can thus be used in the virtual reality system at low cost and with a simple method, improving the ecosystem of virtual reality systems; the solution is suitable for practical use.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method and apparatus for displaying a 2D application interface in a virtual reality device. The method includes: acquiring textures of one or more 2D application interfaces to be displayed (S110); determining a virtual reality scene to be displayed, and writing the virtual reality scene into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions (S120); drawing the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene (S130); and drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively (S140). The method causes the 2D application interface to be rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect, so that a large number of existing Android applications can be used in virtual reality systems.

Description

Method and apparatus for displaying a 2D application interface in a virtual reality device
Technical Field
The present invention relates to the field of virtual reality, and in particular to a method and apparatus for displaying a 2D application interface in a virtual reality device.
Background of the Invention
A virtual reality system is a computer simulation system that can create and experience a virtual world. It is an interactive 3D dynamic-vision and entity-behavior system simulation based on multi-source information fusion; it uses a computer to generate a simulated environment in which the user is immersed. For example, a user interacts with the virtual reality system by wearing a virtual reality device such as a virtual reality headset.
In recent years, due to its open-source nature and complete ecosystem, Android has gradually become the operating system of choice for many virtual reality systems. However, a virtual reality device must render a left-eye image and a right-eye image separately to produce a stereoscopic effect, whereas existing Android applications were not developed specifically for virtual reality devices and usually present a 2D application interface that cannot meet this requirement. As a result, a vast number of Android applications cannot be used in virtual reality systems, leading to a shortage of applications and a poor ecosystem for such systems.
Summary of the Invention
In view of the above problems, the present invention is proposed to provide a method and apparatus for displaying a 2D application interface in a virtual reality device that overcomes the above problems or at least partially solves them.
According to one aspect of the present invention, a method for displaying a 2D application interface in a virtual reality device is provided, including: acquiring textures of one or more 2D application interfaces to be displayed; determining a virtual reality scene to be displayed, and writing the virtual reality scene into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions; drawing the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene; and drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
Optionally, acquiring the textures of one or more 2D application interfaces to be displayed includes: applying for a corresponding layer for each of the one or more 2D application interfaces to be displayed; calling the SurfaceFlinger module responsible for display composition in the Android system; marking the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module; and compositing each layer using the GLES composition mode.
Optionally, compositing each layer using the GLES composition mode includes: determining the display relationship of each layer; creating, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associating the frame buffer object with the texture object; and drawing the 2D application interfaces to be displayed in each layer into the texture object according to the layers' display relationship.
Optionally, drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens includes: obtaining the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and drawing them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
Optionally, determining the virtual reality scene to be displayed includes: obtaining user head-state data via the sensors of the virtual reality device, and determining the virtual reality scene to be displayed according to the user head-state data.
According to another aspect of the present invention, an apparatus for displaying a 2D application interface in a virtual reality device is provided, including:
a 2D application interface processing unit, configured to acquire textures of one or more 2D application interfaces to be displayed;
a virtual reality scene processing unit, configured to determine a virtual reality scene to be displayed, and to write the virtual reality scene into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions;
a drawing unit, configured to draw the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene; and to draw the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
Optionally, the 2D application interface processing unit includes:
a layer application module, configured to apply for a corresponding layer for each of the one or more 2D application interfaces to be displayed;
a composition module, configured to call the SurfaceFlinger module responsible for display composition in the Android system, mark the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module, and composite each layer using the GLES composition mode.
Optionally, the composition module is specifically configured to determine the display relationship of each layer; create, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associate the frame buffer object with the texture object; and, according to the layers' display relationship, draw the 2D application interface to be displayed in each layer into the texture object.
Optionally, the drawing unit is specifically configured to obtain the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and to draw them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
Optionally, the virtual reality scene processing unit is specifically configured to obtain user head-state data via the sensors of the virtual reality device, and to determine the virtual reality scene to be displayed according to the user head-state data.
As can be seen from the above, the technical solution of the present invention addresses the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system by the following technical means: first, the textures of one or more 2D application interfaces to be displayed are acquired; the virtual reality scene to be displayed is then determined and written into the Android system's frame buffer in a left-right split-screen manner using OpenGL functions; on the technical basis that the Android system reads and draws the content of the system frame buffer, the virtual reality scene is displayed on the left and right screens of the virtual reality device, forming a virtual screen in the virtual reality scene; finally, the acquired textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively, so that the 2D application interface is rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect. A large number of existing Android applications can thus be used in the virtual reality system at low cost and with a simple method, improving the ecosystem of virtual reality systems; the solution is suitable for practical use.
The above description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented in accordance with the content of the specification, and in order that the above and other objects, features, and advantages of the present invention may become more apparent, specific embodiments of the present invention are set forth below.
Brief Description of the Drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered limiting of the present invention. Throughout the drawings, the same reference numerals denote the same components. In the drawings:
FIG. 1 shows a flowchart of a method for displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention;
FIG. 2 shows a schematic structural diagram of an apparatus for displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention.
Detailed Description of the Embodiments
Exemplary embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the present disclosure, it should be understood that the present disclosure may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope conveyed completely to those skilled in the art.
Referring to FIG. 1, a flowchart of a method for displaying a 2D application interface in a virtual reality device according to an embodiment of the present invention is shown. As shown in FIG. 1, the method includes:
Step S110: acquire textures of one or more 2D application interfaces to be displayed.
In the Android system, a texture is the specific content of an application interface that needs to be displayed on the screen of the display device, loaded into graphics memory, such as an image in pvr format. Taking a more complex case as an example, a user runs application A, and when application A starts it also launches an auxiliary advertising application B, where application B occupies only the central portion of the screen for display. In this case, the multiple 2D application interfaces to be displayed must also be composited to obtain a composited texture.
Step S120: determine the virtual reality scene to be displayed, and write the virtual reality scene into the frame buffer of the Android system in a left-right split-screen manner using OpenGL functions.
Step S130: draw the content of the Android system's frame buffer onto the left and right screens of the virtual reality device, respectively, to form a virtual screen in the virtual reality scene.
In the Android system, writing data of a specific format into the Android frame buffer (Framebuffer) means outputting display content to the screen. Therefore, no processing is performed on the data in the Framebuffer between step S120 and step S130; the data written into the Framebuffer in step S120 is simply displayed on the left and right screens of the virtual reality device, thereby forming a virtual screen in the virtual reality scene. Specifically, a mesh may be established for each of the left and right screens of the virtual reality system, so that in step S120, when the virtual reality scene is written into the Android system's frame buffer in a split-screen manner using OpenGL functions, the information needed to draw the scene's mesh is included, allowing the virtual reality scene to be drawn onto the respective meshes of the left and right screens for display.
Step S140: draw the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens, respectively.
For example, if the virtual reality scene is a living room, the virtual screen in the scene may be a rear-projection television in the living room; if the scene is a movie theater, the virtual screen may be the movie screen, and so on.
It can be seen that, addressing the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system, the method shown in FIG. 1 adopts the following technical means: first, the textures of one or more 2D application interfaces to be displayed are acquired; the virtual reality scene to be displayed is then determined and written into the Android system's frame buffer in a left-right split-screen manner using OpenGL functions; on the technical basis that the Android system reads and draws the content of the system frame buffer, the virtual reality scene is displayed on the left and right screens of the virtual reality device, forming a virtual screen in the virtual reality scene; finally, the acquired textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screen in the virtual reality scene on the left and right screens, respectively, so that the 2D application interface is rendered as left-eye and right-eye images simultaneously, with a stereoscopic effect. A large number of existing Android applications can thus be used in the virtual reality system at low cost and with a simple method, improving the ecosystem of virtual reality systems; the method is suitable for practical use.
In an embodiment of the present invention, in the method shown in FIG. 1, step S110 of acquiring the textures of one or more 2D application interfaces to be displayed specifically includes: applying for a corresponding layer for each of the one or more 2D application interfaces to be displayed; calling the SurfaceFlinger module responsible for display composition in the Android system; marking the composition mode as GLES composition in the setUpHWComposer() function of the SurfaceFlinger module; and compositing each layer using the GLES composition mode.
As described above, in the Android system, when multiple 2D application interfaces need to be displayed on the screen of the display device, the multiple interfaces to be displayed must be composited. Compositing 2D application interfaces in software requires calling the SurfaceFlinger module; before that, a Client class must be created, and a layer must then be requested from SurfaceFlinger. In the Android system, 2D application interfaces can be composited either in hardware via Overlay or in software via SurfaceFlinger; the hardware approach is relatively simple but is not suitable for implementing the technical solution of the present invention, so this embodiment provides an implementation that calls SurfaceFlinger. Likewise, in the process of calling SurfaceFlinger, the following flow is executed: the preComposition() function is called to prepare for composition; the rebuildLayerStacks() function is called to rebuild the stack of layers (Layer) to be composited; and the setUpHWComposer() function is called to set the composition mode. In the prior art, for the sake of composition efficiency, SurfaceFlinger almost always uses the HWC composition mode to composite the layers, but this likewise does not suit the technical solution of this embodiment; therefore, in this embodiment, the composition mode must be marked as GLES composition.
After the above flow, SurfaceFlinger further calls the doDebugFlashRegions() function for debugging and then enters the most important composition processing. SurfaceFlinger calls the doComposition() function to perform composition; during its execution, the doDisplayComposition() function is called to complete composition for each display device, so as to support display on multiple display devices such as tablets and smartphones. During its execution, doDisplayComposition() further calls the doComposeSurfaces() function to draw the content of each Layer onto the Android system's Framebuffer. In the technical solution of the present invention, however, since the virtual reality scene must be written onto the Android system's Framebuffer, the one or more 2D application interfaces to be displayed cannot be processed in this way.
Therefore, specifically, in an embodiment of the present invention, compositing each layer using the GLES composition mode in the above method includes: determining the display relationship of each layer; creating, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associating the frame buffer object with the texture object; and, according to the layers' display relationship, drawing the 2D application interfaces to be displayed in each layer into the texture object.
A code example of a specific implementation that creates a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER and associates the frame buffer object with the texture object is given below:
// Generate a Texture object, bind it to GL_TEXTURE_2D, and initialize it
glGenTextures(1, &mScreenTexture);
glBindTexture(GL_TEXTURE_2D, mScreenTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
// Generate a Framebuffer object and bind it to GL_FRAMEBUFFER
glGenFramebuffers(1, &mFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, mFramebuffer);
// Associate the Texture with the Framebuffer
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, mScreenTexture, 0);
With this method, the composition result of the doComposeSurfaces() function can be drawn into the created texture object, thereby converting the 2D application interface into a texture object.
Those skilled in the art should understand that the example given in this embodiment of compositing each layer using the GLES composition mode does not limit other implementations. The advantage of the implementation in this embodiment is that the 2D application interfaces to be displayed in each layer are drawn, according to their display relationship, into the newly created texture object without using the Android system's Framebuffer.
That texture object is bound to another newly created frame buffer object (Framebuffer). Therefore, specifically, in an embodiment of the present invention, drawing the acquired textures of the one or more 2D application interfaces to be displayed onto the virtual screen in the virtual reality scene on the left and right screens includes: obtaining the textures of the one or more 2D application interfaces to be displayed from the frame buffer object associated with the texture object, and drawing them, using OpenGL functions, onto the virtual screen in the virtual reality scene on the left and right screens, respectively. In all of the above embodiments, OpenGL ES functions may specifically be used to complete the steps of drawing the texture onto the virtual screen, creating a texture object bound to GL_TEXTURE_2D and a frame buffer object bound to GL_FRAMEBUFFER, and associating the frame buffer object with the texture object.
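The final draw of the composited texture onto the virtual screen can be sketched as building a textured quad per eye (an illustrative simplification; the struct and helper below are not from the patent):

```c
#include <assert.h>

/* Vertex layout: x, y in viewport pixels; u, v texture coordinates in [0, 1]. */
typedef struct { float x, y, u, v; } Vertex;

/* Fill a quad that maps the full composited interface texture onto a
 * virtual-screen rectangle at (x, y) of size (w, h) inside one eye's
 * viewport; it would be drawn once per eye with that eye's viewport active. */
void virtual_screen_quad(float x, float y, float w, float h, Vertex q[4]) {
    q[0] = (Vertex){ x,     y,     0.0f, 0.0f }; /* bottom-left  */
    q[1] = (Vertex){ x + w, y,     1.0f, 0.0f }; /* bottom-right */
    q[2] = (Vertex){ x + w, y + h, 1.0f, 1.0f }; /* top-right    */
    q[3] = (Vertex){ x,     y + h, 0.0f, 1.0f }; /* top-left     */
}
```

With OpenGL ES, such a quad would be submitted with the texture object created above bound to GL_TEXTURE_2D, sampling the interface content produced in the associated frame buffer object.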
In one embodiment of the present invention, in the above method, determining the virtual reality scene to be displayed comprises: obtaining head state data of the user via a sensor of the virtual reality device, and determining the virtual reality scene to be displayed according to the head state data.
For example, the user's head state data, such as the head's angle and facing direction, is obtained from the sensor data of the virtual reality device; specifically, the value of the rotation vector can be obtained from, for example, a linear acceleration sensor, and from these data the different angles at which the virtual reality scene to be displayed should be rendered on the screen of the virtual reality device are determined. In this way, the user feels immersed in a virtual scene, which greatly enriches the user experience and provides an excellent sense of presence.
Fig. 2 shows a schematic structural diagram of an apparatus for displaying a 2D application interface in a virtual reality device according to one embodiment of the present invention. As shown in Fig. 2, the apparatus 200 for displaying a 2D application interface in a virtual reality device comprises:
a 2D application interface processing unit 210, configured to obtain textures of one or more 2D application interfaces to be displayed; a virtual reality scene processing unit 220, configured to determine the virtual reality scene to be displayed, and to write the virtual reality scene into the framebuffer of the Android system in a left-right split-screen manner using OpenGL functions;
a drawing unit 230, configured to draw the content of the Android system's framebuffer onto the left and right screens of the virtual reality device respectively, forming virtual screens in the virtual reality scene; and to draw the obtained textures of the one or more 2D application interfaces to be displayed onto the virtual screens in the virtual reality scenes of the left and right screens respectively.
It can be seen that, to address the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system, the apparatus shown in Fig. 2 adopts the following technical means: first, the 2D application interface processing unit 210 obtains textures of one or more 2D application interfaces to be displayed; the virtual reality scene processing unit 220 then determines the virtual reality scene to be displayed and writes it into the framebuffer of the Android system in a left-right split-screen manner using OpenGL functions; the drawing unit 230, exploiting the fact that the Android system reads the content of the system framebuffer for drawing, displays the virtual reality scene on the left and right screens of the virtual reality device, forming virtual screens in the virtual reality scene; finally, the obtained textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screens in the virtual reality scenes of the left and right screens respectively, so that the 2D application interface is rendered simultaneously for the left and right eyes with a stereoscopic effect. As a result, a large number of existing Android applications can be used in virtual reality systems, at low cost and with a simple method, improving the ecosystem of virtual reality systems and being suitable for practical use.
In one embodiment of the present invention, in the apparatus shown in Fig. 2, the 2D application interface processing unit 210 comprises:
a layer request module 211, configured to request a corresponding layer for each of the one or more 2D application interfaces to be displayed;
a composition module 212, configured to call the SurfaceFlinger module responsible for display composition in the Android system, mark the composition mode as the GLES composition mode in the setUpHWComposer() function of the SurfaceFlinger module, and composite the layers using the GLES composition mode.
In one embodiment of the present invention, in the above apparatus, the composition module 212 is specifically configured to: determine the display relationship among the layers; create, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a framebuffer object bound to GL_FRAMEBUFFER, and associate the framebuffer object with the texture object; and draw the 2D application interfaces to be displayed in the layers into the texture object according to the display relationship among the layers.
In one embodiment of the present invention, in the above apparatus, the drawing unit 230 is specifically configured to obtain the textures of the one or more 2D application interfaces to be displayed from the framebuffer object associated with the texture object, and to draw them, using OpenGL functions, onto the virtual screens in the virtual reality scenes of the left and right screens respectively.
In one embodiment of the present invention, in the above apparatus, the virtual reality scene processing unit 220 is specifically configured to obtain head state data of the user via a sensor of the virtual reality device, and to determine the virtual reality scene to be displayed according to the head state data.
It should be noted that the specific implementations of the above apparatus embodiments are the same as those of the corresponding method embodiments described above, and are not repeated here.
In summary, to address the problem that existing 2D application interfaces cannot be rendered on a virtual reality device in a virtual reality system, the technical solution of the present invention adopts the following technical means: first, textures of one or more 2D application interfaces to be displayed are obtained, the virtual reality scene to be displayed is determined, and the scene is written into the framebuffer of the Android system in a left-right split-screen manner using OpenGL functions; then, exploiting the fact that the Android system reads the content of the system framebuffer for drawing, the virtual reality scene is displayed on the left and right screens of the virtual reality device, forming virtual screens in the virtual reality scene; finally, the obtained textures of the one or more 2D application interfaces to be displayed are drawn onto the virtual screens in the virtual reality scenes of the left and right screens respectively, so that the 2D application interface is rendered simultaneously for the left and right eyes with a stereoscopic effect. As a result, a large number of existing Android applications can be used in virtual reality systems, at low cost and with a simple method, improving the ecosystem of virtual reality systems and being suitable for practical use.
The above are merely preferred embodiments of the present invention and are not intended to limit the scope of protection of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (10)

  1. A method for displaying a 2D application interface in a virtual reality device, wherein the method comprises:
    obtaining textures of one or more 2D application interfaces to be displayed;
    determining a virtual reality scene to be displayed, and writing the virtual reality scene into a framebuffer of an Android system in a left-right split-screen manner using OpenGL functions;
    drawing content of the framebuffer of the Android system onto left and right screens of the virtual reality device respectively, forming virtual screens in the virtual reality scene;
    drawing the obtained textures of the one or more 2D application interfaces to be displayed onto the virtual screens in the virtual reality scene of the left and right screens respectively.
  2. The method according to claim 1, wherein obtaining textures of one or more 2D application interfaces to be displayed comprises:
    requesting a corresponding layer for each of the one or more 2D application interfaces to be displayed;
    calling a SurfaceFlinger module responsible for display composition in the Android system, marking the composition mode as a GLES composition mode in a setUpHWComposer() function of the SurfaceFlinger module, and compositing the layers using the GLES composition mode.
  3. The method according to claim 2, wherein compositing the layers using the GLES composition mode comprises:
    determining a display relationship among the layers;
    creating, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a framebuffer object bound to GL_FRAMEBUFFER, and associating the framebuffer object with the texture object;
    drawing the 2D application interfaces to be displayed in the layers into the texture object according to the display relationship among the layers.
  4. The method according to claim 3, wherein drawing the obtained textures of the one or more 2D application interfaces to be displayed onto the virtual screens in the virtual reality scene of the left and right screens respectively comprises:
    obtaining the textures of the one or more 2D application interfaces to be displayed from the framebuffer object associated with the texture object, and drawing them, using OpenGL functions, onto the virtual screens in the virtual reality scene of the left and right screens respectively.
  5. The method according to claim 4, wherein determining the virtual reality scene to be displayed comprises: obtaining head state data of a user via a sensor of the virtual reality device, and determining the virtual reality scene to be displayed according to the head state data.
  6. An apparatus for displaying a 2D application interface in a virtual reality device, wherein the apparatus comprises:
    a 2D application interface processing unit, configured to obtain textures of one or more 2D application interfaces to be displayed;
    a virtual reality scene processing unit, configured to determine a virtual reality scene to be displayed, and to write the virtual reality scene into a framebuffer of an Android system in a left-right split-screen manner using OpenGL functions;
    a drawing unit, configured to draw content of the framebuffer of the Android system onto left and right screens of the virtual reality device respectively, forming virtual screens in the virtual reality scene; and to draw the obtained textures of the one or more 2D application interfaces to be displayed onto the virtual screens in the virtual reality scene of the left and right screens respectively.
  7. The apparatus according to claim 6, wherein the 2D application interface processing unit comprises:
    a layer request module, configured to request a corresponding layer for each of the one or more 2D application interfaces to be displayed;
    a composition module, configured to call a SurfaceFlinger module responsible for display composition in the Android system, mark the composition mode as a GLES composition mode in a setUpHWComposer() function of the SurfaceFlinger module, and composite the layers using the GLES composition mode.
  8. The apparatus according to claim 7, wherein
    the composition module is specifically configured to: determine a display relationship among the layers; create, via OpenGL functions, a texture object bound to GL_TEXTURE_2D and a framebuffer object bound to GL_FRAMEBUFFER, and associate the framebuffer object with the texture object; and draw the 2D application interfaces to be displayed in the layers into the texture object according to the display relationship among the layers.
  9. The apparatus according to claim 8, wherein
    the drawing unit is specifically configured to obtain the textures of the one or more 2D application interfaces to be displayed from the framebuffer object associated with the texture object, and to draw them, using OpenGL functions, onto the virtual screens in the virtual reality scene of the left and right screens respectively.
  10. The apparatus according to claim 9, wherein
    the virtual reality scene processing unit is specifically configured to obtain head state data of a user via a sensor of the virtual reality device, and to determine the virtual reality scene to be displayed according to the head state data.
PCT/CN2016/074160 2015-12-31 2016-02-19 Method and apparatus for displaying 2D application interface in virtual reality device WO2017113488A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/115,092 US10902663B2 (en) 2015-12-31 2016-02-19 Method and apparatus for displaying 2D application interface in virtual reality device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201511026678.8A CN105447898B (zh) 2015-12-31 2015-12-31 Method and apparatus for displaying 2D application interface in virtual reality device
CN201511026678.8 2015-12-31

Publications (1)

Publication Number Publication Date
WO2017113488A1 true WO2017113488A1 (zh) 2017-07-06

Family

ID=55558029

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/074160 WO2017113488A1 (zh) 2015-12-31 2016-02-19 Method and apparatus for displaying 2D application interface in virtual reality device

Country Status (3)

Country Link
US (1) US10902663B2 (zh)
CN (1) CN105447898B (zh)
WO (1) WO2017113488A1 (zh)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266546A * 2008-05-12 2008-09-17 深圳华为通信技术有限公司 Method for realizing three-dimensional display of an operating system, and three-dimensional operating system
WO2012031406A1 * 2010-09-10 2012-03-15 青岛海信信芯科技有限公司 Display method and device for 3D TV interface
CN102937968A * 2012-10-11 2013-02-20 上海交通大学 Canvas-based binocular 3D web page implementation method and system
CN103081002A * 2010-08-10 2013-05-01 索尼公司 2D-to-3D user interface content data conversion
CN105192982A * 2015-09-07 2015-12-30 北京小鸟看看科技有限公司 Image correction method and system for adjustable virtual reality helmet

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2494434A (en) * 2011-09-08 2013-03-13 Sharp Kk Conversion of Graphics for Multi-View 3D Displays
US20160188279A1 (en) * 2014-12-27 2016-06-30 Intel Corporation Mode-switch protocol and mechanism for hybrid wireless display system with screencasting and native graphics throwing
US9886086B2 (en) * 2015-08-21 2018-02-06 Verizon Patent And Licensing Inc. Gesture-based reorientation and navigation of a virtual reality (VR) interface


Also Published As

Publication number Publication date
US20180018806A1 (en) 2018-01-18
CN105447898B (zh) 2018-12-25
US10902663B2 (en) 2021-01-26
CN105447898A (zh) 2016-03-30


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 15115092

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16880297

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16880297

Country of ref document: EP

Kind code of ref document: A1