WO2023216619A1 - 3D display method and 3D display device - Google Patents

3D display method and 3D display device Download PDF

Info

Publication number
WO2023216619A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
data
display
target user
gyroscope
Prior art date
Application number
PCT/CN2022/143110
Other languages
English (en)
French (fr)
Inventor
贺曙
徐万良
Original Assignee
广东未来科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东未来科技有限公司 filed Critical 广东未来科技有限公司
Publication of WO2023216619A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

Definitions

  • This application belongs to the field of naked-eye 3D display, and particularly relates to a 3D display method and a 3D display device.
  • Naked-eye 3D technology is implemented based on the parallax of the human eyes; that is, when the left eye and the right eye observe the same target, there are differences between the images they see.
  • The image seen by the left eye and the image seen by the right eye are fused in the brain into the 3D content we perceive; thus, by performing processing on the screen so that images with parallax are mapped to the viewer's left eye and right eye respectively, the viewer perceives a 3D image.
  • A 3D game engine encapsulates complex graphics algorithms inside its modules and provides a simple and effective SDK interface to the outside. However, at present games are not displayed through naked-eye 3D technology, so the naked-eye 3D display effect of a game's display screen has not been achieved.
  • the purpose of this application is to provide a 3D display method and a 3D display device.
  • The display image corresponding to the layer marked as 3D display mode is interleaved through the interleaving program created by the rendering engine and then rendered to the display screen to present a naked-eye 3D effect.
  • The first aspect of the embodiments of the present application provides a 3D display method, including: if the display image corresponding to the current layer is rendered to the screen of a 3D display device, judging whether the current layer is marked as 3D display mode; if the current layer is marked as the 3D display mode, acquiring the physical parameters of the 3D display device; calling an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device, to obtain the 3D display image corresponding to the current layer, the interleaving program being created by the rendering engine corresponding to the 3D display device; and rendering and then displaying the 3D display image.
  • The second aspect of the present application provides a 3D display device, including:
  • a judgment unit, configured to judge whether the current layer is marked as 3D display mode if the display image corresponding to the current layer is rendered to the screen of the 3D display device;
  • an acquisition unit, configured to acquire the physical parameters of the 3D display device if the current layer is marked as the 3D display mode;
  • an interleaving unit, configured to call an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device to obtain the 3D display image corresponding to the current layer, the interleaving program being created by the rendering engine corresponding to the 3D display device;
  • a display unit, configured to render and then display the 3D display image.
  • Compared with the related art, in the embodiments provided by the present application, when performing 3D display, the display image corresponding to the current layer can be interleaved according to the physical parameters of the 3D display device through an additionally created interleaving program, and then rendered and displayed.
  • Figure 1 is a schematic flowchart of a 3D display method provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of the virtual structure of a 3D display device provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of the hardware structure of a 3D display device provided by an embodiment of the present application.
  • Figure 1 is a schematic flow chart of the 3D display method provided by the embodiment of the present application, including:
  • Step 101: If the display image corresponding to the current layer is rendered to the screen of the 3D display device, determine whether the current layer is marked as 3D display mode; if so, perform step 102.
  • In this embodiment, when the 3D display device renders the display image corresponding to the current layer and displays it on the screen, it can determine whether the current layer is marked as 3D display mode. If the current layer is marked as 3D display mode, step 102 is executed; if not, rendering proceeds in the normal way.
  • It should be noted that the 3D display device can apply for a corresponding layer for the object to be displayed in advance.
  • Specifically, the display module corresponding to the 3D display device can be called to create the layer; this display module differs between operating systems.
  • For example, the display module in the Android system is SurfaceFlinger.
  • It can also be the display module of another operating system, such as iOS.
  • The following uses the Android system as an example:
  • When the 3D display device creates a layer through the SurfaceFlinger display module, it first passes the data package name corresponding to the target object (the target object may be a game or a video, without specific limitation) to SurfaceFlinger and determines the target layer corresponding to the target object to be rendered.
  • It then judges whether the data package name of the target object exists in a preset package-name list (the list contains the data package names of multiple objects, including the target object; every object in the list is one for which 3D display mode should be turned on).
  • Taking a game as an example, the SurfaceFlinger display module determines from the preset package-name list whether the current game needs to turn on 3D display mode.
  • If the game's data package name is stored in the preset package-name list, it is determined that the game needs to turn on 3D display mode, i.e., the final display effect of the game is a 3D display effect; the target layer corresponding to the game is then marked as 3D display mode, and the SurfaceFlinger display module is set to use OpenGL ES to perform window overlay processing on the target layer.
  • In this embodiment, when the 3D display device determines that the current layer is marked as 3D display mode, it can obtain the physical parameters of the 3D display device.
  • The way of obtaining these physical parameters is not limited here; for example, a prompt may be issued for the user to input them, or they may be obtained in other ways.
  • The physical parameters refer to the fitting angle and width of the grating corresponding to the 3D display device; they may also include other parameters related to naked-eye 3D display, such as the viewpoint width of the grating, without specific limitation.
  • When the SurfaceFlinger display module uses OpenGL ES to merge layers, it creates a default OpenGL program P1 in the rendering engine (RenderEngine) to render each layer to the screen corresponding to the 3D display device.
  • If the 3D display device determines that the current layer is marked as 3D display mode, it can call the interleaving program created by the rendering engine corresponding to the 3D display device to interleave the current layer according to the physical parameters of the 3D display device, obtaining the 3D display image corresponding to the current layer.
  • When a layer marked as 3D display mode is rendered for the first time, RenderEngine creates an additional interleaving program P2 to interleave the layer according to the physical parameters of the 3D display device before rendering it to the screen. Understandably, the additional interleaving program only needs to be built on the first render, not on every render.
  • The execution order of step 102 is not limited here: it can be executed at the same time as step 101, before step 101, or after it, as long as the physical parameters of the 3D display device are obtained before the interleaving program is called to interleave the display image corresponding to the current layer.
  • After interleaving, the interleaved display image can be rendered and displayed on the screen of the 3D display device, achieving the 3D display effect of the target object.
  • the 3D display device can also implement human eye tracking naked eye 3D display, as detailed below:
  • the target user is the user currently viewing the screen of the 3D display device
  • the 3D display image displayed after rendering is adjusted according to the final human eye angle data.
  • When the target user is viewing the naked-eye 3D image on the screen, the eye position data of the target user at the current moment and the gyroscope data corresponding to the 3D display device can be determined.
  • The eye position data of the target user includes observation angle data and observation distance data.
  • When determining the target user's eye position data at the current moment, the 3D display device can obtain the target user's face image through a camera and determine the left pupil position and the right pupil position in the face image; from the deviation of the left and right pupils relative to the picture origin, the horizontal viewing angle and the vertical viewing angle are calculated, where A is viewing-angle calibration constant data; the observation angle is then calculated from the horizontal and vertical viewing angles, and the observation distance data is calculated from the observation angle and the target user's interpupillary distance (which can be computed from the left pupil position and the right pupil position), where B is viewing-distance calibration constant data used to represent the left-right interpupillary distance.
  • The deviation of the left and right pupils relative to the picture origin refers to the deviation of the midpoint between the left and right pupils from the picture origin.
  • When the 3D display device determines the gyroscope data corresponding to the 3D display device, it can obtain the stored first pitch angle value V1 of the gyroscope (the 3D display device stores the gyroscope's pitch angle value whenever it changes; the first pitch angle value refers to the most recently stored one), listen for the refresh command corresponding to the 3D display device, and calculate the second pitch angle value V2 of the gyroscope when the screen of the 3D display device refreshes; finally, the pitch angle change value of the gyroscope (that is, the gyroscope data) is obtained from the first pitch angle value V1 and the second pitch angle value V2.
  • The pitch angle values can be expressed in a three-dimensional standard coordinate system, whose spatial coordinates intuitively reflect the angular relationship between the human eyes and the 3D display device.
  • The gyroscope can output the attitude information of the 3D display device at high frequency over a short period.
  • Taking advantage of this, the pitch angle change of the gyroscope between two adjacent eye detections is calculated to output the pitch state (i.e., attitude information) of the 3D display device instantly, compensating for the blank time between camera frames.
  • After the eye position data and the gyroscope data are determined, the final human eye angle data corresponding to the target user can be determined from them; that is, the initial human eye position data and the pitch angle change value are superimposed to obtain the final human eye angle data, so that the 3D display device can adjust the 3D display screen through the stereoscopic game engine based on this data.
  • The angle data between the target user's eyes and the screen of the 3D display device is V0, which represents the angular relationship, in the three-dimensional standard coordinate system, between the current target user's eyes and the screen; the final human eye angle data is Vx, calculated as Vx = V0 + Vp, where Vp is the gyroscope's pitch angle change value.
  • Finally, the rendered and displayed 3D display image is adjusted according to the final human eye angle data, so that the 3D display image can be observed whether the target user moves or the 3D display device moves.
  • Specifically, the 3D display device rotates or moves the 3D display image according to the final human eye angle data Vx.
  • The rotation angle of the 3D display image is opposite in direction to the change angle of the human eye position and has a linear relationship with it; the ratio is the distance from the human eyes to the screen to the actual depth of field of the scene in the 3D display image, multiplied by an adjustment parameter, to give the user a more realistic 3D virtual scene experience.
  • When a user plays a naked-eye 3D game on the 3D display device, touching the screen inevitably causes the device to shake slightly; the gyroscope 13 is therefore used to assist in capturing the position of the human eyes. Even during shaking, the gyroscope can transmit the device's pitch angle change value in real time so that the human eyes are detected in real time and the user can always view the 3D display image.
  • the display image corresponding to the current layer can be interleaved according to the physical parameters of the 3D display device through an additionally created interleaving program, and rendered and displayed.
  • Figure 2 is a schematic diagram of the virtual structure of a 3D display device provided by an embodiment of the present application.
  • the 3D display device 200 includes:
  • the judgment unit 201 is used to judge whether the current layer is marked as a 3D display mode if the display image corresponding to the current layer is rendered to the screen of the 3D display device;
  • the acquisition unit 202 is configured to acquire the physical parameters of the 3D display device if the current layer is marked as the 3D display mode;
  • the interleaving unit 203 is used to call an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device, to obtain the 3D display image corresponding to the current layer, the interleaving program being created by the rendering engine corresponding to the 3D display device;
  • the display unit 204 is used to render and display the 3D display image.
  • the interleaving unit 203 is also used to:
  • call the display module corresponding to the 3D display device to create a layer; determine the target layer corresponding to a target object, the target object being the object to be rendered; judge whether the data package name of the target object exists in a preset package-name list; and, if so, mark the target layer as the 3D display mode.
  • the interleaving unit 203 is also used to:
  • determine the eye position data of the target user at the current moment and the gyroscope data corresponding to the 3D display device, the target user being the user currently viewing the screen of the 3D display device;
  • determine the final human eye angle data corresponding to the target user according to the eye position data and the gyroscope data, and adjust the rendered and displayed 3D display image according to the final human eye angle data.
  • the target user's eye position data includes observation angle data and observation distance data.
  • the interleaving unit 203 determining the target user's eye position data at the current moment includes:
  • determining the left pupil position and the right pupil position of the target user at the current moment; determining the horizontal and vertical viewing angles according to the offset of the left and right pupils from the picture origin and the two pupil positions; determining the observation angle data according to the horizontal and vertical viewing angles; and calculating the observation distance data based on the observation angle data and the interpupillary distance of the target user.
  • the interleaving unit 203 determining the gyroscope data corresponding to the 3D display device includes:
  • obtaining the stored first pitch angle value of the gyroscope; determining the second pitch angle value of the gyroscope when the refresh command corresponding to the 3D display device is detected; and determining the gyroscope data corresponding to the 3D display device according to the first pitch angle value and the second pitch angle value.
  • FIG. 3 is a schematic diagram of the hardware structure of the 3D display device provided by the embodiment of the present application.
  • the 3D display device 300 includes:
  • a receiver 301, a transmitter 302, a processor 303, and a memory 304 (the number of processors 303 in the 3D display device 300 may be one or more; one processor is taken as an example in Figure 3).
  • the receiver 301, the transmitter 302, the processor 303 and the memory 304 may be connected through a bus or other means, wherein the connection through the bus is taken as an example in Figure 3.
  • The memory 304 may include read-only memory and random access memory, and provides instructions and data to the processor 303. A portion of the memory 304 may also include NVRAM.
  • the memory 304 stores an operating system and operating instructions, executable modules or data structures, or a subset thereof, or an extended set thereof, where the operating instructions may include various operating instructions for implementing various operations.
  • the operating system may include various system programs that are used to implement various basic services and handle hardware-based tasks.
  • the processor 303 controls the operation of the 3D display device, and the processor 303 may also be called a CPU.
  • In specific applications, the various components of the 3D display device are coupled together through a bus system.
  • In addition to the data bus, the bus system may also include a power bus, a control bus, a status signal bus, and so on.
  • For clarity, however, the various buses are all referred to as the bus system in the figure.
  • the 3D display method disclosed in the above embodiments of the present application can be applied to the processor 303 or implemented by the processor 303 .
  • the processor 303 may be an integrated circuit chip with signal processing capabilities. During the implementation process, each step of the method shown in FIG. 1 can be completed by instructions in the form of hardware integrated logic circuits or software in the processor 303 .
  • The above-mentioned processor 303 may be a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute each method, step, and logical block diagram disclosed in the embodiments of this application.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory 304.
  • the processor 303 reads the information in the memory 304 and completes the steps of the above method in combination with its hardware.
  • Embodiments of the present application also provide a computer-readable medium that includes computer-executable instructions.
  • the computer-executable instructions enable the server to execute the 3D display method described in the above embodiments.
  • the implementation principles and technical effects are similar and will not be described again here.
  • the device embodiments described above are only illustrative.
  • The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units.
  • They can be located in one place or distributed across multiple network units; some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the connection relationship between modules indicates that there are communication connections between them, which can be specifically implemented as one or more communication buses or signal lines.
  • The present application can be implemented by software plus the necessary general-purpose hardware; of course, it can also be implemented by dedicated hardware, including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and so on. In general, any function performed by a computer program can easily be implemented with corresponding hardware, and the specific hardware structures used to implement the same function can also be diverse, such as analog circuits, digital circuits, or dedicated circuits. However, for this application, a software implementation is the better choice in most cases. Based on this understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product.
  • The computer software product is stored in a readable storage medium, such as a computer floppy disk, a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disc, and includes a number of instructions to cause a computer device (which can be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments of the present application.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means.
  • the computer-readable storage medium may be any available medium that a computer can store, or a data storage device such as a server or data center integrated with one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media (eg, Solid State Disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A 3D display method and a related device are provided, in which the display image corresponding to a layer marked as 3D display mode is interleaved by an interleaving program created by a rendering engine and then rendered to the display screen to present a naked-eye 3D effect. The method includes: if the display image corresponding to the current layer is rendered to the screen of a 3D display device, determining whether the current layer is marked as 3D display mode (101); if the current layer is marked as 3D display mode, acquiring the physical parameters of the 3D display device (102); calling an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device to obtain the 3D display image corresponding to the current layer (103), the interleaving program being created by the rendering engine corresponding to the 3D display device; and rendering and then displaying the 3D display image (104).

Description

3D display method and 3D display device
Technical Field
This application belongs to the field of naked-eye 3D display, and particularly relates to a 3D display method and a 3D display device.
Background Art
Naked-eye 3D technology is implemented based on the parallax of the human eyes: when a person's left eye and right eye observe the same target, there is a difference between the images they see, and the image seen by the left eye and the image seen by the right eye are fused in the brain into the 3D content we perceive. Therefore, by performing certain processing on the screen so that images with parallax are mapped to the viewer's left eye and right eye respectively, the viewer perceives a 3D image.
A 3D game engine encapsulates complex graphics algorithms inside its modules and provides a simple and effective SDK interface to the outside.
However, at present games are not displayed through naked-eye 3D technology, and the naked-eye 3D display effect of a game's display screen has not been achieved.
Technical Problem
The purpose of this application is to provide a 3D display method and a 3D display device, in which the display image corresponding to a layer marked as 3D display mode is interleaved by an interleaving program created by a rendering engine and then rendered to the display screen to present a naked-eye 3D effect.
Technical Solution
A first aspect of the embodiments of this application provides a 3D display method, including:
if the display image corresponding to the current layer is rendered to the screen of a 3D display device, judging whether the current layer is marked as 3D display mode;
if the current layer is marked as the 3D display mode, acquiring the physical parameters of the 3D display device;
calling an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device, to obtain the 3D display image corresponding to the current layer, the interleaving program being created by the rendering engine corresponding to the 3D display device;
rendering and then displaying the 3D display image.
A second aspect of this application provides a 3D display device, including:
a judgment unit, configured to judge whether the current layer is marked as 3D display mode if the display image corresponding to the current layer is rendered to the screen of the 3D display device;
an acquisition unit, configured to acquire the physical parameters of the 3D display device if the current layer is marked as the 3D display mode;
an interleaving unit, configured to call an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device to obtain the 3D display image corresponding to the current layer, the interleaving program being created by the rendering engine corresponding to the 3D display device;
a display unit, configured to render and then display the 3D display image.
Beneficial Effects
Compared with the related art, in the embodiments provided by this application, when performing 3D display, the 3D display apparatus can interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device through an additionally created interleaving program, and then render and display it.
Brief Description of the Drawings
Figure 1 is a schematic flowchart of the 3D display method provided by an embodiment of this application;
Figure 2 is a schematic diagram of the virtual structure of the 3D display device provided by an embodiment of this application;
Figure 3 is a schematic diagram of the hardware structure of the 3D display device provided by an embodiment of this application.
Best Mode for Carrying Out the Invention
The technical solutions in the embodiments of this application will be described below clearly and completely with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are only some, rather than all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
The 3D display method provided by the embodiments of this application is described below from the perspective of the 3D display device. Please refer to Figure 1, a schematic flowchart of the 3D display method provided by an embodiment of this application, which includes:
101. If the display image corresponding to the current layer is rendered to the screen of the 3D display device, determine whether the current layer is marked as 3D display mode; if so, execute step 102.
In this embodiment, when the 3D display device renders the display image corresponding to the current layer and displays it on the screen, it can determine whether the current layer is marked as 3D display mode. If the current layer is marked as 3D display mode, step 102 is executed; if not, rendering proceeds in the normal way.
It should be noted that the 3D display device can apply for a corresponding layer for the object to be displayed in advance. Specifically, the display module corresponding to the 3D display device can be called to create the layer; this display module differs between operating systems. For example, the display module in the Android system is SurfaceFlinger, but it can also be the display module of another operating system, such as iOS. The Android system is used as an example below:
When the 3D display device creates a layer through the SurfaceFlinger display module, it first passes the data package name corresponding to the target object (the target object may be a game or a video, without specific limitation) to SurfaceFlinger, determines the target layer corresponding to the target object to be rendered, and judges whether the data package name of the target object exists in a preset package-name list (the list contains the data package names of multiple objects, including the target object, and every object in the list is one for which 3D display mode should be turned on). Taking a game as an example, the SurfaceFlinger display module determines from the preset package-name list whether the current game needs to turn on 3D display mode. If the game's data package name is stored in the preset package-name list, it is determined that the game needs to turn on 3D display mode, i.e., the final display effect of the game is a 3D display effect; the target layer corresponding to the game is then marked as 3D display mode, and the SurfaceFlinger display module is set to use OpenGL ES to perform window overlay processing on the target layer.
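To make the preset package-name gate concrete, here is a minimal C++ sketch of the check described above; the `Layer` struct, the `markLayerFor3D` helper, and the example package names are illustrative assumptions rather than SurfaceFlinger's real interfaces:

```cpp
#include <string>
#include <unordered_set>

// Hypothetical layer handle; SurfaceFlinger's real Layer class is more complex.
struct Layer {
    std::string packageName;  // data package name passed in at creation time
    bool is3DMode = false;    // mark consumed later by the render engine
};

// Preset package-name list: every object listed here should be shown in 3D mode.
static const std::unordered_set<std::string> kPreset3DPackages = {
    "com.example.game3d",     // illustrative entries only
    "com.example.video3d",
};

// Called when the display module creates the target layer for an object.
void markLayerFor3D(Layer& layer) {
    // If the object's package name is in the preset list, the final display
    // effect should be naked-eye 3D, so mark the layer as 3D display mode.
    if (kPreset3DPackages.count(layer.packageName) > 0) {
        layer.is3DMode = true;  // window overlay then goes through OpenGL ES
    }
}
```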
102. If the current layer is marked as 3D display mode, acquire the physical parameters of the 3D display device.
In this embodiment, when the 3D display device determines that the current layer is marked as 3D display mode, it can acquire the physical parameters of the 3D display device. The way these parameters are acquired is not limited here; for example, a prompt may be issued for the user to input them, or they may be obtained in other ways. The physical parameters refer to the fitting angle and the width of the grating corresponding to the 3D display device; they may also include other parameters related to naked-eye 3D display, such as the viewpoint width of the grating, without specific limitation.
103. Call the interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device, to obtain the 3D display image corresponding to the current layer.
In this embodiment, when the SurfaceFlinger display module uses OpenGL ES to merge layers, it creates a default OpenGL program P1 in the rendering engine (RenderEngine) to render each layer to the screen corresponding to the 3D display device. If the 3D display device determines that the current layer is marked as 3D display mode, it can call the interleaving program created by the rendering engine corresponding to the 3D display device to interleave the current layer according to the physical parameters of the 3D display device, obtaining the 3D display image corresponding to the current layer.
It should be noted that when a layer marked as 3D display mode is rendered for the first time, an additional interleaving program P2 is created in RenderEngine to interleave the layer according to the physical parameters of the 3D display device before rendering it to the screen. Understandably, the additional interleaving program only needs to be built on the first render, not on every render.
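The application describes the mechanics here (one extra program P2, built once on the first render of a 3D-marked layer) but not the interleaving shader itself. The following C++/OpenGL ES sketch illustrates that lazy creation under stated assumptions: the fragment-shader body and the uniform names for the grating's fitting angle and width are placeholders, not the actual interleaving algorithm.

```cpp
#include <GLES2/gl2.h>

// Compile one shader stage; returns 0 on failure.
static GLuint compileShader(GLenum type, const char* src) {
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    return ok ? shader : 0;
}

// Hypothetical fragment shader: routes sub-pixels to viewpoints based on the
// grating's fitting angle and width. The real interleaving math is not
// disclosed in the application; this body is a placeholder.
static const char* kInterleaveFrag = R"(
    precision mediump float;
    uniform sampler2D uLayerTex;   // display image of the current layer
    uniform float uGratingAngle;   // physical parameter: grating fitting angle
    uniform float uGratingWidth;   // physical parameter: grating width
    varying vec2 vTexCoord;
    void main() {
        // ... per-sub-pixel viewpoint selection would happen here ...
        gl_FragColor = texture2D(uLayerTex, vTexCoord);
    }
)";

static GLuint gProgramP2 = 0;  // extra interleaving program, built once

// Returns the interleaving program, creating it only on the first 3D render.
GLuint getInterleaveProgram(const char* vertexSrc) {
    if (gProgramP2 == 0) {
        GLuint vs = compileShader(GL_VERTEX_SHADER, vertexSrc);
        GLuint fs = compileShader(GL_FRAGMENT_SHADER, kInterleaveFrag);
        gProgramP2 = glCreateProgram();
        glAttachShader(gProgramP2, vs);
        glAttachShader(gProgramP2, fs);
        glLinkProgram(gProgramP2);
    }
    return gProgramP2;
}
```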
It should be noted that the execution order of step 102 is not limited here: it can be executed at the same time as step 101, before step 101, or after it, as long as the physical parameters of the 3D display device are obtained before the interleaving program is called to interleave the display image corresponding to the current layer.
104. Render and then display the 3D display image.
In this embodiment, after interleaving the current layer according to the physical parameters of the 3D display device through the additionally created interleaving program, the 3D display device can render the interleaved display image and display it on the screen of the 3D display device, achieving the 3D display effect of the target object.
In one embodiment, the 3D display device can also implement eye-tracking naked-eye 3D display, described in detail below:
determine the eye position data of the target user at the current moment and the gyroscope data corresponding to the 3D display device, the target user being the user currently viewing the screen of the 3D display device;
determine the final human eye angle data corresponding to the target user according to the eye position data of the target user and the gyroscope data;
adjust the rendered and displayed 3D display image according to the final human eye angle data.
In this embodiment, when the target user is viewing the naked-eye 3D image displayed on the screen of the 3D display device, the eye position data of the target user at the current moment and the gyroscope data corresponding to the 3D display device can be determined; the eye position data of the target user includes observation angle data and observation distance data.
When determining the eye position data of the target user at the current moment, the 3D display device can obtain a face image of the target user through a camera and determine the left pupil position and the right pupil position in the face image; from the deviation of the left and right pupils relative to the picture origin, the horizontal viewing angle and the vertical viewing angle are calculated, where A is viewing-angle calibration constant data; the observation angle is then calculated from the horizontal and vertical viewing angles, and the observation distance data is calculated from the observation angle and the target user's interpupillary distance (which can be computed from the left pupil position and the right pupil position), where B is viewing-distance calibration constant data used to represent the left-right interpupillary distance. Understandably, the deviation of the left and right pupils relative to the picture origin refers to the deviation of the midpoint between the left and right pupils from the picture origin.
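The published text does not reproduce the exact viewing-angle and viewing-distance formulas, so the sketch below assumes a simple linear calibration model: the horizontal and vertical viewing angles scale with the deviation of the pupil midpoint from the picture origin by the calibration constant A, and the observation distance is inversely proportional to the interpupillary distance via the calibration constant B. All names and constant values here are illustrative assumptions.

```cpp
#include <cmath>

struct EyeData {
    float yawDeg;    // horizontal component of the observation angle
    float pitchDeg;  // vertical component of the observation angle
    float distance;  // observation distance
};

// Assumed linear calibration model; the application's exact formulas are
// not reproduced in the published text.
constexpr float A = 0.05f;   // viewing-angle calibration constant (assumed deg/pixel)
constexpr float B = 1800.0f; // viewing-distance calibration constant (assumed)

EyeData estimateEyePosition(float lx, float ly, float rx, float ry,
                            float originX, float originY) {
    // Deviation of the midpoint between the two pupils from the picture origin.
    float midX = 0.5f * (lx + rx);
    float midY = 0.5f * (ly + ry);
    EyeData e;
    e.yawDeg   = A * (midX - originX);  // horizontal viewing angle
    e.pitchDeg = A * (midY - originY);  // vertical viewing angle
    // Interpupillary distance in pixels, from the two pupil positions.
    float ipdPx = std::hypot(rx - lx, ry - ly);
    // A farther viewer shows a smaller apparent interpupillary distance.
    e.distance = B / ipdPx;             // observation distance
    return e;
}
```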
When determining the gyroscope data corresponding to the 3D display device, the 3D display device can obtain the stored first pitch angle value V1 of the gyroscope (the 3D display device stores the gyroscope's pitch angle value whenever it changes; the first pitch angle value here refers to the most recently stored pitch angle value), listen for the refresh command corresponding to the 3D display device, and calculate the second pitch angle value V2 of the gyroscope at the time the screen of the 3D display device refreshes; finally, the pitch angle change value of the gyroscope (that is, the gyroscope data) is obtained from the first pitch angle value V1 and the second pitch angle value V2.
It should be noted that the pitch angle values can be expressed in a three-dimensional standard coordinate system, whose spatial coordinates intuitively reflect the angular relationship between the human eyes and the 3D display device. Taking advantage of the gyroscope's ability to output the attitude information of the 3D display device at high frequency over a short period, the pitch angle change of the gyroscope between two adjacent eye detections is calculated to output the pitch state (i.e., attitude information) of the 3D display device instantly, compensating for the blank time between camera frames.
After the eye position data and the gyroscope data of the target user are determined, the final human eye angle data corresponding to the target user can be determined from them; that is, the initial human eye position data and the pitch angle change value are superimposed to obtain the final human eye angle data, so that the 3D display device can adjust the 3D display screen through the stereoscopic game engine based on this data. Here, the angle data between the target user's eyes and the screen of the 3D display device is V0, which represents the angular relationship, in the three-dimensional standard coordinate system, between the current target user's eyes and the screen of the 3D display device; the final human eye angle data is Vx, calculated as Vx = V0 + Vp, where Vp is the gyroscope's pitch angle change value. Superimposing the gyroscope's pitch angle change value on the initial human eye position data finally yields accurate and reliable final human eye angle data.
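A compact sketch of the V1/V2 bookkeeping and the superposition Vx = V0 + Vp might look as follows; the pitch angle is reduced to a single scalar for brevity, whereas the application expresses it in a three-dimensional standard coordinate system:

```cpp
// Pitch reduced to one scalar for brevity.
struct GyroTracker {
    float storedPitchV1 = 0.0f;  // last stored pitch angle value V1

    // Called whenever the gyroscope reports a changed pitch angle.
    void onPitchChanged(float pitch) { storedPitchV1 = pitch; }

    // Called when the screen refresh command is observed: V2 is the pitch at
    // refresh time, and the returned value is the change Vp = V2 - V1.
    float pitchChangeOnRefresh(float pitchV2) const {
        return pitchV2 - storedPitchV1;
    }
};

// Final human eye angle data: Vx = V0 + Vp, where V0 is the angle between the
// target user's eyes and the screen and Vp is the gyroscope pitch change.
float finalEyeAngle(float v0, float vp) { return v0 + vp; }
```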
Finally, the rendered and displayed 3D display image is adjusted according to the final human eye angle data, so that the 3D display image can be observed whether the target user moves or the 3D display device moves. Specifically, the 3D display device rotates or moves the 3D display image according to the final human eye angle data Vx: the rotation angle of the 3D display image is opposite in direction to the change angle of the human eye position and has a linear relationship with it, the ratio being the distance from the human eyes to the screen to the actual depth of field of the scene in the 3D display image, multiplied by an adjustment parameter, to give the user a more realistic 3D virtual scene experience.
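As a hedged illustration of this adjustment rule, the helper below rotates the image opposite to the change in eye angle, scaled linearly by the ratio of the eye-to-screen distance to the scene's actual depth of field and multiplied by an adjustment parameter; the parameter value and the exact sign convention are assumptions, since the application gives no numbers.

```cpp
// Rotate the 3D display image opposite to the change in eye angle, scaled by
// (eye-to-screen distance / actual scene depth) times an adjustment parameter.
// k is an assumed tuning value; the application does not specify it.
float imageRotationDeg(float eyeAngleChangeDeg,
                       float eyeToScreenDist,
                       float sceneDepth,
                       float k = 1.0f) {
    float ratio = eyeToScreenDist / sceneDepth;  // linear proportionality
    return -eyeAngleChangeDeg * ratio * k;       // opposite direction
}
```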
Understandably, when a user plays a naked-eye 3D game on the 3D display device, touching the screen inevitably causes the device to shake slightly. In this application, the gyroscope 13 assists in capturing the position of the human eyes: even during shaking, the gyroscope can transmit the device's pitch angle change value in real time so that the human eyes are detected in real time, and the user can always view the 3D display image.
In summary, it can be seen that in the embodiments provided by this application, the display image corresponding to the current layer can be interleaved according to the physical parameters of the 3D display device through an additionally created interleaving program, and then rendered and displayed.
The embodiments of this application have been described above from the perspective of the 3D display method; they are described below from the perspective of the 3D display device.
Please refer to Figure 2, a schematic diagram of the virtual structure of the 3D display device provided by an embodiment of this application. The 3D display device 200 includes:
a judgment unit 201, configured to judge whether the current layer is marked as 3D display mode if the display image corresponding to the current layer is rendered to the screen of the 3D display device;
an acquisition unit 202, configured to acquire the physical parameters of the 3D display device if the current layer is marked as the 3D display mode;
an interleaving unit 203, configured to call an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device to obtain the 3D display image corresponding to the current layer, the interleaving program being created by the rendering engine corresponding to the 3D display device;
a display unit 204, configured to render and then display the 3D display image.
In one possible design, the interleaving unit 203 is further configured to:
call the display module corresponding to the 3D display device to create a layer;
determine the target layer corresponding to a target object, the target object being the object to be rendered;
judge whether the data package name of the target object exists in a preset package-name list;
if the data package name of the target object exists in the preset package-name list, mark the target layer as the 3D display mode.
In one possible design, the interleaving unit 203 is further configured to:
determine the eye position data of the target user at the current moment and the gyroscope data corresponding to the 3D display device, the target user being the user currently viewing the screen of the 3D display device;
determine the final human eye angle data corresponding to the target user according to the eye position data of the target user and the gyroscope data;
adjust the rendered and displayed 3D display image according to the final human eye angle data.
In one possible design, the eye position data of the target user includes observation angle data and observation distance data, and the interleaving unit 203 determining the eye position data of the target user at the current moment includes:
determining the left pupil position and the right pupil position of the target user at the current moment;
determining the horizontal viewing angle and the vertical viewing angle according to the offset of the target user's left and right pupils from the picture origin, the left pupil position, and the right pupil position;
determining the observation angle data according to the horizontal viewing angle and the vertical viewing angle;
calculating the observation distance data according to the observation angle data and the interpupillary distance of the target user.
In one possible design, the interleaving unit 203 determining the gyroscope data corresponding to the 3D display device includes:
obtaining the stored first pitch angle value of the gyroscope;
determining the second pitch angle value of the gyroscope when the refresh command corresponding to the 3D display device is detected;
determining the gyroscope data corresponding to the 3D display device according to the first pitch angle value and the second pitch angle value.
Next, another 3D display device provided by an embodiment of this application is introduced. Please refer to Figure 3, a schematic diagram of the hardware structure of the 3D display device provided by an embodiment of this application. The 3D display device 300 includes:
a receiver 301, a transmitter 302, a processor 303, and a memory 304 (the number of processors 303 in the 3D display device 300 may be one or more; one processor is taken as an example in Figure 3). In some embodiments of this application, the receiver 301, the transmitter 302, the processor 303, and the memory 304 may be connected by a bus or in other ways; connection by a bus is taken as an example in Figure 3.
The memory 304 may include read-only memory and random access memory, and provides instructions and data to the processor 303. A portion of the memory 304 may also include NVRAM. The memory 304 stores an operating system and operating instructions, executable modules or data structures, or a subset or an extended set thereof, where the operating instructions may include various operating instructions for implementing various operations. The operating system may include various system programs for implementing various basic services and handling hardware-based tasks.
The processor 303 controls the operation of the 3D display device; the processor 303 may also be called a CPU. In specific applications, the various components of the 3D display device are coupled together through a bus system, which, in addition to the data bus, may also include a power bus, a control bus, a status signal bus, and so on. For clarity, however, the various buses are all referred to as the bus system in the figure.
The 3D display method disclosed in the above embodiments of this application can be applied to the processor 303 or implemented by the processor 303. The processor 303 may be an integrated circuit chip with signal processing capability. During implementation, each step of the method shown in Figure 1 can be completed by an integrated logic circuit of hardware in the processor 303 or by instructions in the form of software. The processor 303 may be a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of this application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and so on. The steps of the methods disclosed in connection with the embodiments of this application can be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory or an electrically erasable programmable memory, or a register. The storage medium is located in the memory 304; the processor 303 reads the information in the memory 304 and completes the steps of the above method in combination with its hardware.
An embodiment of this application further provides a computer-readable medium containing computer-executable instructions that enable a server to execute the 3D display method described in the above embodiments; the implementation principles and technical effects are similar and are not repeated here.
It should also be noted that the device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the device embodiments provided by this application, the connection relationship between modules indicates that there are communication connections between them, which may be specifically implemented as one or more communication buses or signal lines.
From the description of the above implementations, those skilled in the art can clearly understand that this application can be implemented by software plus the necessary general-purpose hardware; of course, it can also be implemented by dedicated hardware, including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and so on. In general, any function completed by a computer program can easily be implemented with corresponding hardware, and the specific hardware structures used to implement the same function can also be diverse, such as analog circuits, digital circuits, or dedicated circuits. For this application, however, a software implementation is the better choice in most cases. Based on this understanding, the technical solution of this application, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a computer floppy disk, a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disc, and includes a number of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments of this application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means. The computer-readable storage medium may be any available medium that a computer can store, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive, Solid State Disk (SSD)), among others.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some or all of their technical features can be equivalently replaced, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of this application.

Claims (10)

  1. A 3D display method, characterized by comprising:
    if a display image corresponding to a current layer is rendered to a screen of a 3D display device, judging whether the current layer is marked as 3D display mode;
    if the current layer is marked as the 3D display mode, acquiring physical parameters of the 3D display device;
    calling an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device, to obtain a 3D display image corresponding to the current layer, the interleaving program being created by a rendering engine corresponding to the 3D display device;
    rendering and then displaying the 3D display image.
  2. The method according to claim 1, characterized in that the method further comprises:
    calling a display module corresponding to the 3D display device to create a layer;
    determining a target layer corresponding to a target object, the target object being an object to be rendered;
    judging whether a data package name of the target object exists in a preset package-name list;
    if the data package name of the target object exists in the preset package-name list, marking the target layer as the 3D display mode.
  3. The method according to claim 1 or 2, characterized in that the method further comprises:
    determining eye position data of a target user at a current moment and gyroscope data corresponding to the 3D display device, the target user being a user currently viewing the screen of the 3D display device;
    determining final human eye angle data corresponding to the target user according to the eye position data of the target user and the gyroscope data;
    adjusting the rendered and displayed 3D display image according to the final human eye angle data.
  4. The method according to claim 3, characterized in that the eye position data of the target user includes observation angle data and observation distance data, and the determining the eye position data of the target user at the current moment comprises:
    determining a left pupil position and a right pupil position of the target user at the current moment;
    determining a horizontal viewing angle and a vertical viewing angle according to the offset of the target user's left and right pupils from a picture origin, the left pupil position, and the right pupil position;
    determining the observation angle data according to the horizontal viewing angle and the vertical viewing angle;
    calculating the observation distance data according to the observation angle data and an interpupillary distance of the target user.
  5. The method according to claim 3, characterized in that the determining the gyroscope data corresponding to the 3D display device comprises:
    obtaining a stored first pitch angle value of the gyroscope;
    determining a second pitch angle value of the gyroscope when a refresh command corresponding to the 3D display device is detected;
    determining the gyroscope data corresponding to the 3D display device according to the first pitch angle value and the second pitch angle value.
  6. A 3D display device, characterized by comprising:
    a judgment unit, configured to judge whether a current layer is marked as 3D display mode if a display image corresponding to the current layer is rendered to a screen of the 3D display device;
    an acquisition unit, configured to acquire physical parameters of the 3D display device if the current layer is marked as the 3D display mode;
    an interleaving unit, configured to call an interleaving program to interleave the display image corresponding to the current layer according to the physical parameters of the 3D display device, to obtain a 3D display image corresponding to the current layer, the interleaving program being created by a rendering engine corresponding to the 3D display device;
    a display unit, configured to render and then display the 3D display image.
  7. The 3D display device according to claim 6, characterized in that the interleaving unit is further configured to:
    call a display module corresponding to the 3D display device to create a layer;
    determine a target layer corresponding to a target object, the target object being an object to be rendered;
    judge whether a data package name of the target object exists in a preset package-name list;
    if the data package name of the target object exists in the preset package-name list, mark the target layer as the 3D display mode.
  8. The 3D display device according to claim 6 or 7, characterized in that the interleaving unit is further configured to:
    determine eye position data of a target user at a current moment and gyroscope data corresponding to the 3D display device, the target user being a user currently viewing the screen of the 3D display device;
    determine final human eye angle data corresponding to the target user according to the eye position data of the target user and the gyroscope data;
    adjust the rendered and displayed 3D display image according to the final human eye angle data.
  9. The 3D display device according to claim 8, characterized in that the eye position data of the target user includes observation angle data and observation distance data, and the interleaving unit determining the eye position data of the target user at the current moment includes:
    determining a left pupil position and a right pupil position of the target user at the current moment;
    determining a horizontal viewing angle and a vertical viewing angle according to the offset of the target user's left and right pupils from the picture origin, the left pupil position, and the right pupil position;
    determining the observation angle data according to the horizontal viewing angle and the vertical viewing angle;
    calculating the observation distance data according to the observation angle data and the interpupillary distance of the target user.
  10. The 3D display device according to claim 9, characterized in that the interleaving unit determining the gyroscope data corresponding to the 3D display device includes:
    obtaining a stored first pitch angle value of the gyroscope;
    determining a second pitch angle value of the gyroscope when a refresh command corresponding to the 3D display device is detected;
    determining the gyroscope data corresponding to the 3D display device according to the first pitch angle value and the second pitch angle value.
     
PCT/CN2022/143110 2022-02-11 2022-12-29 3D display method and 3D display device WO2023216619A1 (zh)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202210129155 2022-02-11
CN202210494259.0 2022-05-07
CN202210494259.0A CN114928739A (zh) 2022-02-11 2022-05-07 3D display method, apparatus, and storage medium

Publications (1)

Publication Number Publication Date
WO2023216619A1 (zh)

Family

ID=82808173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/143110 WO2023216619A1 (zh) 2022-02-11 2022-12-29 3D display method and 3D display device

Country Status (2)

Country Link
CN (1) CN114928739A (zh)
WO (1) WO2023216619A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114928739A (zh) * 2022-02-11 2022-08-19 广东未来科技有限公司 3d显示方法、装置及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2384010A2 (en) * 2010-04-30 2011-11-02 Lg Electronics Inc. Method for controlling operations of image display apparatus and shutter glasses used for the image display apparatus
CN107341004A (zh) * 2017-06-16 2017-11-10 深圳康得新智能显示科技有限公司 Display method and apparatus for a boot interface, and terminal
CN108600733A (zh) * 2018-05-04 2018-09-28 成都泰和万钟科技有限公司 Naked-eye 3D display method based on human eye tracking
CN109922326A (zh) * 2019-03-29 2019-06-21 深圳市新致维科技有限公司 Resolution determination method, apparatus, medium, and device for naked-eye 3D video images
CN112714302A (zh) * 2019-10-25 2021-04-27 苏州苏大维格科技集团股份有限公司 Method and apparatus for producing naked-eye 3D images
CN113347410A (зh) * 2021-06-01 2021-09-03 广东未来科技有限公司 3D display method and apparatus using gyroscope-assisted human eye tracking
CN113411574A (зh) * 2021-06-17 2021-09-17 纵深视觉科技(南京)有限责任公司 Evaluation method, apparatus, medium, and system for naked-eye 3D display effect
CN114928739A (зh) * 2022-02-11 2022-08-19 广东未来科技有限公司 3D display method, apparatus, and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103248905A (zh) * 2013-03-22 2013-08-14 深圳市云立方信息科技有限公司 一种模仿全息3d场景的显示装置和视觉显示方法
CN113079364A (zh) * 2021-03-24 2021-07-06 纵深视觉科技(南京)有限责任公司 一种静态对象的立体显示方法、装置、介质及电子设备
CN113379897A (zh) * 2021-06-15 2021-09-10 广东未来科技有限公司 应用于3d游戏渲染引擎的自适应虚拟视图转立体视图的方法及装置
CN113426113B (zh) * 2021-07-05 2024-06-25 未来科技(襄阳)有限公司 3d游戏启动器及2d游戏的3d启动方法
CN113920280A (zh) * 2021-09-28 2022-01-11 广东未来科技有限公司 一种2d游戏的裸眼3d显示的方法和系统


Also Published As

Publication number Publication date
CN114928739A (zh) 2022-08-19

Similar Documents

Publication Publication Date Title
US10078367B2 (en) Stabilization plane determination based on gaze location
US9832451B2 (en) Methods for reduced-bandwidth wireless 3D video transmission
JP6353214B2 (ja) Image generation device and image generation method
US9020203B2 (en) System and method for managing spatiotemporal uncertainty
CN104536579B (зh) Interactive three-dimensional real-scene and digital image high-speed fusion processing system and processing method
US10726625B2 (en) Method and system for improving the transmission and processing of data regarding a multi-user virtual environment
CN109743626B (зh) Image display method, image processing method, and related device
CN107844190B (зh) Image display method and apparatus based on a virtual reality VR device
WO2022089046A1 (зh) Virtual reality display method and apparatus, and storage medium
JP7134060B2 (ja) Image generation device and image generation method
CN108174178A (зh) Image display method, apparatus, and virtual reality device
WO2023216619A1 (зh) 3D display method and 3D display device
WO2017113729A1 (зh) 360-degree image loading method, loading module, and mobile terminal
JP6963399B2 (ja) Program, recording medium, image generation device, and image generation method
JP2018500690A (ja) Method and system for generating a magnified 3D image
CN114513646B (зh) Method and device for generating panoramic video in a three-dimensional virtual scene
WO2019073925A1 (ja) Image generation device and image generation method
Wei et al. Color anaglyphs for panorama visualizations
WO2023162504A1 (ja) Information processing device, information processing method, and program
GB2575824A (en) Generating display data
JP7467748B1 (ja) Display control device, display system, and program
US11818324B2 (en) Virtual reality environment
JP7365183B2 (ja) Image generation device, head-mounted display, content processing system, and image display method
WO2021146978A1 (зh) Display system, graphics processing unit GPU, display controller, and display method
WO2023068087A1 (ja) Head-mounted display, information processing device, and information processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22941551

Country of ref document: EP

Kind code of ref document: A1