WO2019105323A1 - Display module, head-mounted display device, and image stereoscopic display method and apparatus - Google Patents

Display module, head-mounted display device, and image stereoscopic display method and apparatus

Info

Publication number
WO2019105323A1
WO2019105323A1 (PCT/CN2018/117429)
Authority
WO
WIPO (PCT)
Prior art keywords
liquid crystal
image
display
crystal lens
displayed
Prior art date
Application number
PCT/CN2018/117429
Other languages
English (en)
French (fr)
Inventor
余志雄
林明田
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Publication of WO2019105323A1 publication Critical patent/WO2019105323A1/zh
Priority to US16/667,242 priority Critical patent/US11064187B2/en

Classifications

    • H04N 13/344: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N 13/30: Image reproducers
    • H04N 13/305: Image reproducers for viewing without the aid of special glasses (autostereoscopic displays) using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N 13/398: Synchronisation or control of image reproducers
    • H04N 13/189: Recording image signals; reproducing recorded image signals
    • H04N 13/268: Image signal generators with monoscopic-to-stereoscopic image conversion based on depth image-based rendering [DIBR]
    • H04N 13/271: Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • G02B 30/27: Optical systems or apparatus for producing 3D effects of the autostereoscopic type involving lenticular arrays
    • G02B 30/28: Optical systems or apparatus for producing 3D effects of the autostereoscopic type involving active lenticular arrays
    • G02B 27/0172: Head-mounted head-up displays characterised by optical features
    • G02B 27/0176: Head-mounted head-up displays characterised by mechanical features
    • G02B 2027/0134: Head-up displays comprising binocular systems of stereoscopic type
    • G02B 3/14: Fluid-filled or evacuated lenses of variable focal length
    • G02F 1/133526: Structural association of liquid crystal cells with lenses, e.g. microlenses or Fresnel lenses
    • G02F 1/294: Light-deflection devices of variable focal length

Definitions

  • the present application relates to the field of image stereoscopic display technologies, and in particular, to a display module, a head mounted display device, and an image stereoscopic display method and apparatus.
  • The existing head-mounted display device usually adopts a lens-plus-screen structure based on the magnifying-glass principle: the screen is placed within one focal length of the lens, with the lens between the screen and the user, so that an enlarged virtual image of the screen is presented at a certain distance from the user for viewing.
  • The lenses corresponding to the user's left and right eyes may be spaced apart, and the user's stereoscopic experience in the virtual scene comes from the relative positional change of corresponding points in the left-eye and right-eye images, which changes the convergence angle of the two eyes' lines of sight and thereby produces a sense of near and far.
  • However, the distance to which the human eye focuses is only the virtual image distance between the virtual image of the screen and the user, and this distance usually does not change as the displayed picture changes. The distance the user's brain perceives from the picture and the distance fed back to the brain by the two eyes' lenses are therefore inconsistent, and the user's self-protective response is dizziness and discomfort, which limits how long the product can be used.
  • The embodiments of the present application provide a display module, a head-mounted display device, and an image stereoscopic display method and apparatus, which can realize same-line-of-sight superposition of virtual objects and real objects, thereby improving the user's viewing experience.
  • the embodiment of the present application provides a display module, including a display screen, a liquid crystal lens layer, and a control module;
  • the display screen includes a plurality of pixel groups, each pixel group including at least one pixel;
  • the liquid crystal lens layer includes a plurality of liquid crystal lens units; each pixel group is correspondingly provided with a liquid crystal lens unit, and each liquid crystal lens unit includes liquid crystal molecules;
  • the control module is electrically connected to the display screen and the liquid crystal lens layer, and is configured to acquire virtual display depth information of the content to be displayed by each pixel group on the display screen; after the content is displayed in each pixel group, the control module controls, according to the virtual display depth information, the electric field of the liquid crystal molecules in the liquid crystal lens unit corresponding to each pixel group, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through that liquid crystal lens unit, the first virtual image of the content displayed by each pixel group.
  • the embodiment of the present application further provides a head mounted display device, including the foregoing display module.
  • the embodiment of the present application further provides an image stereoscopic display method, including:
  • acquiring virtual display depth information of the content to be displayed in each pixel region of an image; and, after the content is displayed by the pixel group corresponding to each pixel region, controlling, according to the virtual display depth information, the electric field of the liquid crystal molecules in the liquid crystal lens unit located in the light-emitting direction of the display screen and corresponding to that pixel group, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust the first virtual image of the content displayed by each pixel group.
  • the embodiment of the present application further provides an image stereoscopic display device, comprising: a processor and a memory, wherein the memory stores a computer program, and the computer program is loaded by the processor and performs the following steps:
  • acquiring virtual display depth information of the content to be displayed in each pixel region of an image; and, after the content is displayed by the pixel group corresponding to each pixel region, controlling, according to the virtual display depth information, the electric field of the liquid crystal molecules in the liquid crystal lens unit located in the light-emitting direction of the display screen and corresponding to that pixel group, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust the first virtual image of the content displayed by each pixel group.
  • the embodiment of the present application further provides a non-volatile storage medium in which a plurality of instructions are stored, the instructions being adapted to be loaded by a processor and performing the following steps:
  • acquiring virtual display depth information of the content to be displayed in each pixel region of an image; and, after the content is displayed by the pixel group corresponding to each pixel region, controlling, according to the virtual display depth information, the electric field of the liquid crystal molecules in the liquid crystal lens unit located in the light-emitting direction of the display screen and corresponding to that pixel group, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust the first virtual image of the content displayed by each pixel group.
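The steps above amount to a per-frame control loop: read the virtual display depth recorded for each pixel group, convert it to the liquid-crystal-lens focal length that places the first virtual image at the wanted distance, and drive the corresponding lens unit. A minimal Python sketch follows; all names, and the thin-lens model used, are illustrative assumptions rather than the patent's specification:

```python
def focal_length_for_depth(d_image_mm, d_object_mm):
    """Thin-lens focal length that places the virtual image of a pixel
    group at d_image_mm behind the lens, given the fixed screen-to-lens
    object distance d_object_mm (magnifier case: 1/f = 1/do - 1/di)."""
    return d_object_mm * d_image_mm / (d_image_mm - d_object_mm)

def update_frame(pixel_groups, d_object_mm, drive):
    """Per-frame loop: map each pixel group's virtual display depth to a
    focal length and hand it to a (hypothetical) lens-unit driver."""
    return [drive(group_id, focal_length_for_depth(depth_mm, d_object_mm))
            for group_id, depth_mm in pixel_groups]
```

For example, with the screen 1 mm from the lens layer, desired first-virtual-image distances of 2 mm and 3 mm for two pixel groups require per-unit focal lengths of 2 mm and 1.5 mm respectively.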
  • FIG. 1 is a schematic diagram of electrical connections of a display module according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram showing the structure and optical imaging of a display screen and a liquid crystal lens layer in a display module according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of optical imaging of a liquid crystal lens layer according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of optical imaging of a liquid crystal lens unit of a liquid crystal lens layer according to an embodiment of the present application.
  • FIG. 5 is a schematic structural and optical imaging diagram of a display module according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram showing the structure and imaging of a display module according to an embodiment of the present application.
  • FIG. 7 is a schematic structural and optical imaging diagram of a head mounted display device according to an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of an image stereoscopic display method according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an image stereoscopic display device according to an embodiment of the present application.
  • When viewing real objects, the convergence angle of the two eyes' lines of sight and the refractive power of the eye's lens change in real time with the distance and angle between the eyes and the object being observed, and the line-of-sight direction and the lens's refractive power stay consistent with each other; that is, the light-field information of the observed object is what the human eye ultimately captures.
  • the display module, the head mounted display device, and the image stereoscopic display method and apparatus provided by the embodiments of the present application can enable a user to realize a brain-eye coordinated viewing experience in a virtual scene. It is also possible to achieve the same line-of-sight superposition of virtual objects and real objects in an AR (Augmented Reality)/MR (Mixed Reality) scene to enhance the user's viewing experience.
  • FIG. 1 is a schematic diagram of electrical connections of a display module according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram showing the structure and optical imaging of a display screen and a liquid crystal lens layer in a display module according to an embodiment of the present application.
  • the display module includes a display screen 1, a liquid crystal lens layer 2, and a control module 3.
  • the display screen 1 includes a plurality of pixel groups 10, each of which contains at least one pixel.
  • a pixel is the smallest unit of illumination of a display screen, and each of the display units that can be individually controlled is referred to as a pixel.
  • the liquid crystal lens layer 2 may include a plurality of liquid crystal lens units 20, and each of the pixel groups 10 is provided with a liquid crystal lens unit 20 correspondingly.
  • the liquid crystal lens unit 20 includes liquid crystal molecules.
  • The control module 3 is electrically connected to the display screen 1 and the liquid crystal lens layer 2, and is configured to acquire virtual display depth information of the content to be displayed by each pixel group 10 on the display screen 1. After the content is displayed in each pixel group 10, the electric field of the liquid crystal molecules in the liquid crystal lens unit 20 corresponding to each pixel group 10 is controlled according to the virtual display depth information to change the refractive index of the corresponding liquid crystal lens unit 20, which in turn adjusts the first virtual image of the content displayed by each pixel group.
  • the virtual display depth information is distance information of each object in the content to be displayed. That is, the control module 3 can control the electric field of the liquid crystal molecules in the liquid crystal lens unit 20 corresponding to the pixel group 10 according to the distance information of each object included in the content displayed by the pixel group 10 to change the refractive index of the liquid crystal lens unit 20. And adjusting the virtual image distance of the virtual image corresponding to the content displayed by the pixel group 10.
  • FIG. 3 is a schematic diagram of optical imaging of a liquid crystal lens layer according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of optical imaging of a liquid crystal lens unit of a liquid crystal lens layer according to an embodiment of the present application.
  • In the liquid crystal lens layer 2, when the control module 3 applies different electric fields to each group of liquid crystal lens units 20 through the electrodes 21, the liquid crystal molecules within each group rotate to different degrees and form different refractive indices, so that each group of liquid crystal lens units 20 acts as a minute lens with a different converging capability.
  • A conventional lens deflects light through the curvature of its surfaces at a uniform refractive index; the liquid crystal lens layer 2 of the embodiments of the present application instead deflects light by giving different regions of the layer different refractive indices at a uniform thickness.
  • When an object is placed within one focal length of a lens (that is, the object plane lies inside the focal length), an enlarged virtual image of the object is formed on the same side as the object, and the distance of this virtual image from the viewer is the virtual image distance.
  • When the pixel group 10 of the display screen 1 is located within one focal length of the liquid crystal lens unit 20, a viewer on the light-exit side of the display screen 1 sees the content displayed by the pixel group 10 as a first virtual image 101 formed on the same side of the liquid crystal lens unit 20.
  • By changing the electric field applied to the liquid crystal lens unit 20, the control module 3 can rotate the liquid crystal molecules within it.
  • The control module 3 applies different electric fields to different liquid crystal lens units 20 so that the rotation angles of their liquid crystal molecules differ; the liquid crystal lens units 20 thus refract the light from their corresponding pixel groups 10 to different degrees, and the first virtual images 101 formed from the content displayed by those pixel groups have different virtual image distances.
  • The contents displayed by the plurality of pixel groups 10 are thus imaged by the liquid crystal lens units 20 at different virtual image distances. As shown in FIG. 2, with the distance from the display screen to the viewer constant (not shown), the first virtual images 101 formed by two pixel groups 10 lie at distances d1 and d2 from the display screen, and the viewer looking through the liquid crystal lens layer 2 sees the content displayed on the display screen 1 with a stereoscopic effect.
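Under the thin-lens magnifier model, a pixel group at object distance do inside the focal length f produces a virtual image at di = f·do/(f − do), so two lens units driven to different refractive indices (and hence different focal lengths) place their virtual images at different distances d1 and d2. A small illustrative calculation; the numbers are examples, not values from the patent:

```python
def virtual_image_distance(f_mm, d_object_mm):
    """Distance of the virtual image behind the lens for an object inside
    one focal length (magnifier case): di = f*do / (f - do)."""
    if d_object_mm >= f_mm:
        raise ValueError("object must lie within one focal length")
    return f_mm * d_object_mm / (f_mm - d_object_mm)

# Same screen-to-lens distance (1 mm), two different lens-unit focal lengths:
d1 = virtual_image_distance(2.0, 1.0)  # 2.0 mm
d2 = virtual_image_distance(1.5, 1.0)  # 3.0 mm
```

The shorter focal length yields the more distant virtual image, which is why per-unit refractive-index control translates directly into per-pixel-group depth.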
  • Moreover, the control module 3 can control the refractive-index change of each liquid crystal lens unit 20 to change the virtual image distance of the virtual image of the content displayed by the corresponding pixel group 10. The virtual image distance can therefore change in real time with the displayed content, and the viewer watches the corresponding stereoscopic effect in real time, avoiding dizziness and discomfort and improving the user experience.
  • FIG. 5 is a schematic diagram showing the structure and optical imaging of a display module according to an embodiment of the present application.
  • the display module may further include a viewing lens 4, the liquid crystal lens layer 2 is located between the display screen 1 and the viewing lens 4, and the viewing lens 4 is located between the user and the liquid crystal lens layer 2.
  • the viewing lens 4 is a convex lens.
  • The content displayed on the display screen 1 is located within one focal length of the liquid crystal lens layer 2, and the first virtual image 101 it forms is in turn located within one focal length of the viewing lens 4, so that the viewing lens 4 magnifies the first virtual image 101 into an enlarged second virtual image 102.
  • The first virtual image 101 of the content displayed by the pixel group 10 is located at a distance d from the display screen 1, and this distance can be very small, on the order of millimeters. Because the first virtual image 101 serves as the object for secondary imaging by the viewing lens 4, a slight change in its position, and hence in its object distance D2 to the viewing lens, changes the distance D between the second virtual image 102 and the viewing lens 4 very significantly: a 1 mm change in D2 can change D by more than 10 m.
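The amplification by secondary imaging can be checked with the same thin-lens relation applied to the viewing lens: an object at distance D2 slightly inside the viewing lens's focal length f2 is imaged to a virtual image at D = f2·D2/(f2 − D2), which grows without bound as D2 approaches f2. The focal length below (50 mm) is an assumed illustrative value, not one from the patent:

```python
def second_virtual_image_distance(f2_mm, D2_mm):
    """Secondary imaging by the viewing lens: a virtual-image 'object' at
    D2 < f2 is re-imaged to a virtual image at f2*D2 / (f2 - D2)."""
    return f2_mm * D2_mm / (f2_mm - D2_mm)

F2 = 50.0                                      # assumed viewing-lens focal length (mm)
D_a = second_virtual_image_distance(F2, 48.9)  # about 2.2 m
D_b = second_virtual_image_distance(F2, 49.9)  # about 25 m: 1 mm shift in D2
```

With these numbers, moving the first virtual image by 1 mm (48.9 mm to 49.9 mm) moves the second virtual image by more than 20 m, consistent with the "more than 10 m" claim in the text.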
  • The display screen 1 and the liquid crystal lens layer 2 can be closely arranged to facilitate assembling and connecting the two, to make the distance between each pixel group 10 and its liquid crystal lens unit 20 equal (which eases control of the imaging), and to leave no gap between the display screen 1 and the liquid crystal lens layer 2 through which dust and the like could enter.
  • The display screen 1 is disposed in close contact with the liquid crystal lens layer 2 so that the distance between the two is minimized; even when the refractive index of the liquid crystal lens unit 20 is changed to give its minimum focal length, the pixel group 10 still remains within one focal length of the liquid crystal lens unit 20.
  • The display area of the display screen 1 may be less than or equal to the area of the liquid crystal lens layer 2, with the edge of the liquid crystal lens layer 2 extending beyond the display area, to ensure that every pixel group 10 on the display screen 1 has a correspondingly disposed liquid crystal lens unit 20. Making the liquid crystal lens layer 2 larger than the display area of the display screen 1 also eases alignment between the liquid crystal lens units 20 and the pixel groups 10: for example, when the correspondence between a controlled liquid crystal lens unit 20 and its pixel group 10 becomes misaligned, all the liquid crystal lens units 20 can be controlled to shift left-right or up-down so that every pixel group 10 again corresponds to a liquid crystal lens unit 20.
  • the plurality of liquid crystal molecules in the liquid crystal lens unit 20 may be arranged in a matrix or arranged in a plurality of rings which are sequentially nested.
  • each of the liquid crystal lens units 20 may be provided with two electrodes 21 for controlling an electric field, which are arranged in the left-right direction of the viewer.
  • the two electrodes 21 are a positive electrode and a negative electrode, respectively.
  • the two electrodes 21 are arranged along the left and right direction of the viewer to form an optimal visual effect with the left and right eyes of the viewer.
  • the liquid crystal lens layer may also contain a plurality of pixels.
  • Each liquid crystal lens unit 20 includes at least one pixel.
  • Electrodes are applied to the pixels included in the liquid crystal lens unit 20 to control the switching and gray scale of those pixels. Each point that the human eye sees on a liquid crystal screen, that is, one pixel, is composed of three sub-pixels of red, green, and blue (RGB). The light source behind each sub-pixel can show different brightness levels, and the gray scale represents those levels from darkest to brightest; the more levels there are, the more delicate the picture. Red, green, and blue combined at different brightness levels form dots of different colors.
  • The control module 3 controls the electric field of the liquid crystal molecules in the liquid crystal lens unit 20 corresponding to a pixel group 10 according to the distance information of each object in the content displayed by that pixel group, changing the refractive index of the liquid crystal lens unit 20 and in turn adjusting the virtual image distance of the virtual image of the displayed content. To support this, when the virtual scene is built or generated, for example when designing a virtual reality game or shooting a virtual reality movie with the Unity engine, the distance between each object to be displayed by a pixel group 10 of the display screen 1 and the camera in the virtual reality game or movie scene is recorded.
  • During display, the image stereoscopic display device, for example the head-mounted display device, acquires the recorded distance between the object and the camera and presents the virtual image of the corresponding object according to that distance.
  • In a virtual reality game, the distance between an object and the camera can be obtained by calculating the distance between the virtual camera and the virtual object in the game; when shooting a virtual reality movie, it can be obtained with a camera that can perceive distance.
  • For example, the distances of a first object and a second object from the viewer, that is, their distances from the virtual camera in the virtual reality game scene, are respectively a first distance and a second distance. The first distance and the second distance are recorded and stored as distance information along with the objects' other information (e.g., color, brightness, texture).
  • the distance information is virtual display depth information.
  • the distance information can be stored as information of pixels in a virtual reality picture.
  • The control module 3 obtains the virtual display depth information of the first object and the second object from the data of the virtual reality game and, according to it, applies corresponding electric fields to the two liquid crystal lens units 20 corresponding to the two pixel groups, controlling their refractive indices so that the images of the first object and the second object are virtualized at a preset ratio according to their respective virtual display depth information.
  • The correspondence between the refractive index of the liquid crystal lens unit and the drive voltage may be preset in the control module 3, and the refractive index of each liquid crystal lens unit 20 is then controlled according to this preset correspondence.
  • The virtual image distances of the two second virtual images 102 are equal to the first distance and the second distance respectively, so that the positions of the two second virtual images 102 seen by the viewer are the same as the positions of the two objects in the virtual reality game scene. The viewer's experience when watching is thus consistent with freely viewing different objects in a real scene, where different objects may lie at different distances.
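A preset correspondence between drive voltage and refractive index can be stored as a small calibration table and inverted by interpolation when the control module needs a particular refractive index. The table values below are hypothetical placeholders; real values depend on the liquid crystal material, cell thickness, and electrode geometry:

```python
from bisect import bisect_left

# Hypothetical calibration: (drive voltage in V, resulting refractive index).
VOLTAGE_INDEX_TABLE = [(0.0, 1.50), (1.0, 1.55), (2.0, 1.62), (3.0, 1.70)]

def voltage_for_index(n_target):
    """Linearly interpolate the calibration table to find the drive
    voltage that produces the target refractive index."""
    ns = [n for _, n in VOLTAGE_INDEX_TABLE]
    if not ns[0] <= n_target <= ns[-1]:
        raise ValueError("target refractive index outside calibrated range")
    i = bisect_left(ns, n_target)
    if ns[i] == n_target:
        return VOLTAGE_INDEX_TABLE[i][0]
    (v0, n0), (v1, n1) = VOLTAGE_INDEX_TABLE[i - 1], VOLTAGE_INDEX_TABLE[i]
    return v0 + (v1 - v0) * (n_target - n0) / (n1 - n0)
```

For instance, a target index halfway between two table entries yields the halfway voltage; in practice the table would come from per-device calibration rather than being hard-coded.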
  • FIG. 6 is a schematic diagram showing the structure and imaging of a display module according to an embodiment of the present application.
  • The second virtual image 102, which is the virtual image the viewer can actually watch, is formed by the viewing lens 4 secondarily imaging the first virtual image produced by the liquid crystal lens layer.
  • Since both the response speed and the excitation voltage of the liquid crystal lens layer 2 are related to its thickness, a lens array layer 2a may be further superposed on the liquid crystal lens layer 2, as shown in FIG. 6, to improve the response speed of the liquid crystal lens layer and save liquid crystal lens material.
  • The lens array layer 2a includes a plurality of convex lenses 20a arranged in an array; each convex lens 20a is provided with at least one liquid crystal lens unit 20 and re-images the first virtual image 101 of that liquid crystal lens unit to form a transition virtual image 101a.
  • The transition virtual image 101a is located within one focal length of the viewing lens 4, so that it is imaged again to form the virtual image ultimately visible to the viewer.
  • in this arrangement, only a very thin liquid crystal lens layer 2 is needed to form the first virtual images 101 at distances d1 and d2 from the display screen; changing the electric field of each liquid crystal lens unit then slightly shifts the virtual image position of the content displayed by each pixel group.
  • the lens array layer 2a re-images the first virtual images 101 into transition virtual images 101a at distances d3 and d4 from the display screen, thereby magnifying the distances d1 and d2.
  • finally, the viewing lens 4 images the transition virtual images 101a to form the virtual image ultimately seen by the viewer; after three imagings, the virtual image changes significantly compared with a single imaging, so the content displayed by each pixel group appears at a different distance, that is, a stereoscopic image is seen.
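The distance magnification through successive virtual imagings can be sketched numerically with the thin-lens magnifier relation v = u*f / (f - u), which holds when the object sits inside one focal length of a lens. The focal lengths and spacings below are hypothetical illustration values in millimetres, not parameters from the patent.

```python
# Minimal sketch of chaining virtual images through successive lenses:
# a thin LC layer forms a first virtual image close to the screen (d1),
# and a second lens stage re-images it farther away (d3 > d1).

def virtual_image_distance(u: float, f: float) -> float:
    """Virtual image distance for an object at u inside focal length f."""
    assert 0 < u < f, "magnifier regime requires the object inside one focal length"
    return u * f / (f - u)

# Stage 1: the LC lens unit images the pixel group just behind the screen.
d1 = virtual_image_distance(u=0.9, f=1.0)        # -> 9.0 mm from the screen
# Stage 2: a microlens 2 mm away re-images the first virtual image.
d3 = virtual_image_distance(u=d1 + 2.0, f=12.0)  # -> 132.0 mm
print(round(d1, 1), round(d3, 1))
```

The sketch treats each stage as an ideal thin lens and ignores the layer thicknesses; its point is only that a millimetre-scale first image distance is stretched into a much larger transition image distance by the second stage.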
  • the lens array layer is a microlens array layer, and each of the convex lenses may correspond to one liquid crystal lens unit 20 to improve imaging accuracy and sharpness.
  • the display module provided by the embodiments of this application can be applied to various types of displays or head-mounted display devices, and to AR (Augmented Reality) or MR (Mixed Reality) scenarios, where virtual objects share the same viewing distance as real objects, enhancing the viewer's experience.
  • FIG. 7 is a schematic structural diagram and optical imaging diagram of a head mounted display device according to an embodiment of the present application.
  • the head mounted display device may include the display module of each of the foregoing embodiments, and specifically, may include a display screen 1, a liquid crystal lens layer 2, and a viewing lens 4.
  • the head mounted display device can also include a control module 3 (not shown).
  • the control module 3 controls the voltages on the liquid crystal lens units of the liquid crystal lens layer 2 according to the virtual display depth information of the content displayed by each pixel group of the display screen 1, so as to change the refractive indices of the corresponding liquid crystal lens units; the corresponding liquid crystal lens units present a first virtual image 101 of the content displayed by each pixel group, and the first virtual image 101 forms a second virtual image 102 through the viewing lens 4.
  • there may be two display modules, corresponding to the viewer's left and right eyes.
  • for the components in FIG. 7, refer to the foregoing description.
  • FIG. 8 is a diagram showing an image stereoscopic display method according to an embodiment of the present application. As shown in FIG. 8, the method mainly includes the following steps:
  • Step S101 Acquire an image to be displayed to be displayed on a display screen, where the image to be displayed includes a plurality of pixel regions.
  • when the display module obtains an image to be displayed on the display screen, it acquires the information of the pixels included in the image from the to-be-displayed image data and, according to that pixel information, divides the image into a plurality of pixel regions corresponding to the plurality of pixel groups of the display screen, so that each pixel region can be imaged separately at a different virtual image distance from the viewer to present a stereoscopic effect.
  • the pixels in the image to be displayed and the pixels on the display screen may not correspond one to one; the correspondence between them can be set according to the required precision.
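The division into pixel regions can be sketched as a simple tiling of the image, with each tile mapped to one pixel group. The region size and the rectangular tiling scheme are assumptions for illustration; the patent does not prescribe a particular layout.

```python
# Illustrative sketch: divide a to-be-displayed image into pixel regions
# that map one-to-one onto the display's pixel groups.

def partition_into_regions(width: int, height: int, region_w: int, region_h: int):
    """Return (x, y, w, h) tiles covering the image; edge tiles may be smaller."""
    regions = []
    for y in range(0, height, region_h):
        for x in range(0, width, region_w):
            regions.append((x, y, min(region_w, width - x), min(region_h, height - y)))
    return regions

regions = partition_into_regions(width=1920, height=1080, region_w=480, region_h=270)
print(len(regions))  # 4 x 4 = 16 pixel regions, one per pixel group
```

A finer tiling gives more independent depth planes at the cost of more liquid crystal lens units to drive, which is the precision trade-off the passage above alludes to.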
  • Step S102 Obtain virtual display depth information corresponding to each pixel region in each pixel region corresponding to the plurality of pixel groups on the display screen in the image to be displayed.
  • the virtual display depth information is distance information between a virtual image of the pixel area and a position of the viewer, that is, distance information simulating a distance between the object and the viewer in a real environment.
  • the virtual display depth information is used to determine the position of the virtual image of the pixel area, that is, the virtual image distance corresponding to the pixel area.
  • the virtual display depth information may be recorded in the data of the image to be displayed when the virtual scene is generated or built, so that when the virtual scene is displayed by the display module, the image to be displayed and its virtual display depth information are simultaneously acquired.
  • when the virtual scene is built or generated, the distance information of each object needs to be recorded to form the corresponding virtual display depth information; this distance information can be measured at the time of shooting.
  • for example, a distance measuring device is arranged in the photographing device, and during the shooting of the virtual scene it measures the distance from the objects in each captured frame to the photographing device, thereby recording the virtual display depth information corresponding to each object.
  • the image to be displayed includes a left eye display image and a right eye display image; the left eye display image includes a plurality of left eye pixel regions, and the right eye display image includes a plurality of right eye pixel regions.
  • the virtual display depth information is computed by analyzing the corresponding left eye pixel regions and right eye pixel regions.
  • the left eye display image and the right eye display image are viewed by the viewer's left and right eyes, respectively, producing a stereoscopic picture.
  • the virtual display depth information at a given position may be calculated from the difference between the left eye pixel region and the right eye pixel region that correspond to that position in the two images, for example by binocular vision triangulation.
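The binocular triangulation mentioned above can be sketched in a few lines: with a baseline b between the two eye viewpoints and a focal length f expressed in pixels, the depth of a point is z = f * b / disparity, where the disparity is the horizontal offset between the corresponding left-eye and right-eye pixels. The numbers below are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of depth recovery from left/right pixel disparity
# (classical binocular triangulation for rectified image pairs).

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres for a given pixel disparity; zero disparity -> infinity."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

# A feature shifted 8 px between the eye images, 0.064 m interocular baseline:
print(depth_from_disparity(focal_px=1000.0, baseline_m=0.064, disparity_px=8.0))  # 8.0 m
```

Nearer objects produce larger disparities, so the computed depth falls as the disparity grows, matching the intuition that the two eyes' lines of sight converge more strongly on close objects.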
  • when the image to be displayed is a virtual scene created by a computer, the position of the content displayed by each pixel group is marked when the scene is created, so that the distance between the content displayed by each pixel group and the viewer's position, that is, the virtual display depth information of each pixel region, can be marked.
  • Step S103: after the image to be displayed is displayed on the display screen, control, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through it, the first virtual image of the content displayed by each pixel group.
  • an image to be displayed is displayed on the display screen 1.
  • the display screen 1 includes a plurality of pixel groups 10, and the plurality of pixel groups 10 are in one-to-one correspondence with the plurality of pixel regions.
  • the liquid crystal lens unit 20 is disposed in the light outgoing direction of each of the plurality of pixel groups 10 .
  • Each pixel group 10 corresponds to one liquid crystal lens unit 20. All of the liquid crystal lens units 20 can form the liquid crystal lens layer 2 and are disposed at the light exit surface of the display screen 1.
  • the liquid crystal lens unit 20 can image the pixel area displayed by the corresponding pixel group 10.
  • the pixel area is the content displayed by the corresponding pixel group 10.
  • according to the virtual display depth information corresponding to each pixel group 10 of the plurality of pixel groups, the display module adjusts the electric field of the liquid crystal molecules in the liquid crystal lens unit 20 so as to change the refractive index of the liquid crystal lens unit 20, so that each of the adjusted pixel regions is imaged at a preset position at a preset ratio.
  • the preset position is, for example, at the distance d shown in FIG. 5; as noted above, the distance d can be very small, within 10 mm.
  • a viewing lens 4 is disposed on a side of the liquid crystal lens layer 2 away from the display screen 1.
  • the liquid crystal lens layer 2 and the display screen 1 may be located within one focal length of the viewing lens 4, and a virtual image of the content displayed on the display screen is generated by the viewing lens 4 for the viewer.
  • when no lens array layer 2a is arranged on the liquid crystal lens layer 2, the viewing lens 4 can image the first virtual image 101 located at the preset position onto a second preset position to form a second virtual image 102, the second preset position being farther from the display screen 1 than the preset position.
  • with the liquid crystal lens layer 2 and the display screen 1 both placed within one focal length of the viewing lens 4, near that focal length, the first virtual image formed by one imaging of the liquid crystal lens layer 2 can lie near one focal length of the viewing lens 4, so the position of the secondarily imaged second virtual image changes significantly; when the viewer views the flat display screen 1 through the viewing lens 4, the picture of each pixel group appears at a different distance, that is, a stereoscopic image is seen.
  • the second preset position is the same as, or approximately the same as, the position marked by the corresponding virtual display depth information, so the user's viewing experience is consistent with freely viewing different scenes in a real environment; this achieves a viewing experience similar to a real scene and enables a brain-eye coordinated viewing experience in the virtual scene.
  • when a lens array layer 2a is arranged on the liquid crystal lens layer 2, the lens array layer 2a may perform a second imaging of the first virtual image 101 formed by the liquid crystal lens layer 2 to form a transition virtual image 101a, and the transition virtual image 101a undergoes a third imaging through the viewing lens 4 to form the virtual image finally seen by the viewer.
  • after three imagings, the virtual image changes significantly, so that when the user looks at the flat screen through the lens, the content displayed by each pixel group appears at a different distance, that is, a stereoscopic image is seen.
  • the liquid crystal lens layer 2 can make the pixel region displayed by each pixel group 10 undergo a first imaging; the refractive-index changes of the liquid crystal lens layer 2 shift each pixel region's first virtual image by a minute amount within 10 mm, and the viewing lens 4 then performs a final imaging of the first virtual image to form the virtual image seen by the viewer.
  • the virtual image distance of the image formed by the viewing lens 4 magnifies these minute changes, so the final virtual image distance is approximately the same as the actual shooting distance, achieving a viewing experience similar to that in a real scene.
  • the content displayed by the plurality of pixel groups is imaged by the liquid crystal lens units at different virtual image distances, and the viewer sees the content on the display screen as a stereoscopic effect through the liquid crystal lens layer.
  • when the content displayed by a pixel group changes, the control module can change the refractive index of the corresponding liquid crystal lens unit and thereby the virtual image distance of the displayed content, so that the virtual image distance changes in real time with the displayed content and the viewer sees the corresponding stereoscopic effect in real time, avoiding dizziness and discomfort and improving the user experience.
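The real-time per-frame loop of steps S101 to S103 can be sketched as follows. The `depth_to_voltage` mapping is a hypothetical stand-in for the preset refractive-index/voltage correspondence held by the control module; the numbers are illustration values only.

```python
# End-to-end sketch for one frame: each pixel region's virtual display
# depth is turned into a drive setting for its liquid crystal lens unit.

def depth_to_voltage(depth_m: float) -> float:
    # Hypothetical monotone mapping: nearer content -> stronger drive,
    # clamped to an assumed 0-5 V drive range.
    return max(0.0, min(5.0, 5.0 / depth_m))

def frame_update(region_depths):
    """region_depths: {region_id: virtual display depth in metres}."""
    return {rid: depth_to_voltage(d) for rid, d in region_depths.items()}

voltages = frame_update({0: 1.0, 1: 2.5, 2: 10.0})
print(voltages)  # {0: 5.0, 1: 2.0, 2: 0.5}
```

Because the mapping runs per region and per frame, the virtual image distance tracks the displayed content in real time, which is the mechanism the passage credits with avoiding dizziness.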
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
  • FIG. 9 is a schematic structural diagram of an image stereoscopic display device according to an embodiment of the present application.
  • the image stereoscopic display device 900 may include a processor 901 (for example, a CPU), a network interface 904, a user interface 903, a memory 905, and a communication bus 902.
  • the communication bus 902 is used to implement connection communication between these components.
  • the memory 905 may be a high speed RAM memory or a non-volatile memory such as at least one disk memory.
  • an operating system, a network communication module, a user interface module, and an image stereoscopic display program may be included in the memory 905 as a computer storage medium.
  • the processor 901 can be used to load an image stereoscopic display program stored in the memory 905, and specifically perform the following operations:
  • control the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through it, the first virtual image of the content displayed by each pixel group.
  • the image to be displayed includes a left eye display image and a right eye display image; the left eye display image includes a plurality of left eye pixel regions, the right eye display image includes a plurality of right eye pixel regions, and the virtual display depth information is computed by analyzing the corresponding left eye pixel regions and right eye pixel regions.
  • the virtual display depth information is measured when the image to be displayed is captured; or the virtual display depth information is marked when the virtual scene is created.
  • a viewing lens is disposed on a side of the liquid crystal lens layer away from the display screen, and a second virtual image for viewing by a viewer is generated by the viewing lens based on the first virtual image.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Nonlinear Science (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Liquid Crystal (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A display module, a head-mounted display device, and an image stereoscopic display method and apparatus (900). The display module includes a display screen (1), a liquid crystal lens layer (2), and a control module (3). The display screen (1) includes a plurality of pixel groups (10); the liquid crystal lens layer (2) includes a plurality of liquid crystal lens units (20), and each pixel group (10) is provided with a corresponding liquid crystal lens unit (20). The control module (3) is electrically connected to the display screen (1) and the liquid crystal lens layer (2), and is configured to acquire virtual display depth information of the content to be displayed by each pixel group (10) on the display screen (1); after each pixel group (10) displays the content, the control module controls, according to the virtual display depth information, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit (20) corresponding to each pixel group (10), so as to change the refractive index of the corresponding liquid crystal lens unit (20) and thereby adjust, through the corresponding liquid crystal lens unit (20), a first virtual image (101) of the content displayed by each pixel group (10).

Description

Display module, head-mounted display device, and image stereoscopic display method and apparatus
This application claims priority to Chinese Patent Application No. 201711217289.2, entitled "Display Module, Head-Mounted Display Device, and Image Stereoscopic Display Method," filed with the China National Intellectual Property Administration on November 28, 2017, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of image stereoscopic display technologies, and in particular to a display module, a head-mounted display device, and an image stereoscopic display method and apparatus.
Background
Existing head-mounted display devices typically use a lens-plus-screen structure. Based on the magnifier principle, the screen is placed within one focal length of the lens, with the lens between the screen and the user, so that a magnified virtual image of the on-screen picture is presented at a certain distance from the user. The lenses for the user's left and right eyes may be spaced apart, and the user's stereoscopic experience in a virtual scene comes from relative position changes of corresponding points in the left-eye and right-eye pictures, which change the convergence angle between the two eyes' lines of sight and thereby produce depth perception. In reality, however, the distance of the picture seen by the eyes is only the virtual image distance between the virtual image of the on-screen picture and the user, and this distance usually does not change along with changes in the on-screen picture. As a result, the distance perceived by the user's brain is inconsistent with the distance fed back to the brain by the crystalline lenses of the two eyes. After using a head-mounted display device for a long time, the user develops a self-protective reaction, namely dizziness and discomfort, which limits how long the product can be used.
Summary
Embodiments of this application provide a display module, a head-mounted display device, and an image stereoscopic display method and apparatus, which can superimpose virtual objects and real objects at the same viewing distance and improve the user's viewing experience.
An embodiment of this application provides a display module, including a display screen, a liquid crystal lens layer, and a control module.
The display screen includes a plurality of pixel groups, and each pixel group includes at least one pixel.
The liquid crystal lens layer includes a plurality of liquid crystal lens units; each pixel group is provided with a corresponding liquid crystal lens unit, and the liquid crystal lens units include liquid crystal molecules.
The control module is electrically connected to the display screen and the liquid crystal lens layer, and is configured to acquire virtual display depth information of the content to be displayed by each pixel group on the display screen; after each pixel group displays the content, the control module controls, according to the virtual display depth information, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit corresponding to each pixel group, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
An embodiment of this application further provides a head-mounted display device, including the foregoing display module.
An embodiment of this application further provides an image stereoscopic display method, including:
acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image including a plurality of pixel regions;
acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
An embodiment of this application further provides an image stereoscopic display apparatus, including a processor and a memory, the memory storing a computer program that is loaded by the processor to perform the following steps:
acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image including a plurality of pixel regions;
acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
In addition, an embodiment of this application further provides a non-volatile storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the following steps:
acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image including a plurality of pixel regions;
acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of the electrical connections of a display module according to an embodiment of this application;
FIG. 2 is a schematic diagram of the structure and optical imaging of the display screen and the liquid crystal lens layer in a display module according to an embodiment of this application;
FIG. 3 is a schematic diagram of optical imaging of a liquid crystal lens layer according to an embodiment of this application;
FIG. 4 is a schematic diagram of optical imaging of a liquid crystal lens unit of a liquid crystal lens layer according to an embodiment of this application;
FIG. 5 is a schematic diagram of the structure and optical imaging of a display module according to an embodiment of this application;
FIG. 6 is a schematic diagram of the structure and imaging of a display module according to an embodiment of this application;
FIG. 7 is a schematic diagram of the structure and optical imaging of a head-mounted display device according to an embodiment of this application;
FIG. 8 is a schematic flowchart of an image stereoscopic display method according to an embodiment of this application;
FIG. 9 is a schematic structural diagram of an image stereoscopic display apparatus according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings in the embodiments of this application.
When a person views objects in a real scene, the convergence angle between the two eyes' lines of sight and the degree of accommodation of the crystalline lenses change in real time with the distance and angle between the eyes and the different observed objects, and the convergence direction is consistent with the accommodation; that is, what the eyes ultimately capture is the light field information of the observed objects. To achieve a viewing experience similar to that in a real scene, the display module, head-mounted display device, and image stereoscopic display method and apparatus proposed in the embodiments of this application enable the user to achieve a brain-eye coordinated viewing experience in a virtual scene as well, and can superimpose virtual objects and real objects at the same viewing distance in AR (Augmented Reality) / MR (Mixed Reality) scenarios, improving the user's viewing experience.
FIG. 1 is a schematic diagram of the electrical connections of a display module according to an embodiment of this application. FIG. 2 is a schematic diagram of the structure and optical imaging of the display screen and the liquid crystal lens layer in the display module. As shown in FIG. 1, the display module includes a display screen 1, a liquid crystal lens layer 2, and a control module 3. As shown in FIG. 2, the display screen 1 includes a plurality of pixel groups 10, and each pixel group 10 includes at least one pixel. A pixel is the smallest light-emitting unit of the display screen; each individually controllable light-emitting unit in the display screen is called a pixel. The liquid crystal lens layer 2 may include a plurality of liquid crystal lens units 20, and each pixel group 10 is provided with a corresponding liquid crystal lens unit 20. Each liquid crystal lens unit 20 includes liquid crystal molecules. As shown in FIG. 1, the control module 3 is electrically connected to the display screen 1 and the liquid crystal lens layer 2 and is configured to acquire virtual display depth information of the content to be displayed by each pixel group 10 on the display screen 1; after each pixel group 10 displays the content, the control module controls, according to the virtual display depth information, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit 20 corresponding to each pixel group 10, so as to change the refractive index of the corresponding liquid crystal lens unit 20 and thereby adjust, through the corresponding liquid crystal lens unit, the first virtual image of the content displayed by each pixel group. The virtual display depth information is the distance information of each object in the content to be displayed. That is, the control module 3 can control, according to the distance information of each object included in the content displayed by a pixel group 10, the electric field applied to the liquid crystal molecules in the corresponding liquid crystal lens unit 20 so as to change the refractive index of the liquid crystal lens unit 20, and thereby adjust the virtual image distance of the virtual image of the content displayed by that pixel group 10.
FIG. 3 is a schematic diagram of optical imaging of the liquid crystal lens layer, and FIG. 4 is a schematic diagram of optical imaging of a liquid crystal lens unit of the liquid crystal lens layer. Referring to FIG. 3 and FIG. 4, in the liquid crystal lens (LC lens) layer 2, when the control module 3 applies different electric fields to each group of liquid crystal lens units 20 through the electrodes 21, the liquid crystal molecules in each liquid crystal lens unit 20 rotate to different degrees and form different refractive indices, so that each liquid crystal lens unit 20 forms a tiny lens with a different converging power. The electric field produced by the tiny electrodes of the control module 3 deflects the liquid crystal molecules and produces a focusing effect, so that the liquid crystal lens unit 20 forms a magnified virtual image of the content displayed by its corresponding pixel group 10. A conventional lens deflects light by the curvature of its surface at a uniform refractive index, whereas the liquid crystal lens layer 2 of this embodiment can deflect light, at a uniform thickness, by having different refractive indices in different regions of the layer.
According to the magnifier principle, when the plane on which an object is located, i.e., the object plane, lies within one focal length of a lens, a magnified virtual image of the object is formed on the same side as the object, and the distance from the virtual image to the viewer is the virtual image distance. As shown in FIG. 2, when a pixel group 10 of the display screen 1 is within one focal length of the liquid crystal lens unit 20, a viewer on the light-exit side of the display screen 1 can see the first virtual image 101 of the content displayed by the pixel group 10, formed on the same side as the liquid crystal lens unit 20.
By changing the electric field of a liquid crystal lens unit 20, the control module 3 can rotate the liquid crystal molecules inside it. By applying different electric fields to different liquid crystal lens units 20, the control module 3 makes the rotation angles of the liquid crystal molecules differ among the units, so that the plurality of liquid crystal lens units 20 refract the light of their corresponding pixel groups 10 differently, and the plurality of first virtual images 101 of the content displayed by the corresponding pixel groups 10 have different virtual image distances.
The content displayed by the plurality of pixel groups 10 is imaged by the liquid crystal lens units 20 at different virtual image distances. As shown in FIG. 2, the distance between the display screen and the viewer is fixed (not shown), and the first virtual images 101 of the content displayed by two pixel groups 10 are at distances d1 and d2 from the display screen, respectively. In this way, the viewer sees the content displayed on the display screen 1 as a stereoscopic effect through the liquid crystal lens layer 2. When the content displayed by each pixel group 10 changes, the control module 3 can change the refractive index of the corresponding liquid crystal lens unit 20 and thereby the virtual image distance of the virtual image of that content, so that the virtual image distance changes in real time with the displayed content, and the viewer sees the corresponding stereoscopic effect in real time, avoiding dizziness and discomfort and improving the user experience.
FIG. 5 is a schematic diagram of the structure and optical imaging of the display module. Further, as shown in FIG. 5, the display module may also include a viewing lens 4; the liquid crystal lens layer 2 is located between the display screen 1 and the viewing lens 4, and the viewing lens 4 is located between the user and the liquid crystal lens layer 2. In this embodiment, the viewing lens 4 is a convex lens. The content displayed by the display screen 1 lies within one focal length of the liquid crystal lens layer 2, and the first virtual image 101 of that content formed by the liquid crystal lens layer 2 lies within one focal length of the viewing lens 4, so that the viewing lens 4 can magnify the first virtual image 101 into a magnified second virtual image 102.
Because of the slight refraction of the liquid crystal lens layer 2, the first virtual image 101 formed by one imaging of the content displayed by a pixel group 10 is located at a distance d from the display screen 1. The distance d can be very small, on the order of millimetres. By arranging the viewing lens 4 and placing both the liquid crystal lens layer 2 and the display screen 1 within one focal length of the viewing lens 4, near that focal length, the first virtual image 101 formed by one imaging of the liquid crystal lens layer 2 can also lie within one focal length of the viewing lens 4, near the focal length; that is, the distance D2 between the first virtual image 101 and the viewing lens 4 is close to the focal length of the viewing lens 4. Therefore, any minute change in the first virtual image 101 formed by one imaging causes a very large change in the second virtual image 102 formed after secondary imaging through the viewing lens 4: the distance D from the second virtual image 102 to the viewing lens 4 changes very significantly with a minute change in D2. For example, when the focal length of the viewing lens 4 is around 30 mm, a 1 mm change in D2 can change the distance D from the secondary virtual image to the viewing lens by more than 10 m. The viewing lens 4 thus makes the position of the secondarily imaged second virtual image change significantly, so that when the viewer views the flat display screen 1 through the viewing lens 4, the content displayed by each pixel group appears at a different distance; that is, a stereoscopic image is seen.
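The amplification near the focal length can be checked numerically with the thin-lens magnifier relation v = u*f / (f - u): when the first virtual image sits almost exactly one focal length from the viewing lens, a sub-millimetre shift in its distance u moves the second virtual image by metres. The specific distances below are illustration values chosen to match the 30 mm example.

```python
# Numerical check of the depth amplification by the viewing lens.

def virtual_image_distance(u_mm: float, f_mm: float) -> float:
    """Virtual image distance (mm) for an object at u_mm inside focal length f_mm."""
    return u_mm * f_mm / (f_mm - u_mm)

f = 30.0                                    # viewing-lens focal length (mm)
D_a = virtual_image_distance(29.0, f)       # first image 29.00 mm from the lens
D_b = virtual_image_distance(29.95, f)      # ... shifted by just 0.95 mm
print(round(D_a), round(D_b))               # 870 mm vs ~17970 mm
```

A 0.95 mm shift in the first virtual image moves the second virtual image from roughly 0.87 m to roughly 18 m, consistent with the claim that a change of about 1 mm in D2 can change D by more than 10 m.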
The display screen 1 and the liquid crystal lens layer 2 may be attached to each other to facilitate their assembly and connection, so that the distances between each pixel group 10 and its liquid crystal lens unit 20 are all equal, which helps control the imaging; at the same time, there is no gap between the display screen 1 and the liquid crystal lens layer 2, keeping out dust and other debris. Attaching the display screen 1 and the liquid crystal lens layer 2 minimizes the distance between them, so that even when the refractive index of a liquid crystal lens unit 20 is changed to give its minimum focal length, the pixel group 10 still lies within one focal length of the liquid crystal lens unit 20.
The area of the display region of the display screen 1 may be smaller than or equal to the area of the liquid crystal lens layer 2, with the edges of the liquid crystal lens layer 2 extending beyond the display region of the display screen 1; this ensures that every pixel group 10 on the display screen 1 has a corresponding liquid crystal lens unit 20. Making the area of the liquid crystal lens layer 2 larger than the display region of the display screen 1 also facilitates alignment between the liquid crystal lens units 20 and the pixel groups 10. For example, when the correspondence between the controlled liquid crystal lens units 20 and their pixel groups 10 becomes misaligned, the user can shift the liquid crystal lens units 20, moving all of them left/right or up/down, while still ensuring that every pixel group 10 has a corresponding liquid crystal lens unit 20.
When a voltage is applied through the electrodes, the liquid crystal molecules in a liquid crystal lens unit 20 may be arranged in a matrix or in successively nested rings.
As shown in FIG. 4, each liquid crystal lens unit 20 may be provided with two electrodes 21 for controlling the electric field, arranged along the viewer's left-right direction. The two electrodes 21 are a positive electrode and a negative electrode, respectively. Changing the voltage applied to the two electrodes 21 changes the electric field in which the liquid crystal molecules of the unit sit, and thus changes the refractive index of the liquid crystal lens unit 20. Arranging the two electrodes 21 along the viewer's left-right direction produces the best visual effect for the viewer's left and right eyes.
According to an embodiment of this application, the liquid crystal lens layer may also contain many pixels, and each liquid crystal lens unit 20 contains at least one pixel. When the electrodes apply a voltage to a liquid crystal lens unit 20, the voltage is applied to the pixels contained in the unit to control the switching and gray scale of those pixels. Each point of a liquid crystal screen visible to the naked eye, i.e., one pixel, is composed of red, green, and blue (RGB) sub-pixels. The light source behind each sub-pixel can show different brightness levels; the gray scale represents the levels of brightness from darkest to brightest, and the more levels there are, the more delicate the picture that can be presented. Red, green, and blue at different brightness levels combine to form points of different colors.
According to an embodiment of this application, since the control module 3 controls, according to the distance information of each object included in the content displayed by a pixel group 10, the electric field applied to the liquid crystal molecules in the corresponding liquid crystal lens unit 20 to change its refractive index and thereby adjust the virtual image distance of the virtual image of the content displayed by the pixel group 10, the distance between the camera and each object to be displayed by each pixel group 10 of the display screen 1 can be recorded when the virtual scene is built or generated, for example when a virtual reality game is designed with the Unity virtual engine or when a virtual reality movie is shot. Later, when the user experiences the virtual reality game or movie, the image stereoscopic display apparatus, e.g., a head-mounted display device, acquires the recorded distance between the object and the camera and presents the virtual image of the corresponding object at that distance.
According to an embodiment of this application, in a virtual reality game the distance between an object and the camera can be obtained by computing the distance between the virtual camera and the virtual object in the game. In the case of a virtual reality movie, the distance between an object and the camera can be obtained by a depth-sensing camera while the movie is being shot.
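In a virtual scene, the recorded "shooting distance" reduces to the Euclidean distance between the virtual camera and the object. A minimal sketch, with hypothetical coordinates in metres:

```python
# Distance between a virtual camera and a virtual object, as a game engine
# would record it for each frame; the coordinates are illustration values.
import math

def camera_object_distance(cam: tuple, obj: tuple) -> float:
    """Euclidean distance between two 3-D points (camera and object)."""
    return math.dist(cam, obj)

print(camera_object_distance((0.0, 1.6, 0.0), (3.0, 1.6, 4.0)))  # 5.0
```

This per-object distance is exactly the quantity stored as virtual display depth information alongside the object's color, brightness, and texture.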
For example, suppose a virtual reality game is designed whose scene contains a first object and a second object; the distances from the first and second objects to the viewer (i.e., to the virtual camera in the game scene) are a first distance and a second distance, respectively. The first and second distances are recorded and stored as distance information together with the objects' other information (e.g., color, brightness, texture). This distance information is the virtual display depth information. In the case of a virtual reality movie, the distance information may be stored as information of the pixels of the virtual reality picture.
During virtual imaging, for example when the viewer experiences the virtual reality game through a head-mounted display device and two pixel groups 10 of the display screen 1 present the images of the first object and the second object, respectively, the control module 3 obtains the virtual display depth information of the two objects from the game data and, according to that information, applies corresponding electric fields to the two liquid crystal lens units 20 corresponding to the two pixel groups to control their refractive indices, so that the images of the first object and the second object are virtually imaged at a preset ratio according to their respective virtual display depth information, forming two first virtual images 101; the viewing lens 4 then performs a further virtual imaging of the two first virtual images 101, forming two second virtual images 102. According to an embodiment of this application, the correspondence between the refractive index and the voltage of the liquid crystal lens units may be preset in the control module 3, and during virtual imaging the refractive indices of the liquid crystal lens units 20 are controlled according to this preset correspondence. The virtual image distances of the two second virtual images 102 are equal to the first distance and the second distance, respectively, so that the positions of the two second virtual images 102 seen by the viewer are the same as the positions of the two objects in the virtual reality game scene. In this way, the viewer's experience is consistent with freely viewing different scenes in a real environment; that is, different objects may appear at different distances.
FIG. 6 is a schematic diagram of the structure and imaging of the display module. In the foregoing embodiments, the first virtual image formed by the liquid crystal lens layer is imaged a second time by the viewing lens 4 to form the second virtual image 102, which is the virtual image the viewer can see. According to an embodiment of this application, since both the response speed and the excitation voltage of the liquid crystal lens layer 2 are related to its thickness, a lens array layer 2a may be superposed on the liquid crystal lens layer 2, as shown in FIG. 6, in order to improve the response speed of the layer and save liquid crystal lens material. The lens array layer 2a includes a plurality of convex lenses 20a arranged in an array, and each convex lens 20a is provided with at least one corresponding liquid crystal lens unit 20, so as to re-image the first virtual image 101 of the at least one liquid crystal lens unit 20 into a transition virtual image 101a. The lens array layer 2a, or the transition virtual image 101a it forms, lies within one focal length of the viewing lens 4, so that it can be imaged again into the virtual image finally seen by the viewer. In this way, only a very thin liquid crystal lens layer 2 is needed to form the first virtual images 101 at distances d1 and d2 from the display screen; the electric field changes of the liquid crystal lens units then slightly shift the virtual image positions of the content displayed by each pixel group, and the lens array layer 2a re-images the first virtual images 101 into transition virtual images 101a at distances d3 and d4 from the display screen, magnifying the distances d1 and d2. Finally, the viewing lens 4 performs the last imaging of the transition virtual images 101a to form the virtual image ultimately seen by the viewer. After three imagings, the virtual image changes significantly compared with a single imaging, so that when the user looks at the flat screen through the viewing lens 4, the content displayed by each pixel group appears at a different distance; that is, a stereoscopic image is seen.
Further, in an embodiment of this application, the lens array layer is a microlens array layer, in which each convex lens may correspond to one liquid crystal lens unit 20, improving imaging accuracy and sharpness.
The display module provided by the embodiments of this application can be applied to various types of displays or head-mounted display devices, and is applicable to AR (Augmented Reality) or MR (Mixed Reality) scenarios, where the viewer's virtual objects share the same viewing distance as real objects, improving the experience.
FIG. 7 is a schematic diagram of the structure and optical imaging of a head-mounted display device according to an embodiment of this application. As shown in FIG. 7, the head-mounted display device may include the display module of each of the foregoing embodiments; specifically, it may include a display screen 1, a liquid crystal lens layer 2, and a viewing lens 4. The head-mounted display device may also include a control module 3 (not shown). The control module 3 controls the voltages on the liquid crystal lens units of the liquid crystal lens layer 2 according to the virtual display depth information of the content displayed by each pixel group of the display screen 1, so as to change the refractive indices of the corresponding liquid crystal lens units; the corresponding liquid crystal lens units present the first virtual image 101 of the content displayed by each pixel group on the display screen, and the first virtual image 101 forms a second virtual image 102 through the viewing lens 4. There may be two display modules, corresponding to the viewer's left and right eyes. For the components in FIG. 7, refer to the foregoing description.
An embodiment of this application further provides an image stereoscopic display method, applicable to the foregoing display module. FIG. 8 shows the image stereoscopic display method provided by an embodiment of this application; as shown in FIG. 8, the method mainly includes the following steps.
Step S101: Acquire a to-be-displayed image to be displayed on a display screen, where the to-be-displayed image includes a plurality of pixel regions.
According to an embodiment of this application, for example, when the display module obtains an image to be displayed on the display screen, it acquires the information of the pixels included in the to-be-displayed image from the image data and, according to that pixel information, divides the to-be-displayed image into a plurality of pixel regions corresponding to the plurality of pixel groups of the display screen, so that each pixel region can be imaged separately at a different virtual image distance from the viewer to present a stereoscopic effect. The pixels of the to-be-displayed image and the pixels on the display screen may not correspond one to one; the correspondence between them can be set according to the required precision.
Step S102: Acquire virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to the plurality of pixel groups on the display screen.
According to an embodiment of this application, the virtual display depth information is the distance information between the virtual image of a pixel region and the viewer's position, i.e., distance information simulating the distance between an object and the viewer in a real environment. The virtual display depth information is used to determine the position of the virtual image of the pixel region, i.e., the virtual image distance corresponding to the pixel region. The virtual display depth information may be recorded in the data of the to-be-displayed image when the virtual scene is generated or built, so that when the virtual scene is displayed by the display module, the to-be-displayed image and its virtual display depth information are acquired together.
According to an embodiment of this application, when the virtual scene is built or generated, the distance information of the objects needs to be recorded to form the corresponding virtual display depth information. The distance information of an object can be measured at the time of shooting; for example, a distance measuring device is arranged in the photographing device, and during the shooting of the virtual scene it measures the distance from the objects in each captured frame to the photographing device, thereby recording the virtual display depth information corresponding to each object.
According to an embodiment of this application, the to-be-displayed image includes a left-eye display image and a right-eye display image; the left-eye display image includes a plurality of left-eye pixel regions, the right-eye display image includes a plurality of right-eye pixel regions, and the virtual display depth information is computed by analyzing corresponding left-eye and right-eye pixel regions. Normally the left-eye and right-eye display images are viewed by the viewer's left and right eyes, respectively, producing a stereoscopic picture. The virtual display depth information at a given position can be computed from the difference between the corresponding left-eye and right-eye pixel regions at that position in the two images, for example by binocular vision triangulation.
According to an embodiment of this application, when the to-be-displayed image of the display screen is a computer-created virtual scene, the position of the content displayed by each pixel group is marked when the virtual scene is created, so that the distance between the content displayed by each pixel group and the viewer's position, i.e., the virtual display depth information of each pixel region, can be marked.
Step S103: After the to-be-displayed image is displayed on the display screen, control, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, the first virtual image of the content displayed by each pixel group.
According to an embodiment of this application, the to-be-displayed image is displayed on the display screen 1, which includes a plurality of pixel groups 10 in one-to-one correspondence with the plurality of pixel regions. A liquid crystal lens unit 20 is arranged in the light-exit direction of each of the plurality of pixel groups 10; each pixel group 10 corresponds to one liquid crystal lens unit 20. All the liquid crystal lens units 20 can form the liquid crystal lens layer 2, arranged at the light-exit surface of the display screen 1. Each liquid crystal lens unit 20 can image the pixel region displayed by its corresponding pixel group 10; the pixel region is the content displayed by the corresponding pixel group 10.
According to the virtual display depth information corresponding to each pixel group 10 of the plurality of pixel groups, the display module adjusts the electric field of the liquid crystal molecules in the liquid crystal lens unit 20 so as to change the refractive index of the liquid crystal lens unit 20, so that each of the adjusted pixel regions is imaged at a preset position at a preset ratio. The preset position is, for example, at the distance d shown in FIG. 5; as before, the distance d can be very small, within 10 mm.
A viewing lens 4 is arranged on the side of the liquid crystal lens layer 2 away from the display screen 1; the liquid crystal lens layer 2 and the display screen 1 may be located within one focal length of the viewing lens 4, and a virtual image of the content displayed on the display screen 1 is generated by the viewing lens 4 for the viewer.
For example, in the first implementation of the display module described above, e.g., as shown in FIG. 5, no lens array layer 2a is arranged on the liquid crystal lens layer 2, and the viewing lens 4 can image the first virtual image 101 located at the preset position onto a second preset position to form a second virtual image 102, the second preset position being farther from the display screen 1 than the preset position.
With the viewing lens 4, both the liquid crystal lens layer 2 and the display screen 1 are placed within one focal length of the viewing lens 4, near that focal length; the first virtual image formed by one imaging of the liquid crystal lens layer 2 can then lie near one focal length of the viewing lens 4, so that the position of the secondarily imaged second virtual image changes significantly. Thus, when the viewer views the flat display screen 1 through the viewing lens 4, the picture corresponding to each pixel group appears at a different distance; that is, a stereoscopic image is seen.
The second preset position is the same as, or approximately the same as, the position marked by the corresponding virtual display depth information, so when viewing, the user has an experience consistent with freely viewing different scenes in a real environment; this achieves a viewing experience similar to a real scene and enables the user to achieve a brain-eye coordinated viewing experience in the virtual scene.
In the second implementation of the display module described above, e.g., as shown in FIG. 6, a lens array layer 2a is arranged on the liquid crystal lens layer 2; the lens array layer 2a can perform a second imaging of the first virtual image 101 formed by the liquid crystal lens layer 2 to form a transition virtual image 101a, and the transition virtual image 101a undergoes a third imaging through the viewing lens 4 to form the virtual image finally seen by the viewer. After three imagings, the virtual image changes significantly, so that when the user looks at the flat screen through the viewing lens, the content displayed by each pixel group appears at a different distance; that is, a stereoscopic image is seen.
In this embodiment of this application, the liquid crystal lens layer 2 can make the pixel region displayed by each pixel group 10 undergo a first imaging; the refractive-index changes of the liquid crystal lens layer 2 shift the first virtual image of each pixel group's pixel region by a minute amount within 10 mm, and the viewing lens 4 then performs a final imaging of the first virtual image to form the virtual image seen by the viewer. The virtual image distance of the image formed by the viewing lens 4 magnifies these minute changes, so that the final virtual image distance is approximately the same as the actual shooting distance, achieving a viewing experience similar to that in a real scene.
In this embodiment of this application, the content displayed by the plurality of pixel groups is imaged by the liquid crystal lens units at different virtual image distances, and the viewer sees the content on the display screen as a stereoscopic effect through the liquid crystal lens layer. When the content displayed by a pixel group changes, the control module can change the refractive index of the corresponding liquid crystal lens unit and thereby the virtual image distance of the displayed content, so that the virtual image distance changes in real time with the displayed content and the viewer sees the corresponding stereoscopic effect in real time, avoiding dizziness and discomfort and improving the user experience.
All or part of the procedures of the foregoing method embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the procedures of the foregoing method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
FIG. 9 is a schematic structural diagram of an image stereoscopic display apparatus according to an embodiment of this application. As shown in FIG. 9, the image stereoscopic display apparatus 900 may include a processor 901 (e.g., a CPU), a network interface 904, a user interface 903, a memory 905, and a communication bus 902, where the communication bus 902 implements connection and communication among these components. The memory 905 may be a high-speed RAM or a non-volatile memory, e.g., at least one disk memory. As shown in FIG. 9, the memory 905, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an image stereoscopic display program.
The processor 901 may be configured to load the image stereoscopic display program stored in the memory 905 and specifically perform the following operations:
acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image including a plurality of pixel regions;
acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
The to-be-displayed image includes a left-eye display image and a right-eye display image; the left-eye display image includes a plurality of left-eye pixel regions, the right-eye display image includes a plurality of right-eye pixel regions, and the virtual display depth information is computed by analyzing the corresponding left-eye and right-eye pixel regions.
The virtual display depth information is measured when the to-be-displayed image is captured; alternatively, the virtual display depth information is marked when the virtual scene of the to-be-displayed image is created.
A viewing lens is arranged on the side of the liquid crystal lens layer away from the display screen, and a second virtual image for viewing by a viewer is generated by the viewing lens based on the first virtual image.
What is disclosed above is merely embodiments of this application and certainly does not limit the scope of the claims of this application. A person of ordinary skill in the art can understand all or part of the procedures for implementing the foregoing embodiments, and equivalent changes made in accordance with the claims of this application still fall within the scope covered by this application.

Claims (15)

  1. A display module, comprising a display screen, a liquid crystal lens layer, and a control module;
    wherein the display screen comprises a plurality of pixel groups, each pixel group comprising at least one pixel;
    the liquid crystal lens layer comprises a plurality of liquid crystal lens units, each pixel group being provided with a corresponding liquid crystal lens unit, and the liquid crystal lens units comprising liquid crystal molecules; and
    the control module is electrically connected to the display screen and the liquid crystal lens layer, and is configured to: acquire virtual display depth information of content to be displayed by each pixel group on the display screen; and after each pixel group displays the content, control, according to the virtual display depth information, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit corresponding to each pixel group, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
  2. The display module according to claim 1, further comprising a viewing lens, wherein the liquid crystal lens layer is located between the display screen and the viewing lens, and the viewing lens is configured to generate, based on the first virtual image, a second virtual image for viewing by a viewer.
  3. The display module according to claim 1 or 2, wherein a lens array layer is arranged on the side of the liquid crystal lens layer away from the display screen, the lens array layer comprising a plurality of convex lenses arranged in an array, each convex lens being provided with at least one corresponding liquid crystal lens unit.
  4. The display module according to claim 3, wherein the lens array layer is a microlens array layer, each convex lens being provided with one corresponding liquid crystal lens unit.
  5. The display module according to claim 1, wherein the display screen and the liquid crystal lens layer are attached to each other.
  6. A head-mounted display device, comprising the display module according to any one of claims 1 to 5.
  7. An image stereoscopic display method, comprising:
    acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image comprising a plurality of pixel regions;
    acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
    after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
  8. The image stereoscopic display method according to claim 7, wherein the to-be-displayed image comprises a left-eye display image and a right-eye display image, the left-eye display image comprising a plurality of left-eye pixel regions and the right-eye display image comprising a plurality of right-eye pixel regions, and the virtual display depth information is computed by analyzing the corresponding left-eye pixel regions and right-eye pixel regions.
  9. The image stereoscopic display method according to claim 7, wherein
    the virtual display depth information is measured when the to-be-displayed image is captured; or
    the virtual display depth information is marked when the virtual scene of the to-be-displayed image is created.
  10. The image stereoscopic display method according to claim 7, wherein a viewing lens is arranged on the side of the liquid crystal lens layer away from the display screen, and a second virtual image for viewing by a viewer is generated by the viewing lens based on the first virtual image.
  11. An image stereoscopic display apparatus, comprising a processor and a memory, the memory storing a computer program that is loaded by the processor to perform the following steps:
    acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image comprising a plurality of pixel regions;
    acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
    after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
  12. The image stereoscopic display apparatus according to claim 11, wherein the to-be-displayed image comprises a left-eye display image and a right-eye display image, the left-eye display image comprising a plurality of left-eye pixel regions and the right-eye display image comprising a plurality of right-eye pixel regions, and the virtual display depth information is computed by analyzing the corresponding left-eye pixel regions and right-eye pixel regions.
  13. The image stereoscopic display apparatus according to claim 11, wherein
    the virtual display depth information is measured when the to-be-displayed image is captured; or
    the virtual display depth information is marked when the virtual scene of the to-be-displayed image is created.
  14. The image stereoscopic display apparatus according to claim 11, wherein a viewing lens is arranged on the side of the liquid crystal lens layer away from the display screen, and a second virtual image for viewing by a viewer is generated by the viewing lens based on the first virtual image.
  15. A non-volatile storage medium, storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the following steps:
    acquiring a to-be-displayed image to be displayed on a display screen, the to-be-displayed image comprising a plurality of pixel regions;
    acquiring virtual display depth information corresponding to each of the pixel regions of the to-be-displayed image that correspond to a plurality of pixel groups on the display screen; and
    after the to-be-displayed image is displayed on the display screen, controlling, according to the virtual display depth information corresponding to each of the plurality of pixel regions, the electric field applied to the liquid crystal molecules in the liquid crystal lens unit that is arranged in the light-exit direction of the display screen and corresponds to the pixel group of each pixel region, so as to change the refractive index of the corresponding liquid crystal lens unit and thereby adjust, through the corresponding liquid crystal lens unit, a first virtual image of the content displayed by each pixel group.
PCT/CN2018/117429 2017-11-28 2018-11-26 显示模组、头戴式显示设备及图像立体显示方法和装置 WO2019105323A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/667,242 US11064187B2 (en) 2017-11-28 2019-10-29 Display module, head mounted display, and image stereoscopic display method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711217289.2 2017-11-28
CN201711217289.2A CN107884940A (zh) 2017-11-28 2017-11-28 显示模组、头戴式显示设备及图像立体显示方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/667,242 Continuation US11064187B2 (en) 2017-11-28 2019-10-29 Display module, head mounted display, and image stereoscopic display method and apparatus

Publications (1)

Publication Number Publication Date
WO2019105323A1 true WO2019105323A1 (zh) 2019-06-06

Family

ID=61775819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117429 WO2019105323A1 (zh) 2017-11-28 2018-11-26 显示模组、头戴式显示设备及图像立体显示方法和装置

Country Status (3)

Country Link
US (1) US11064187B2 (zh)
CN (1) CN107884940A (zh)
WO (1) WO2019105323A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107884940A (zh) 2017-11-28 2018-04-06 腾讯科技(深圳)有限公司 显示模组、头戴式显示设备及图像立体显示方法
CN108803024A (zh) * 2018-03-08 2018-11-13 成都理想境界科技有限公司 一种实现光场显示的近眼显示设备、近眼显示装置和屏幕
CN112154077A (zh) * 2018-05-24 2020-12-29 三菱电机株式会社 车辆用显示控制装置和车辆用显示控制方法
WO2019232768A1 (en) * 2018-06-08 2019-12-12 Chiu Po Hsien Devices for displaying 3d image
CN109188700B (zh) * 2018-10-30 2021-05-11 京东方科技集团股份有限公司 光学显示系统及ar/vr显示装置
EP3913912A4 (en) * 2019-01-14 2022-08-24 BOE Technology Group Co., Ltd. DISPLAY DEVICE, ELECTRONIC DEVICE AND DRIVE METHOD FOR A DISPLAY DEVICE
US11509882B2 (en) 2019-08-26 2022-11-22 Beijing Boe Optoelectronics Technology Co., Ltd. Three-dimensional display apparatus and virtual reality device
CN111290164A (zh) 2020-03-31 2020-06-16 京东方科技集团股份有限公司 透明显示面板、显示装置及眼镜
CN111427166B (zh) * 2020-03-31 2022-07-05 京东方科技集团股份有限公司 一种光场显示方法及系统、存储介质和显示面板
CN111782063B (zh) * 2020-06-08 2021-08-31 腾讯科技(深圳)有限公司 实时显示方法、系统及计算机可读存储介质和终端设备
CN113093402B (zh) * 2021-04-16 2022-12-02 业成科技(成都)有限公司 立体显示器及其制造方法与立体显示系统
CN114624886A (zh) * 2022-03-14 2022-06-14 惠州Tcl移动通信有限公司 成像装置、方法及增强或虚拟现实设备
CN115016139A (zh) * 2022-07-18 2022-09-06 未来科技(襄阳)有限公司 元宇宙3d显示系统、方法和相关设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08313848A (ja) * 1995-05-16 1996-11-29 Citizen Watch Co Ltd 接眼端末装置
US5825456A (en) * 1995-05-24 1998-10-20 Olympus Optical Company, Ltd. Stereoscopic video display apparatus
CN204331219U (zh) * 2014-12-03 2015-05-13 京东方科技集团股份有限公司 一种3d显示装置
CN105242405A (zh) * 2012-06-07 2016-01-13 冯林 基于电致折射率改变的3d显示方法
CN205562957U (zh) * 2016-01-12 2016-09-07 叠境数字科技(上海)有限公司 基于液晶透镜阵列的头戴式显示设备
CN107884940A (zh) * 2017-11-28 2018-04-06 腾讯科技(深圳)有限公司 显示模组、头戴式显示设备及图像立体显示方法

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7034784B2 (en) * 2001-11-22 2006-04-25 Sharp Kabushiki Kaisha Optical shifter and optical display system
CA2627999C (en) * 2007-04-03 2011-11-15 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry Through The Communications Research Centre Canada Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images
KR101350475B1 (ko) * 2007-04-12 2014-01-15 Samsung Electronics Co., Ltd. High-efficiency 2D/3D convertible image display device
US20120105747A1 (en) * 2010-10-31 2012-05-03 Sajal Biring Optical system for displaying three-dimensional images and associated method
TW201314267A (zh) * 2011-05-30 2013-04-01 Koninkl Philips Electronics Nv Autostereoscopic display device
US8705177B1 (en) * 2011-12-05 2014-04-22 Google Inc. Integrated near-to-eye display module
CN102854694B (zh) * 2012-09-25 2016-12-21 Shenzhen China Star Optoelectronics Technology Co., Ltd. 2D/3D switchable liquid crystal lens assembly
CN105093541A (zh) 2014-05-22 2015-11-25 Huawei Technologies Co., Ltd. Display device
CN104076572B (zh) * 2014-06-20 2017-01-18 BOE Technology Group Co., Ltd. Fresnel liquid crystal lens panel, method for manufacturing same, and 3D display using same
KR101648119B1 (ko) * 2014-10-29 2016-08-16 SKC Haas Display Films Co., Ltd. Film for autostereoscopic image display device
JPWO2016072194A1 (ja) * 2014-11-07 2017-09-14 Sony Corporation Display device and display control method
CN104360533B (zh) * 2014-12-03 2017-08-29 BOE Technology Group Co., Ltd. 3D display device and display driving method thereof
US9846310B2 (en) * 2015-06-22 2017-12-19 Innolux Corporation 3D image display device with improved depth ranges
CN106291959B (zh) 2016-10-31 2018-02-27 BOE Technology Group Co., Ltd. Virtual display panel and display device

Also Published As

Publication number Publication date
US20200068191A1 (en) 2020-02-27
US11064187B2 (en) 2021-07-13
CN107884940A (zh) 2018-04-06

Similar Documents

Publication Publication Date Title
WO2019105323A1 (zh) Display module, head-mounted display device, and stereoscopic image display method and apparatus
US11109015B2 (en) Display apparatus, display apparatus driving method, and electronic instrument
KR100440956B1 (ko) 2D/3D convertible display
US8363156B2 (en) Single-lens 2D/3D digital camera
TWI665905B (zh) 立體影像產生方法、成像方法與系統
CN203522943U (zh) Image acquisition device and 3D display system
US20190035364A1 (en) Display apparatus, method of driving display apparatus, and electronic apparatus
US10642061B2 (en) Display panel and display apparatus
JP2008085503A (ja) Three-dimensional image processing device, method, and program, and three-dimensional image display device
WO2016026338A1 (zh) Stereoscopic imaging device and method, display, and terminal
WO2019000948A1 (zh) Three-dimensional display panel, display method therefor, and display device
US10775617B2 (en) Eye tracked lens for increased screen resolution
KR100440955B1 (ko) 2D/3D convertible display
EP2408217A2 (en) Method of virtual 3d image presentation and apparatus for virtual 3d image presentation
US20230077212A1 (en) Display apparatus, system, and method
CN206133120U (zh) Display panel and display device
KR100274624B1 (ko) Apparatus for generating three-dimensional stereoscopic images using stacked liquid crystal displays
JP2001218231A (ja) Device and method for displaying stereoscopic images
JP3825414B2 (ja) Three-dimensional display device
RU2609285C9 (ru) Method of forming a multi-plane image and a multifocal stereoscopic display
Wu et al. Multi-view autostereoscopic display based on tilt-fit lenticular lenses
JPH05100206A (ja) Display device
JP6179282B2 (ja) Three-dimensional image display device and three-dimensional image display method
JPH06233326A (ja) Stereoscopic video system
Kakeya et al. Realization of precise depth perception with coarse integral volumetric imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18884815
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18884815
    Country of ref document: EP
    Kind code of ref document: A1