WO2019232768A1 - Devices for displaying 3d image - Google Patents

Devices for displaying 3d image

Info

Publication number
WO2019232768A1
Authority
WO
WIPO (PCT)
Prior art keywords
converging element
refracting
image
pixel
unit
Prior art date
Application number
PCT/CN2018/090325
Other languages
French (fr)
Inventor
Po-Hsien Chiu
Original Assignee
Chiu Po Hsien
Priority date
Filing date
Publication date
Application filed by Chiu Po Hsien filed Critical Chiu Po Hsien
Priority to PCT/CN2018/090325 priority Critical patent/WO2019232768A1/en
Publication of WO2019232768A1 publication Critical patent/WO2019232768A1/en

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/004 - Optical devices or arrangements for the control of light using movable or deformable optical elements based on a displacement or a deformation of a fluid
    • G02B26/005 - Optical devices or arrangements for the control of light using movable or deformable optical elements based on a displacement or a deformation of a fluid based on electrowetting
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 - Simple or compound lenses
    • G02B3/12 - Fluid-filled or evacuated lenses
    • G02B3/14 - Fluid-filled or evacuated lenses of variable focal length
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • G02B30/28 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays involving active lenticular arrays
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/322 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using varifocal lenses or mirrors
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B2207/00 - Coding scheme for general features or characteristics of optical elements and systems of subclass G02B, but not including elements and systems which would be classified in G02B6/00 and subgroups
    • G02B2207/115 - Electrowetting
    • G - PHYSICS
    • G02 - OPTICS
    • G02F - OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
    • G - PHYSICS
    • G02 - OPTICS
    • G02F - OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
    • G02F1/294 - Variable focal length devices

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Liquid Crystal (AREA)

Abstract

An image display device presents virtual images to the eyes to present a 3D image without the need for 3D glasses. The image display device comprises a pixel array and a refracting array. Refracting units of the refracting array refract light from pixel units of the pixel array to form a virtual image that presents the 3D image.

Description

DEVICES FOR DISPLAYING 3D IMAGE
The present disclosure relates generally to displays and, more particularly, to devices for displaying 3D images.
Traditional displays present the image directly on the screen, and thus the distance from the screen to the eyes is the distance from the image to the eyes. In addition, a traditional 3D display, which delivers different images to a user's two eyes through glasses, requires the user to wear the glasses.
The present disclosure provides an image display device, including: a pixel array including pixel units; and a refracting array including refracting units. A refracting unit refracts a light from a pixel unit of the pixel units to form a pixel image of the pixel unit, and the refracting unit is controlled to change a refracted direction of the light to change an image distance of the pixel image of the pixel unit.
Further, the refracting units can be controlled differently from each other to have different image distances of pixel images of the pixel units.
The foregoing has outlined rather broadly the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. Additional features and advantages of the disclosure will be described hereinafter, and form the subject of the claims of the disclosure. It may be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures or processes for carrying out the same purposes of the present disclosure. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the disclosure as set forth in the appended claims.
Traditional displays present the image directly on the screen, and thus the distance from the screen to the eyes is the distance from the image to the eyes. Because the screen is usually close to the eyes, the eyes are prone to fatigue due to the short focus distance and may suffer from conditions such as myopia. In addition, a traditional 3D display, which delivers different images to a user's two eyes through glasses, requires the user to wear the glasses, and is thus inconvenient to the user and difficult to popularize. Therefore, there is a need for a new technique to solve these problems.
To solve the above problems, the image display device provided by the present disclosure presents an image displayed in a display device as a virtual image to the eyes, and thus the image can be seen as being beyond the screen.
The image display device provided by the present disclosure can present the 3D images without wearing the 3D glasses.
A more complete understanding of the present disclosure may be derived by referring to the detailed description and claims when considered in connection with the Figures, where like reference numbers refer to similar elements throughout the Figures. Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
FIG. 1A is a block diagram of a refracting unit for forming a pixel image of a pixel unit of a display on the basis of an embodiment of the present disclosure.
FIG. 1B is a side view of a refracting unit on the basis of an embodiment of the present disclosure.
FIG. 2A is a block diagram of a device for displaying 3D images on the basis of another embodiment of the present disclosure.
FIG. 2B is a perspective view of a device for displaying 3D images based on an embodiment of the present disclosure.
FIGS. 3A-3C are side views of a refracting unit including two converging elements on the basis of embodiments of the present disclosure.
FIGS. 4A and 4B are side views of a refracting unit including a diverging element and a converging element on the basis of some embodiments of the present disclosure.
FIGS. 5A-5D are sectional views of a pixel unit and a refracting unit with one or two electro-wetting lenses based on some embodiments of the present disclosure.
FIGS. 6A-6D are sectional views of a pixel unit and a refracting unit with one or two liquid crystal lenses based on some embodiments of the present disclosure.
FIG. 7 is a partial circuit layout of a refracting array based on some embodiments of the present disclosure.
Embodiments, or examples, of the disclosure illustrated in the drawings are now described using specific language. It shall be understood that no limitation of the scope of the disclosure is hereby intended. Any alteration or modification of the described embodiments, and any further applications of principles described in this document, are to be considered as normally occurring to one of ordinary skill in the art to which the disclosure relates. Reference numerals may be repeated throughout the embodiments, but this does not necessarily mean that feature(s) of one embodiment apply to another embodiment, even if they share the same reference numeral.
It shall be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers or sections, these elements, components, regions, layers or sections are not limited by these terms. Rather, these terms are merely used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present inventive concept.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the present inventive concept. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It shall be further understood that the terms “comprises” and “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or groups thereof.
FIG. 1A is a block diagram of a refracting unit 110 for forming a pixel image 214 of a pixel unit 210 of a display on the basis of an embodiment of the present disclosure. Referring to FIG. 1A, light traveling in a direction 22 from the pixel unit 210 is refracted into another direction 22’ by the refracting unit 110. Thus, the pixel image 214 seen by the eyes 20 is formed along the refracted direction 22’ at a position with an image distance i away from the refracting unit 110, and the image distance i may differ from the object distance p of the pixel unit 210.
FIG. 2A is a block diagram of a device 10 for displaying 3D images on the basis of another embodiment of the present disclosure. The device 10 includes a pixel array 200 and a refracting array 100. The pixel array 200 includes pixel units 210 and can be a traditional display or monitor. The refracting array 100 includes refracting units 110. When light from pixel units 210, 210’ is refracted respectively by the corresponding refracting units 110, 110’, pixel images are formed, and the image distances of the pixel images of the pixel units 210, 210’ may differ because the refractions of the refracting units 110 can be controlled independently of each other. Therefore, when light from the pixel array 200 is refracted by the refracting array 100, a virtual image with objects at different image distances is formed, and thus a 3D image is formed.
FIG. 2B is a perspective view of a device 10 for displaying 3D images based on an embodiment of the present disclosure. Referring to FIG. 2B, a device 10 includes a refracting array 100 and a pixel array 200. A far object image 201, for example a mountain, and a near object image 202, for example a person, are presented in the pixel array 200. The refracting units of the refracting array 100 refract light from pixels of the pixel array 200, including the pixels presenting the far object image 201 and the near object image 202. Thus, as seen by the eyes 20, a far virtual object 701 of the far object image 201 and a near virtual object 702 of the near object image 202 are formed. Because the refraction of each refracting unit can be controlled independently of the others, the far virtual object 701 and the near virtual object 702 appear at a long image distance i1 and a short image distance i2, respectively. Thus, the device 10 of the present disclosure can display a virtual image with virtual objects 701, 702 at different image distances i1, i2.
The present disclosure provides an image display device that presents a 3D image by taking the pixels of an image display as objects and refracting the light from those pixels with the refracting units 110 to form virtual images of the pixels. Therefore, the 3D image can be achieved without traditional three-dimensional glasses. Further, since the virtual image can be presented at an image distance far away from the display, the eyes of users benefit from the effect of looking into the distance, which can also help prevent myopia.
FIG. 1B is a side view of a refracting unit 110 on the basis of an embodiment of the present disclosure. The refracting unit 110 can comprise a converging element 112, which can be transparent, to converge the light from the pixel unit and achieve the optical properties of a biconvex shape, a plano-convex shape, or a positive meniscus shape. For such a convex lens, when the pixel unit 210 is positioned equivalent to or less than a focal length f away from the converging element 112, a pixel image 214 of the pixel unit 210 is formed.
Based on the thin lens formula, the image distance i from the pixel image 214 to the converging element 112 is given by:
1/p + 1/i = 1/f (1)
Wherein p is the object distance from the pixel unit 210 to the converging element 112, and f is the focal length of the converging element 112.
Based on Equation (1), the image distance i of the pixel image 214 gets longer as the pixel unit 210 gets closer to the focal point. Thus, the focal length f or the object distance p can be controlled to change the image distance i of the pixel image 214.
Referring to FIG. 1B, because the focal length f can be changed and the converging element 112 may have a minimum focal length resulting from its physical properties, the pixel unit 210 may be positioned equivalent to or less than the minimum focal length away from the converging element 112 to ensure that the pixel image 214 can be formed.
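As a numerical illustration of Equation (1), the minimal Python sketch below uses the common sign convention in which a negative image distance denotes a virtual image on the same side as the object; the focal length and object distances are illustrative assumptions, not values from the disclosure.

```python
# A minimal sketch of Equation (1) for a converging element with the pixel unit
# placed inside the focal length; all numbers are illustrative assumptions.

def image_distance(p: float, f: float) -> float:
    """Thin-lens image distance i from 1/p + 1/i = 1/f."""
    if p == f:
        return float("inf")          # object at the focal point: image at infinity
    return p * f / (p - f)

f = 2.0                              # focal length of the converging element (arbitrary units)
for p in (0.5, 1.0, 1.5, 1.9):       # pixel unit positioned within the focal length (p <= f)
    i = image_distance(p, f)
    # i is negative: the pixel image is virtual, on the same side as the pixel unit,
    # and its distance |i| grows as the pixel unit approaches the focal point.
    print(f"p = {p:.1f}  ->  i = {i:+.2f}  (virtual image distance |i| = {abs(i):.2f})")
```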
FIG. 3A is a side view of a refracting unit 110 including two converging elements on the basis of embodiments of the present disclosure. The refracting unit 110 includes a first converging element 111a and a second converging element 112. The first converging element 111a can be a microlens with a biconvex shape, a plano-convex shape, or a positive meniscus shape, and the focal length f1 of the first converging element 111a can be fixed. The first converging element 111a converges the light from the pixel unit 210, and the first converging element 111a is positioned equivalent to or more than a focal length f1 of the first converging element 111a away from the pixel unit 210 to form a pixel image 212a of the pixel unit 210 a first distance d1 away from the first converging element 111a.
Referring to FIG. 3A, the second converging element 112 converges the light from the first converging element 111a and refracts the real pixel image 212a to form a pixel image 214. The second converging element 112 may be positioned equivalent to or more than the first distance d1 away from the first converging element 111a. Further, while the focal length of the second converging element 112 can be controlled to be changed, the second converging element 112 may have a minimum focal length f2Min, and the second converging element 112 is positioned such that its front focal point F2 at the minimum focal length f2Min is equivalent to or less than the first distance d1 away from the first converging element 111a.
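The geometry of FIG. 3A can be checked by applying the thin-lens formula to each element in turn, as in the sketch below. The spacings and focal lengths are assumed, illustrative values; the point is only that when the intermediate real image lies at or within the front focal length of the second converging element, the final pixel image is virtual.

```python
# A minimal sketch of the two-converging-element arrangement of FIG. 3A under the
# thin-lens approximation; spacings and focal lengths below are illustrative assumptions.

def thin_lens(p: float, f: float) -> float:
    """Image distance from 1/p + 1/i = 1/f; a negative result is a virtual image."""
    return float("inf") if p == f else p * f / (p - f)

f1, p1 = 1.0, 1.5        # first converging element: fixed focal length, pixel unit beyond f1
d1 = thin_lens(p1, f1)   # real intermediate image 212a, a distance d1 behind the first element
print(f"intermediate real image at d1 = {d1:.2f}")   # 3.00 with these numbers

f2, s = 2.0, 4.5         # second element: minimum focal length f2Min, separation s >= d1
assert s - f2 <= d1      # front focal point F2 no farther than d1 from the first element
p2 = s - d1              # the intermediate image acts as the object of the second element
i2 = thin_lens(p2, f2)   # p2 <= f2, so the final pixel image 214 is virtual (i2 < 0)
print(f"final image distance i2 = {i2:+.2f} (virtual, since p2 = {p2:.2f} <= f2 = {f2})")
```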
FIGS. 3B and 3C are side views of a refracting unit 110 including two converging elements on the basis of embodiments of the present disclosure. The refracting unit 110 includes a first converging element 111a and a second converging element 112. The first converging element 111a converges the light from the pixel unit 210.
FIG. 3B illustrates that the first converging element 111a is controlled to have a maximum focal length f1Max. When the focal length of the first converging element 111a is controlled to be changed, the first converging element 111a is positioned equivalent to or more than twice the maximum focal length f1Max of the first converging element 111a away from the pixel unit 210 to ensure that the real pixel image 212a is formed smaller than the pixel unit 210.
FIG. 3C illustrates that the first converging element 111a of FIG. 3B is controlled to have a minimum focal length f1Min. The second converging element 112 converges the light from the first converging element 111a and refracts the real pixel image 212a to form a pixel image 214. When the focal length of the second converging element 112 can be controlled to be changed, the second converging element 112 is positioned such that a front focal point F2 of the second converging element 112 at a minimum focal length f2Min is equivalent to or less than a minimum focal length f1Min of the first converging element away from the first converging element. Returning to FIG. 3B, the second converging element 112 may be positioned equivalent to or more than twice the maximum focal length f1Max of the first converging element 111a away from the first converging element 111a.
Alternatively, the second converging element 112 can be a microlens with a biconvex shape, a plano-convex shape, or a positive meniscus shape, and thus have a fixed focal length. In that case, the second converging element 112 is positioned such that its front focal point F2 at the fixed focal length is equivalent to or less than a minimum focal length f1Min of the first converging element away from the first converging element.
FIGS. 4A and 4B are side views of a refracting unit 110 including a diverging element and a converging element on the basis of some embodiments of the present disclosure. The refracting unit 110 includes a diverging element 111b and a converging element 112. The diverging element 111b diverges the light from the pixel unit 210 to form a pixel image 212b.
FIG. 4A illustrates that the diverging element 111b is controlled to have a maximum focal length f1Max. When a focal length of the diverging element 111b can be controlled to be changed, the diverging element 111b is positioned equivalent to or more than the maximum focal length f1Max of the diverging element 111b away from the pixel unit 210 to form a pixel image 212b that is smaller than the pixel unit 210.
FIG. 4B illustrates that the diverging element 111b of FIG. 4A is controlled to have a minimum focal length f1Min. The converging element 112 converges the light from the diverging element 111b and refracts the pixel image 212b to form a pixel image 214. When the focal length of the converging element 112 can be controlled to be changed, the converging element 112 is positioned such that a front focal point F2 of the converging element 112 at a minimum focal length f2Min coincides with, or is nearer to the pixel unit 210 than, a front focal point F1 of the diverging element 111b at a minimum focal length f1Min.
In some embodiments of the present disclosure, a focal length of the diverging element 111b can be controlled to be changed, while the converging element 112 is a microlens with a biconvex shape, a plano-convex shape, or a positive meniscus shape and thus has a fixed focal length.
In other embodiments of the present disclosure, the diverging element 111b is a microlens with a biconcave shape, a plano-concave shape, or a negative meniscus shape and thus has a fixed focal length, while a focal length of the converging element 112 is controlled to be changed.
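The same thin-lens bookkeeping can be applied to the diverging-plus-converging arrangement of FIGS. 4A and 4B, treating the diverging element as a thin lens with a negative focal length. The sketch below uses assumed, illustrative values only; it shows that the reduced virtual intermediate image 212b falls inside the front focal length of the converging element, so the final pixel image 214 is again virtual.

```python
# A minimal sketch of the diverging-plus-converging arrangement of FIGS. 4A-4B under
# the thin-lens approximation; all numerical values are illustrative assumptions.

def thin_lens(p: float, f: float) -> float:
    """Image distance from 1/p + 1/i = 1/f; a negative result is a virtual image."""
    return float("inf") if p == f else p * f / (p - f)

f1_min, p1 = 1.0, 1.2         # diverging element 111b at its minimum focal length; pixel beyond it
i1 = thin_lens(p1, -f1_min)   # virtual, reduced intermediate image 212b on the pixel side
print(f"intermediate virtual image at {i1:+.3f} (|i1| < f1)")

s, f2_min = 0.5, 2.0          # element separation and minimum focal length of element 112
assert f2_min - s >= f1_min   # F2 coincides with, or is nearer to the pixel unit than, F1
p2 = s + abs(i1)              # the virtual image 212b is the object of the converging element
i2 = thin_lens(p2, f2_min)    # p2 < f2_min, so the final pixel image 214 is again virtual
print(f"final image distance {i2:+.2f} (virtual, enlarged relative to 212b)")
```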
FIG. 5A is a sectional view of a pixel unit 210 and a refracting unit 110a with an electro-wetting lens based on some embodiments of the present disclosure. The electro-wetting lens includes two immiscible liquids 516, 517, one of which is a conducting liquid, for example water, and the other of which can be oil. The two immiscible liquids 516, 517 can be contained in a space surrounded by electrodes 514, 515, a substrate 518, and a window 519. A transistor 522 can be formed on the substrate 518 and connected to the electrodes 514, 515. The electrodes 514, 515 can apply a voltage across the two immiscible liquids 516, 517 to change the curvature of the interface between the two immiscible liquids 516, 517 and hence the refraction. Therefore, the refracting unit 110a can refract the light from the pixel unit 210 by the electro-wetting lens to control the image distance of the pixel image of the pixel unit 210.
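As a rough quantitative illustration of why curvature control changes the refraction: in a cell with flat windows, essentially all of the optical power sits at the liquid-liquid interface, so the paraxial focal length is approximately R/(n2 - n1), where R is the interface radius of curvature. The refractive indices in the sketch below are assumed typical values, not a characterization of any particular device.

```python
# A rough, hedged estimate (not from the disclosure) of how the interface curvature,
# which the applied voltage controls, sets the focal length of the electro-wetting lens:
# for a single spherical interface between the two immiscible liquids 516, 517 bounded
# by flat windows, f ~ R / (n_oil - n_water) in the paraxial approximation.

n_water, n_oil = 1.33, 1.50          # conducting liquid and oil (assumed typical indices)
delta_n = n_oil - n_water

for R_mm in (0.5, 1.0, 2.0, 5.0):    # interface radius of curvature set by the applied voltage
    f_mm = R_mm / delta_n            # paraxial focal length of the liquid-liquid interface
    print(f"R = {R_mm:4.1f} mm  ->  f ~ {f_mm:5.1f} mm")
```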
FIG. 5B is a sectional view of a pixel unit 210 and a refracting unit 110b with an electro-wetting lens and a microlens 546 in a sequential order based on some embodiments of the present disclosure. In addition to the electro-wetting lens illustrated in FIG. 5A, the refracting unit 110b may further include a microlens 546 separated from the electro-wetting lens by a space 544. Therefore, the refracting unit 110b refracts the light from the pixel unit 210 by the electro-wetting lens and then by the microlens 546 to control the image distance of the pixel image of the pixel unit 210.
FIG. 5C is a sectional view of a pixel unit 210 and a refracting unit 110c with a microlens 546 and an electro-wetting lens in a sequential order based on some embodiments of the present disclosure. The refracting unit 110c refracts the light from the pixel unit 210 by the microlens 546 and then by the electro-wetting lens to control the image distance of the pixel image of the pixel unit 210.
FIG. 5D is a sectional view of a pixel unit 210 and a refracting unit 110d with two electro-wetting lenses in a sequential order based on some embodiments of the present disclosure. The refracting unit 110d refracts the light from the pixel unit 210 by two electro-wetting lenses to control the image distance of the pixel image of the pixel unit 210.
FIG. 6A is a sectional view of a pixel unit 210 and a refracting unit 110a’ with a liquid crystal lens based on some embodiments of the present disclosure. The liquid crystal lens may include electrodes 614, 615 and a liquid crystal layer 616 with liquid crystals 617. The liquid crystal lens may further comprise a substrate 618. A transistor 622 can be formed on the substrate 618 and connected to the electrodes 614, 615, and the transistor 622 can be a MOSFET. Optionally, a capacitor 624 can be formed on the substrate 618 and connected to the electrodes 614, 615. Further, a spacer 628 can be formed in the liquid crystal layer 616.
Referring to FIG. 6A, the electrodes 614, 615 can generate an electric field across the liquid crystal layer 616 to change alignments of the liquid crystals 617 to change the refraction. Therefore, the refracting unit 110a’ can refract the light from the pixel unit 210 by the liquid crystal lens to control the image distance of the pixel image of the pixel unit 210.
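One common way to estimate the tuning range of such a liquid crystal lens is to model it as a gradient-index element with a parabolic phase profile, giving f approximately r^2 / (2 d delta_n), where r is the aperture radius, d the liquid crystal layer thickness, and delta_n the center-to-edge index difference induced by the electric field. The sketch below applies this approximation with assumed dimensions; neither the formula nor the numbers come from the disclosure.

```python
# A rough, hedged estimate (not from the disclosure) of how reorienting the liquid
# crystals 617 tunes focus, modelling the cell as a gradient-index lens with a
# parabolic index profile: f ~ r**2 / (2 * d * delta_n). All numbers are assumed.

r_um, d_um = 50.0, 20.0                      # aperture radius and LC layer 616 thickness (micrometres)
for delta_n in (0.05, 0.10, 0.15, 0.20):     # induced index difference, driven by the electrodes
    f_um = r_um**2 / (2.0 * d_um * delta_n)  # focal length in micrometres
    print(f"delta_n = {delta_n:.2f}  ->  f ~ {f_um / 1000:.2f} mm")
```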
FIG. 6B is a sectional view of a pixel unit 210 and a refracting unit 110b’ with a liquid crystal lens and a microlens 646 in a sequential order based on some embodiments of the present disclosure. In addition to the liquid crystal lens illustrated in FIG. 6A, the refracting unit 110b’ may further include a microlens 646 separated from the liquid crystal lens by a space 644. Therefore, the refracting unit 110b’ refracts the light from the pixel unit 210 by the liquid crystal lens and then by the microlens 646 to control the image distance of the pixel image of the pixel unit 210.
FIG. 6C is a sectional view of a pixel unit 210 and a refracting unit 110c’ with a microlens 646 and a liquid crystal lens in a sequential order based on some embodiments of the present disclosure. The refracting unit 110c’ refracts the light from the pixel unit 210 by the microlens 646 and then by the liquid crystal lens to control the image distance of the pixel image of the pixel unit 210.
FIG. 6D is a sectional view of a pixel unit 210 and a refracting unit 110d’ with two liquid crystal lenses in a sequential order based on some embodiments of the present disclosure. The refracting unit 110d’ refracts the light from the pixel unit 210 by two liquid crystal lenses to control the image distance of the pixel image of the pixel unit 210.
FIG. 7 is a partial circuit layout of a refracting array 100 based on some embodiments of the present disclosure. The refracting array further includes source lines 132, gate lines 122, a source driver 130, a gate driver 120 and a controller 140. Each of the refracting units 110 may further include a transistor 114, and the transistor 114 is connected to one of the gate lines 122 and one of the source lines 132. The source lines 132 may apply voltages to the refracting units 110. The gate lines 122 control whether the voltages are applied to the refracting units 110. The source driver 130 is connected to the source lines 132. The gate driver 120 is connected to the gate lines 122. The controller 140 is connected to the gate driver 120 and the source driver 130. The controller 140 receives imaging data concerning image distances of pixels of an image to be displayed, and controls the refracting units 110 through the source lines 132 and the gate lines 122 based on the imaging data. Because the refraction of the refracting units 110 can be controlled to be changed, when the imaging data is a video, the image distances of the pixel images of the pixel units can be changed with time. Alternatively, the controller 140 may generate the image distances of the pixel images of the pixel units by scaling the image distances of the pixels of the image to be displayed based on limitations of the refracting array 100.
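A minimal, hypothetical sketch of the controller behaviour described above is given below: per-pixel image distances from the imaging data are rescaled into the range the refracting array can realize, and a target focal length for each refracting unit is derived from Equation (1). The function names, the linear scaling rule, and all numbers are illustrative assumptions rather than part of the disclosure.

```python
# A minimal, hypothetical sketch of the scaling step performed by the controller 140;
# the scaling rule and the numerical values are illustrative assumptions only.

from typing import List

def scale_image_distances(target: List[float], i_min: float, i_max: float) -> List[float]:
    """Linearly rescale requested image distances into the array's achievable range."""
    lo, hi = min(target), max(target)
    if hi == lo:
        return [0.5 * (i_min + i_max)] * len(target)
    return [i_min + (t - lo) * (i_max - i_min) / (hi - lo) for t in target]

def focal_length_for(i: float, p: float) -> float:
    """Focal length giving a virtual image at distance i for object distance p
    (magnitude form of Equation (1): 1/f = 1/p - 1/i)."""
    return p * i / (i - p)

p = 1.0                                   # object distance of a pixel unit (arbitrary units)
requested = [3.0, 12.0, 40.0, 7.5]        # image distances taken from the imaging data
achievable = scale_image_distances(requested, i_min=2.0, i_max=20.0)
for i in achievable:
    print(f"i = {i:5.2f}  ->  drive refracting unit toward f = {focal_length_for(i, p):.3f}")
```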
Although the present disclosure and its advantages have been described in detail, it may be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the disclosure as defined by the appended claims. For example, many of the processes discussed above can be implemented in different methodologies and replaced by other processes, or a combination thereof.
Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present disclosure, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present disclosure. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.
The device for displaying 3D images provided in the present disclosure can be widely used in many 3C products, such as cellphones, notebooks, displays, and the like.
The reference signs are listed below:
10 device
20 eyes
22 direction of lights
22' refracted direction of lights
100 refracting array
110, 110', 110a, 110b, 110c, 110d, 110a', 110b', 110c', 110d' refracting unit
111a first converging element
111b diverging element
112 second converging element
120 gate driver
122 gate line
130 source driver
132 source line
140 controller
200 pixel array
201 far object image
202 near object image
210, 210' pixel units
212a, 212b pixel image
214 pixel image
514, 515 electrode
516, 517 immiscible liquids
518 substrate
519 window
522 transistor
544 space
546 microlens
614, 615 electrode
616 liquid crystal layer
617 liquid crystal
618 substrate
622 transistor
624 capacitor
628 spacer
644 space
646 microlens
701 far virtual object
702 near virtual object
i image distance
i1 long image distance
i2 short image distance
d1 first distance
f, f1, f2 focal length
f1Max, f2Max maximum focal length
f1Min, f2Min minimum focal length
p object distance
F, F1, F2 focal point
The reference signs are listed above.
Citation List:
International Publication No. WO2017189230A2
US Patent Application Publication No. 20160313697A1
US Patent Application Publication No. 20100033813A1
US Patent Application Publication No. 20140168051A1
US Patent Application Publication No. 20150370079A1
US Patent Application Publication No. 20140022511A1
US Patent Application Publication No. 20170212360A1

Claims (20)

  1. An image display device, comprising:
    a pixel array comprising a pixel unit; and
    a refracting array comprising a refracting unit, wherein the refracting unit refracts a light from the pixel unit to form a pixel image of the pixel unit,
    wherein the refracting unit is controlled to change a refracted direction of the light to change an image distance of the pixel image of the pixel unit.
  2. The device of claim 1, wherein the refracting unit is controlled differently from other refracting units of the refracting array to present the pixel image of the pixel unit in a different distance.
  3. The device of claim 1, wherein the refracting unit comprises:
    a converging element to converge the light from the pixel unit,
    wherein a focal length of the converging element or a distance from the pixel unit to the converging element is controlled to change the image distance of the pixel image of the pixel unit, and
    the pixel unit is positioned equivalent to or less than a minimum focal length of the converging element away from the converging element.
  4. The device of claim 1, wherein the refracting unit comprises:
    a first converging element to converge the light from the pixel unit, wherein the first converging element is positioned equivalent to or more than a focal length of the first converging element away from the pixel unit to form a pixel image of the pixel unit a first distance away from the first converging element; and
    a second converging element to converge the light from the first converging element.
  5. The device of claim 4, wherein
    the first converging element is a microlens with a biconvex shape, a plano-convex shape, or a positive meniscus shape; and
    a focal length of the second converging element is controlled to be changed, wherein the second converging element is positioned that a front focal point of the second converging element with a minimum focal length is equivalent to or less than the first distance away from the first converging element.
  6. The device of claim 1, wherein the refracting unit comprises
    a first converging element to converge the light from the pixel unit, wherein the first converging element is positioned equivalent to or more than twice a maximum focal length of the first converging element away from the pixel unit; and
    a second converging element to converge the light from the first converging element, wherein the second converging element is positioned equivalent to or more than twice the maximum focal length of the first converging element away from the first converging element.
  7. The device of claim 6, wherein a front focal point of the second converging element with a minimum focal length is equivalent to or less than a minimum focal length of the first converging element away from the first converging element.
  8. The device of claim 6, wherein the second converging element is a microlens with a biconvex shape, a plano-convex shape, or a positive meniscus shape.
  9. The device of claim 1, wherein the refracting unit comprises:
    a diverging element to diverge the light from the pixel unit, wherein the diverging element is positioned equivalent to or more than a maximum focal length of the diverging element away from the pixel unit; and
    a converging element to converge the light from the diverging element wherein the converging element is positioned that a front focal point of the converging element with a minimum focal length is the same or nearer to the pixel unit in comparison with a front focal point of the diverging element with a minimum focal length.
  10. The device of claim 9, wherein
    a focal length of the diverging element is controlled to be changed; and
    the converging element is a microlens with a biconvex shape, a plano-convex shape, or a positive meniscus shape.
  11. The device of claim 9, wherein
    the diverging element is a microlens with a biconcave shape, a plano-concave shape, or a negative meniscus shape; and
    a focal length of the converging element is controlled to be changed.
  12. The device of claim 1, wherein the refracting unit comprises:
    a conducting liquid; and
    an electrode to apply a voltage across the conducting liquid to change a shape of the conducting liquid to change a refraction.
  13. The device of claim 12, wherein the refracting unit comprises:
    a substrate; and
    a transistor formed on the substrate and connected to the electrode.
  14. The device of claim 1, wherein the refracting unit comprises:
    a liquid crystal layer comprising liquid crystals; and
    an electrode to generate an electric field across the liquid crystal layer to change alignments of the liquid crystals to change a refraction.
  15. The device of claim 14, wherein the refracting unit further comprises:
    a substrate;
    a transistor formed on the substrate and connected to the electrode; and
    a capacitor formed on the substrate and connected to the electrode.
  16. The device of claim 1, wherein the refracting array further comprises:
    a source line for applying a voltage to the refracting unit;
    a gate line for controlling if the voltage is applied to the refracting unit;
    a source driver connected to the source line;
    a gate driver connected to the gate line; and
    a controller connected to the source driver and the gate driver.
  17. The device of claim 16, wherein the refracting unit further comprises a transistor connected to the gate line and the source line.
  18. The device of claim 16, wherein the controller executes:
    receiving imaging data concerning image distances of pixels of an image to be displayed; and
    controlling the refracting unit through the source line and the gate line based on the imaging data.
  19. The device of claim 18, wherein the controller generates the image distance of the pixel image of the pixel unit by scaling one of the image distances of the pixels of the image to be displayed based on limitation of the refracting array.
  20. The device of claim 18, wherein the imaging data is a video, and the image distance of the pixel image of the pixel unit is changed with time.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/090325 WO2019232768A1 (en) 2018-06-08 2018-06-08 Devices for displaying 3d image

Publications (1)

Publication Number Publication Date
WO2019232768A1 (en)

Family

ID=68769634

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/090325 WO2019232768A1 (en) 2018-06-08 2018-06-08 Devices for displaying 3d image

Country Status (1)

Country Link
WO (1) WO2019232768A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5790086A (en) * 1995-01-04 1998-08-04 Visualabs Inc. 3-D imaging system
US20120200916A1 (en) * 2007-07-30 2012-08-09 Magnetic Media Holdings Inc. Multi-Stereoscopic Viewing Apparatus
CN107065182A (en) * 2017-03-08 2017-08-18 上海乐蜗信息科技有限公司 A kind of virtual reality optical lens and virtual reality device
CN107884940A (en) * 2017-11-28 2018-04-06 腾讯科技(深圳)有限公司 Display module, head-mounted display apparatus and image stereo display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18921642

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18921642

Country of ref document: EP

Kind code of ref document: A1