CN113640987A - Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method


Info

Publication number
CN113640987A
Authority
CN
China
Prior art keywords
array
microdisplay
micro
image
optical imaging
Prior art date
Legal status
Pending
Application number
CN202010341951.0A
Other languages
Chinese (zh)
Inventor
黄正宇
Current Assignee
Beijing Yilian Technology Co ltd
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN202010341951.0A
Publication of CN113640987A
Status: Pending

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The application discloses a variable-focus optical imaging array system, a near-eye display device and an optical system image projection method. The variable-focus optical imaging array system comprises an optical imaging array assembly together with a control unit and an image processing unit that are coupled to the optical imaging array assembly. The control unit is configured to change the spacing between the microdisplay array and the microlens array; the image processing unit is configured to adjust the content displayed by the microdisplays according to that spacing, so that the image light emitted by each microdisplay is imaged at a predetermined region of the retina after passing through the corresponding aperture and microlens. By adjusting the spacing between the microdisplay array and the microlens array, the variable-focus optical imaging array system enables people of different vision levels to view clear images in near-eye display.

Description

Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method
Technical Field
The present application relates generally to the field of optical display, and more particularly to a variable focus optical imaging array system, a near-to-eye display device, and an optical system image projection method.
Background
With the development of computer technology and display technology, Virtual Reality (VR) technology for experiencing a Virtual world through a computer simulation system, and Augmented Reality (AR) technology and Mixed Reality (MR) technology for fusing display contents into a real environment background have been rapidly developed.
Near-eye VR, AR and MR display technologies, which combine VR, AR and MR with near-eye display, are important emerging display technologies and can bring unprecedented visual experience and human-machine interaction. Near-eye VR display mainly pursues immersive, large-field-of-view virtual display, while near-eye AR and MR display aim at see-through fusion of virtual and real content. In principle, near-eye AR and MR display devices also function as virtual reality devices when ambient light is blocked from entering the user's eyes.
A near-eye display device is generally built in the form of a helmet or glasses. An optical system images the picture shown on a micro display chip at a distance, so that the human eye looking through the device directly sees the enlarged image in the distance. Combined with SLAM for spatial perception and positioning, and with gesture recognition, voice recognition and eye tracking for interaction, it is a novel display technology with significant potential commercial value that is widely expected to eventually replace the smartphone.
Generally, when an object is viewed, the entire pupil of the eyeball acts as the light-receiving surface. Light rays (image light) from the viewed object enter the pupil, are refracted and converged, and fall on the retina to form a corresponding image. For a person with normal vision, the convergence point of the image light refracted by the pupil lies on the retina, so the object is seen clearly. For a myopic person, however, the convergence point lies in front of the retina; by the time the light has passed the convergence point and reaches the retina it has spread from a point into a blur spot, so the image is blurred and the viewed object appears unclear. For a hyperopic person, the convergence point lies behind the retina, and the light strikes the retina as a blur spot before the convergence point is actually formed, likewise causing blurred vision.
If a small hole is placed between the viewed object and the pupil, the size of the image-light beam admitted to the pupil is reduced by the hole, and the convergence angle after refraction by the pupil is reduced accordingly. In this case, for a myopic/hyperopic person the convergence point of the image light still lies in front of/behind the retina, but because of the reduced convergence angle the blur spot formed on the retina becomes smaller, so the viewed object looks relatively sharp. That is, by placing a small hole in the optical path to the viewed object, a myopic/hyperopic person can see a clearer image than without the hole; and the smaller the diameter of the hole, the sharper the viewed image.
However, the small hole blocks most of the image light from the viewed object, so the light entering the pupil is greatly reduced, the image becomes dark and its brightness insufficient, which harms the viewing effect; and the smaller the diameter of the hole, the lower the brightness of the viewed image. It follows that when a small hole is used to improve the visual effect for a myopic/hyperopic person, there is a conflict between the sharpness and the brightness of the image.
On the other hand, in near-eye display the viewed object is so close that, for a person with normal vision, myopia or hyperopia alike, the image light refracted by the pupil converges behind the retina and strikes the retina as a blur spot before the convergence point is actually formed, so the object cannot be seen clearly. As before, placing a small hole between the viewed object and the pupil shrinks the blur spot formed on the retina by the restricted image light, so the sharpness can be improved to some extent. However, the same difficulty of reconciling sharpness and brightness arises: in near-eye display, to shrink the blur spot formed by the image light on the retina to the recognition limit of the human eye, i.e. to approach a point, the hole must be made so small that the brightness of the image inevitably becomes too low to view.
There is therefore an urgent need for an optical assembly that uses small holes yet achieves both image sharpness and brightness, improves the visual effect of people of different vision levels in near-eye display, and has a light and compact structure.
Disclosure of Invention
It is an object of the present application to provide a variable focus optical imaging array system, and a near-eye display device and an optical system image projection method based on the variable focus optical imaging array system, which at least partially solve the above problems in the prior art.
According to an aspect of the present application, there is provided a variable focus optical imaging array system, comprising:
the optical imaging array assembly comprises a micro display array, an aperture array and a micro lens array, wherein the aperture array and the micro lens array are positioned at the downstream of the micro display array along the image light transmission direction, the aperture array and the micro lens array are adjacently arranged, the micro display array, the aperture array and the micro lens array respectively comprise a plurality of micro displays, apertures and micro lenses which are the same in number, the micro displays, the apertures and the micro lenses are in one-to-one correspondence to form corresponding light channels, and the micro displays can emit image light;
a control unit coupled to the optical imaging array assembly and configured to change a spacing between a microdisplay array and a microlens array to accommodate people of different vision levels; and
an image processing unit coupled to the optical imaging array assembly and configured to adjust the content displayed by the microdisplays according to the spacing between the microdisplay array and the microlens array, so that the image light emitted by each microdisplay is blocked by the corresponding aperture, modulated by the microlens, and then enters the pupil to be imaged at a predetermined region of the retina.
In one embodiment of the present application, the microdisplays in the microdisplay array respectively display a plurality of sub-image sources obtained by disassembling an image source, wherein the image light emitted by each microdisplay displaying its corresponding sub-image source is blocked by the corresponding aperture, modulated by the microlens and then enters the pupil, forming a sub-image at a predetermined region of the retina, and the sub-images together constitute an image corresponding to the image source.
In one embodiment of the present application, the image processing unit is configured to change the disassembling manner of the image source according to a change in the spacing between the microdisplay array and the microlens array, and to display the newly obtained sub-image sources in the corresponding microdisplays, so that the sub-images formed by these sub-image sources still constitute an image corresponding to the image source.
In one embodiment of the present application, the variable-focus optical imaging array system is configured such that the image light from all the sub-image sources disassembled from the image source forms a continuous distribution of incidence angles within the field-of-view range with the pupil as the vertex.
In one embodiment of the present application, the control unit is coupled to a microdisplay array, and changes its spacing from a microlens array by controlling the microdisplay array to move;
or the control unit is coupled with the micro-lens array and changes the distance between the micro-lens array and the micro-display array by controlling the micro-lens array to move;
or the control unit is respectively coupled with the microdisplay array and the microlens array, and the distance between the microdisplay array and the microlens array is changed by controlling the microdisplay array and the microlens array to move.
In one embodiment of the present application, the control unit includes a screw threadedly coupled to a microdisplay array and/or a microlens array, the microdisplay array and/or the microlens array being moved by rotation of the screw;
or the control unit comprises a micro motor coupled with the micro display array and/or the micro lens array, and the micro motor drives the micro display array and/or the micro lens array to move.
In one embodiment of the application, the image processing unit comprises a measurement module, a calculation module and a disassembly module, wherein,
the measurement module is configured to measure a spacing between the microdisplay array and the microlens array;
the calculation module is configured to convert the spacing data of the microdisplay array and the microlens array into an array that defines the display regions in the microdisplays of the microdisplay array;
the disassembly module is configured to disassemble an image source into sub-image sources according to the array defining the display regions in the microdisplays of the microdisplay array.
In one embodiment of the application, the measurement module is configured to have one of the microdisplay array and the microlens array emit a light beam toward the other and receive the beam reflected back by it, and to measure the spacing between the microdisplay array and the microlens array by detecting either the time from emission to reception of the beam or the phase difference between the emitted and reflected light, wherein the light beam is laser light or infrared light.
In one embodiment of the application, the calculation module is configured to receive the microdisplay array to microlens array spacing data from the measurement module and to computationally convert it into an array of positions and sizes of regions to be displayed in the microdisplays of the microdisplay array.
In one embodiment of the application, the disassembly module is configured to receive from the calculation module the array of positions and sizes of the regions to be displayed in the microdisplays of the microdisplay array, and accordingly to disassemble the image source into a plurality of portions with predetermined intervals, wherein each portion contains the color information and/or luminance information of the corresponding pixels, thereby forming a plurality of sub-image sources.
In one embodiment of the present application, the disassembly module is coupled to the microdisplay array, and the sub-image sources obtained by disassembling the image source are displayed in the microdisplays of the microdisplay array.
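To make the division of labor among the three modules concrete, the sketch below outlines them in Python. It is illustrative only: all names are hypothetical, and the mapping from spacing to region size is a placeholder, since the patent only specifies that the display regions are recomputed whenever the spacing changes.

```python
# Illustrative sketch (hypothetical names) of the image processing unit:
# measurement module -> calculation module -> disassembly module.
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class Region:
    origin: Tuple[int, int]  # top-left pixel of the active area on the display device
    size: Tuple[int, int]    # (height, width) of the active area in pixels


def measure_spacing_mm() -> float:
    """Measurement module: return the microdisplay-to-microlens spacing in mm
    (e.g. from a time-of-flight or phase-difference reading)."""
    return 3.0  # placeholder sensor reading


def compute_regions(spacing_mm: float, rows: int, cols: int,
                    panel_px: Tuple[int, int]) -> List[Region]:
    """Calculation module: convert the measured spacing into an array of
    display regions, one per microdisplay.  The fill-factor rule below is a
    placeholder; the patent only states that positions and sizes are
    recomputed from the spacing."""
    h, w = panel_px
    cell_h, cell_w = h // rows, w // cols
    fill = min(0.6, max(0.2, 3.0 / spacing_mm))  # placeholder relation
    size = (int(cell_h * fill), int(cell_w * fill))
    return [Region((r * cell_h + (cell_h - size[0]) // 2,
                    c * cell_w + (cell_w - size[1]) // 2), size)
            for r in range(rows) for c in range(cols)]


def disassemble(image: np.ndarray, rows: int, cols: int) -> List[np.ndarray]:
    """Disassembly module: split the image source into rows x cols sub-image
    sources, one per microdisplay, each keeping the color/brightness of its
    pixels."""
    return [tile for band in np.array_split(image, rows, axis=0)
            for tile in np.array_split(band, cols, axis=1)]
```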
In one embodiment of the present application, the spacing between the microdisplay array and the microlens array along the direction perpendicular to the microdisplay array can be changed within the range f ± Δf, wherein f denotes the focal length of the microlenses in the microlens array and lies in the range of 1-20 mm, and Δf satisfies 0 < Δf < 0.5f.
In one embodiment of the present application, the distance between the microdisplay array and the microlens array in a direction perpendicular to both is in the range of 0.5-30 mm.
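As a small illustration of these two range constraints (hypothetical helper; f, Δf and the candidate spacing d all in millimetres):

```python
def spacing_is_valid(d_mm: float, f_mm: float, delta_f_mm: float) -> bool:
    """Check a candidate microdisplay-to-microlens spacing d against the
    stated ranges: f in 1-20 mm, 0 < delta_f < 0.5 * f, d within f +/- delta_f,
    and d within 0.5-30 mm in absolute terms."""
    if not 1.0 <= f_mm <= 20.0:
        return False
    if not 0.0 < delta_f_mm < 0.5 * f_mm:
        return False
    return (f_mm - delta_f_mm) <= d_mm <= (f_mm + delta_f_mm) and 0.5 <= d_mm <= 30.0


print(spacing_is_valid(3.2, f_mm=3.0, delta_f_mm=0.5))  # True
print(spacing_is_valid(4.0, f_mm=3.0, delta_f_mm=0.5))  # False: outside f +/- delta_f
```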
In one embodiment of the present application, in the optical imaging array assembly, the array of apertures is located upstream or downstream of the array of microlenses in the direction of image light propagation.
In one embodiment of the present application, in the optical imaging array assembly, the microdisplay array is located at or near the focal plane of the microlens array, so that the image light emitted by a microdisplay is blocked by the corresponding aperture and modulated by the microlens into a parallel or approximately parallel beam;
or the microdisplay array is located inside the focal plane of the microlens array (closer than one focal length), so that the image light emitted by a microdisplay is blocked by the corresponding aperture and modulated by the microlens into a beam that is divergent relative to a parallel beam;
or the microdisplay array is located beyond the focal plane of the microlens array (farther than one focal length), so that the image light emitted by a microdisplay is blocked by the corresponding aperture and modulated by the microlens into a beam that is convergent relative to a parallel beam.
In one embodiment of the present application, in the optical imaging array assembly, the microdisplay array includes a microdisplay device integrally covered with pixels, the microdisplay device being configured to control pixels in a designated area of the microdisplay device to emit light and pixels in other areas to not emit light according to an instruction, wherein the area emitting light corresponds to the microdisplay.
In one embodiment of the present application, in the optical imaging array assembly, a plurality of microdisplays in a microdisplay array are arranged in a rectangle along a first direction and a second direction perpendicular to the first direction.
In one embodiment of the application, in the optical imaging array assembly, the distance between the aperture array and the center of the pupil along the direction perpendicular to the aperture array is in the range of 5-30 mm;
in the optical imaging array assembly, the total number of the micro-displays included in the micro-display array is more than or equal to 2, and the number of the micro-displays included in the micro-display array in the first direction or the second direction is within the range of 1-10.
In one embodiment of the present application, in the optical imaging array assembly, the ratio of the diameter of the pupil to the crosstalk-safe distance is in the range of 0.2-0.6, wherein the crosstalk-safe distance denotes the distance between the center of one aperture and the center of an adjacent aperture along the first direction or the second direction;
in the optical imaging array assembly, the ratio of the length of a microdisplay to the midpoint spacing of adjacent microdisplays in the first or second direction is in the range of 0.2-0.6.
In one embodiment of the present application, a physical boundary is provided between the optical channels in the optical imaging array assembly to mitigate cross-talk.
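The geometric ranges listed in the last few embodiments can be gathered into a single design check. The sketch below is illustrative only; parameter names are hypothetical, and the crosstalk-safe distance is taken, as above, as the centre-to-centre spacing of adjacent apertures.

```python
def geometry_in_range(aperture_to_pupil_mm: float,
                      total_microdisplays: int,
                      microdisplays_per_direction: int,
                      pupil_diameter_mm: float,
                      crosstalk_safe_distance_mm: float,
                      microdisplay_length_mm: float,
                      microdisplay_pitch_mm: float) -> bool:
    """Return True when a candidate design satisfies the ranges stated above."""
    return (5.0 <= aperture_to_pupil_mm <= 30.0
            and total_microdisplays >= 2
            and 1 <= microdisplays_per_direction <= 10
            and 0.2 <= pupil_diameter_mm / crosstalk_safe_distance_mm <= 0.6
            and 0.2 <= microdisplay_length_mm / microdisplay_pitch_mm <= 0.6)


# Example: 3x3 array, 15 mm aperture-to-pupil distance, 4 mm pupil over a
# 10 mm aperture pitch, 2 mm microdisplays on a 5 mm pitch.
print(geometry_in_range(15.0, 9, 3, 4.0, 10.0, 2.0, 5.0))  # True
```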
According to another aspect of the present application, there is provided a near-eye display device comprising a variable focus optical imaging array system as described above.
In one embodiment of the present application, the near-eye display device comprises two of the variable focus optical imaging array systems for displaying respective images for the left and right eyes of a user.
In a preferred embodiment of the present application, the two variable focus optical imaging array systems independently perform the imaging process for the level of vision of the user's left and right eyes, respectively.
According to still another aspect of the present application, there is provided an optical system image projection method including the steps of:
s1: providing a micro display array, and an aperture array and a micro lens array which are positioned at the downstream of the micro display array along the image light transmission direction, wherein the aperture array and the micro lens array are adjacently arranged, the micro display array, the aperture array and the micro lens array respectively comprise a plurality of micro displays, apertures and micro lenses which are the same in number, and the micro displays, the apertures and the micro lenses are in one-to-one correspondence to form corresponding light channels;
s2: monitoring the distance between the micro display array and the micro lens array;
s3: according to the distance between the micro display array and the micro lens array, the image source is disassembled into a plurality of sub-image sources, and one sub-image source is displayed in each micro display;
s4: making the image light which is emitted by each micro display and displays the corresponding sub-image source enter the pupil after being shielded by the corresponding small hole and modulated by the micro lens, and forming sub-images at the preset area of the retina respectively;
s5: the sub-images are made to constitute an image corresponding to the image source.
In one embodiment of the present application, the method further comprises the steps of:
s6: changing the distance between the micro display array and the micro lens array;
s7: when the change of the distance between the micro display array and the micro lens array is monitored, the disassembling mode of the image source is changed, and the sub images formed by the plurality of sub image sources are obtained again to form the image corresponding to the image source.
In one embodiment of the present application, the steps S2 and S7 each include:
a method for measuring the distance between a micro display array and a micro lens array includes emitting a light beam to one of the micro display array and the micro lens array by the other and receiving the light beam reflected by the latter, and measuring the distance between the micro display array and the micro lens array by detecting the time from the emission to the reception of the light beam or the phase difference of the emitted light and the reflected light, wherein the light beam is selected from laser light or infrared light.
In one embodiment of the present application, the steps S3 and S7 each include:
converting the spacing data of the microdisplay array and the microlens array into an array defining the display regions in the microdisplays of the microdisplay array;
according to the array defining the display regions in the microdisplays of the microdisplay array, disassembling the image source into a plurality of portions with predetermined intervals, wherein each portion contains the color information and/or luminance information of the corresponding pixels, thereby forming a plurality of sub-image sources.
In one embodiment of the present application, the step S6 includes:
the microdisplay array and/or microlens array is moved by rotation of a screw in threaded connection with the microdisplay array and/or microlens array, or by driving of a micro-motor coupled to the microdisplay array and/or microlens array.
In one embodiment of the application, the method is implemented by a variable focus optical imaging array system or a near-eye display device as described above.
According to the optical imaging array assembly of the present application, by disassembling the image source and recombining the image with the aperture and microlens array structure, brightness can be improved while imaging sharpness in near-eye display is ensured, so that people of different vision levels can all see a clear image; in addition, the size of the optical module can be significantly reduced.
In the variable-focus optical imaging array system of the present application, the spacing between the microdisplay array and the microlens array in the optical imaging array assembly is made adjustable, and the image source is disassembled in a matching, predetermined manner. The degree of divergence/convergence, relative to a parallel beam, of the image light incident on the pupil can therefore be adjusted continuously, so that the needs of people of different vision levels can all be met and the system offers improved applicability and convenience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 compares the imaging process in near-eye display using different optics, where (A) is the imaging process using a pinhole alone and (B) is the imaging process using a pinhole + microlens combination;
FIG. 2 shows a comparison of imaging processes for persons of different vision levels in a near-to-eye display, where (A) is the imaging process for a normal-sighted person, (B) is the imaging process for a myopic person, and (C) is the imaging process for a hyperopic person, in accordance with the present application;
FIG. 3 illustrates a three-dimensional structure of an optical imaging array assembly according to one embodiment of the present application;
FIG. 4 illustrates specific ways of placing the aperture array and the microlens array adjacent to each other according to the present application, wherein in (A) the microlens is attached at the edge of the aperture, in (B) the microlens is held in a cavity that is placed as a whole adjacent to the aperture array, and in (C) the microlens is embedded in the aperture;
FIG. 5 shows a front structure of a microdisplay array according to one embodiment of the application;
FIG. 6 illustrates the process of image source disassembly and combined imaging according to an embodiment of the present application, wherein (A) shows the disassembly of an image source into sub-image sources and (B) shows the composition of an image from the sub-images;
FIG. 7 shows a schematic optical path diagram of an optical imaging array assembly forming an image from an image source according to one embodiment of the present application;
FIG. 8 shows the decomposition of the field of view for imaging on the retina by an optical imaging array assembly according to one embodiment of the present application;
FIG. 9 shows structural parameters of an optical imaging array assembly according to an embodiment of the present application;
FIG. 10 shows the structure of an image processing unit according to one embodiment of the present application, and the manner in which the image processing unit is connected to an optical imaging array assembly;
FIG. 11 illustrates a variable focus optical imaging array system in accordance with one embodiment of the present application, wherein a control unit is coupled to a microdisplay array, and the system is used in an imaging process for people with different levels of vision, wherein (A) is used for normal-sighted people, (B) is used for near-sighted people, and (C) is used for far-sighted people; and
FIG. 12 shows a variable focus optical imaging array system in accordance with one embodiment of the present application, wherein the control unit is coupled to the microlens array, and the imaging process of the system is used by persons of different vision levels, wherein (A) is the imaging process used by a person with normal vision, and (B) is the imaging process used by a person with myopia.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In the present application, for viewing an object in near-eye display, a microlens is arranged adjacent to a small hole while the small hole is used to improve imaging sharpness. The aperture + microlens combination according to the present application has a number of advantages:
first, the use of aperture + microlens combinations gives better compromise between image sharpness and brightness than if only apertures were used. Fig. 1(a) shows an imaging process for viewing an object in a near-eye display using only a small hole. As shown in FIG. 1(A), an image light emitted from an object point O is blocked by a pinhole CH and then formed as CLaAnd CLbIs a diverging beam of boundary rays that enters the pupil and is refracted by the pupil to converge. In near-eye display, the convergence point of the refracted beam is located behind the retina for a person with normal, near, or far vision, and the refracted beam is incident on the retina and imaged before the convergence point is formed. Referring to FIG. 1(A), boundary ray CLaAnd CLbThe incidence points on the retina are respectively CPaAnd CPbThus, corresponding to the object point O, the image actually seen by the eyeball is the image from the incident point CPaAnd CPbA defined aperture. Fig. 1(B) shows an imaging process for viewing an object in a near-eye display using a pinhole + microlens combination, in which a pinhole H and a microlens M are adjacently disposed, and the aperture of the pinhole H is larger than the pinhole CH in fig. 1 (a). As shown in FIG. 1(B), the image light emitted from the object point O is first blocked by the small hole H and then formed into La' and Lb' is a divergent beam of boundary rays, and since the aperture of the pinhole H is larger than that of the pinhole CH in FIG. 1(A), the aperture formed by the refraction of the beam through the pupil on the retina is correspondingly larger than that formed by the incident point CP in FIG. 1(A)aAnd CPbA defined aperture. I.e. corresponding to the object point O, a diaphragm with reduced sharpness will be seen by only increasing the aperture of the small aperture. However, in fig. 1(B), a microlens M is further provided adjacent to the aperture H so as to be at La' and LbThe light beam of the boundary light rays is deflected inward in the direction of the arrow in the figure, and is formed with L after passing through the microlens MaAnd LbA beam of boundary rays with reduced divergence angle, and refracted by the pupil to form a beam with an incident point P on the retinaaAnd PbA defined aperture of correspondingly reduced size. By comparison, it can be seen that, with the proper arrangement, P is shown in FIG. 1(B)aAnd PbThe defined aperture is compared with the aperture shown by CP in FIG. 1(A)aAnd CPbThe size of the defined aperture may be comparable; and the imaging brightness can be higher than that of fig. 1(a) because the aperture of the small hole is larger in fig. 1 (B). Thus, according to the present application, using the pinhole + microlens combination, the brightness of the image can be improved while ensuring the image clarity of the near-eye display.
Second, compared with using a pinhole alone, the aperture + microlens combination improves imaging sharpness and adapts to different vision levels. Fig. 2(A) shows the imaging process for a person with normal vision in near-eye display using an aperture + microlens combination, in which the pinhole H and the microlens M are arranged adjacently and the microlens M is placed at a distance of exactly one focal length (= f) from the object point O. As shown in Fig. 2(A), the image light emitted from the object point O is blocked by the pinhole H and then enters the microlens M; since the object point O lies at one focal length from the microlens M, the light is modulated into a parallel beam, and for a person with normal vision this parallel beam is refracted by the pupil and converges to a point on the retina, forming a clear image. Fig. 2(B) shows the imaging process for a myopic person in near-eye display using an aperture + microlens combination, where the pinhole H and the microlens M are adjacent and the microlens M is placed at a distance of less than one focal length (< f) from the object point O. For a myopic person, a parallel beam from outside is generally refracted by the pupil and converges in front of the retina, then spreads out again beyond the convergence point and forms a blur spot on the retina. In this case, as shown in Fig. 2(B), the distance between the microlens M and the object point O can be reduced so that the object point O lies within one focal length of the microlens M; the emitted image light, blocked by the pinhole H and modulated by the microlens M, then forms a beam that is somewhat divergent relative to a parallel beam. The convergence point formed after refraction by the pupil is thereby shifted backwards compared with an incident parallel beam, so with a suitable arrangement the convergence point can be moved onto the retina and a clear image is formed. Fig. 2(C) shows the imaging process for a hyperopic person in near-eye display using an aperture + microlens combination, where the pinhole H and the microlens M are adjacent and the microlens M is placed at a distance of more than one focal length (> f) from the object point O. For a hyperopic person, a parallel beam from outside is generally refracted by the pupil and converges behind the retina, striking the retina before the convergence point is actually formed and producing a blur spot there. In this case, as shown in Fig. 2(C), the distance between the microlens M and the object point O can be increased so that the object point O lies beyond one focal length of the microlens M; the emitted image light, blocked by the pinhole H and modulated by the microlens M, then forms a beam that is somewhat convergent relative to a parallel beam. The convergence point formed after refraction by the pupil is thereby shifted forwards compared with an incident parallel beam, so with a suitable arrangement the convergence point can be moved onto the retina and a clear image is formed.
Therefore, according to the present application, by using the aperture + microlens combination and adjusting the structure accordingly, people of different vision levels can all see a clear image in near-eye display.
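The three cases of FIG. 2 follow directly from the thin-lens relation 1/v = 1/f - 1/d, where d is the distance from the object point (on the microdisplay) to the microlens and v is the image distance: d = f gives a collimated beam, d < f a diverging beam (virtual image), and d > f a converging beam. A minimal sketch under the assumption of an ideal thin microlens:

```python
def beam_after_microlens(d_mm: float, f_mm: float, tol: float = 1e-9) -> str:
    """Classify the beam leaving an ideal thin microlens of focal length f for
    a point source at distance d, using 1/v = 1/f - 1/d (all distances in mm,
    object distance taken positive)."""
    if abs(d_mm - f_mm) < tol:
        return "parallel"                 # source in the focal plane -> collimated
    inv_v = 1.0 / f_mm - 1.0 / d_mm
    # inv_v < 0: virtual image (source inside the focal length) -> beam still diverging
    # inv_v > 0: real image (source beyond the focal length)    -> beam converging
    return "diverging" if inv_v < 0 else "converging"


# Example with a 3 mm focal-length microlens:
for d in (2.5, 3.0, 3.5):
    print(d, beam_after_microlens(d, 3.0))
# 2.5 mm -> diverging  (suits a myopic eye,    FIG. 2(B))
# 3.0 mm -> parallel   (normal vision,         FIG. 2(A))
# 3.5 mm -> converging (suits a hyperopic eye, FIG. 2(C))
```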
In addition, compared with a conventional lens, the aperture + microlens combination allows the optical module to be structurally optimized. Generally, the focal length of a lens scales with its aperture: the larger the lens aperture, the longer the focal length. With the aperture + microlens combination, the beam of image light emitted by an image source is greatly reduced in size by the blocking of the small hole; the image light can then be modulated with a microlens instead of a conventional lens. Because the aperture and focal length of a microlens are much smaller than those of a conventional lens, the image source can be placed much closer, and the size (e.g. thickness) of the optical module can be reduced considerably. Using the aperture + microlens combination according to the present application therefore effectively improves the structure of the optical module, which benefits the appearance, weight and user experience of the related products.
Finally, the size of the object that can be viewed through a single aperture + microlens combination is still rather limited. When the target object (e.g. the image source) is large, it can, according to the present application, be disassembled into a plurality of parts; using an array structure provided with a plurality of apertures and a plurality of microlenses, the corresponding part of the target object is imaged through each aperture + microlens combination, and the images are recombined into a complete image corresponding to the target object. The present application is made on the basis of this inventive concept.
In the present application, unless otherwise specified, the expression "plurality" usually means two or more, i.e., a number ≧ 2.
In the present application, for convenience of description and understanding, a simplified equivalent model as shown in the drawings is used instead of the actual structure of the human eye: the dioptric system of the human eye is simplified into a single equivalent lens, and the aperture of this lens is set equal to the pupil diameter of the human eye, so that the pupil can be understood as merged into the single lens without being deliberately distinguished. In this case, for brevity, the single equivalent lens or its relevant parts will sometimes be referred to in this application simply as the "pupil". For example, light passing through the cornea and/or crystalline lens is described as being "refracted by the pupil", and the optical center of the single lens is referred to as the "pupil center", which should be understood appropriately by those skilled in the art according to the specific context.
A first aspect of the present application relates to an optical imaging array assembly. The three-dimensional structure of the optical imaging array assembly is shown in fig. 3.
Referring to fig. 3, the optical imaging array assembly AM includes a microdisplay array 11, an aperture array 12, and a microlens array 13, where the aperture array 12 and the microlens array 13 are disposed adjacent to each other and downstream of the microdisplay array 11 in the image light propagation direction.
The microdisplay array 11 comprises a plurality of microdisplays 110 arranged to form an array. The aperture array 12 comprises a plurality of apertures 120, equal in number to the microdisplays 110 and arranged in the same manner to form an array. The microlens array 13 comprises a plurality of microlenses 130, equal in number to the microdisplays 110 and arranged in the same manner to form an array. The microdisplays 110, apertures 120 and microlenses 130 are in one-to-one correspondence, i.e. each microdisplay 110 forms a light channel with its corresponding aperture 120 and microlens 130.
The microdisplay array 11 displays the image source S as a whole, with each microdisplay 110 displaying one portion of the disassembled image source S, i.e. a sub-image source Si. The image light emitted by a microdisplay 110 displaying the sub-image source Si is blocked by the corresponding aperture 120 and modulated by the microlens 130, then enters the pupil and forms a sub-image Fi at a predetermined region of the retina. The sub-images Fi formed by all the sub-image sources Si together constitute an image F corresponding to the image source S.
According to the present application, there is no particular limitation on the type of microdisplay 110 in the microdisplay array 11. For example, the micro-display 110 may be a self-emitting LED, OLED, or MLED, among others. Alternatively, the micro display 110 may be a DMD, LCoS, or LCD, etc., that does not emit light itself, in which case it would be desirable to provide a light source for external illumination, including but not limited to, for example, a laser light source, an LED light source, or an OLED light source, etc.
According to the present application, there is no particular limitation on the implementation of the orifice array 12 as long as it can suitably provide the orifices 120. On this basis, the size and the specific location distribution of the small holes 120 can be further controlled by optimizing the design. In one embodiment of the present application, the aperture array 12 is implemented in the form of a diaphragm comprising a plurality of apertures 120. In one embodiment of the present application, the array of orifices 12 takes a planar form, wherein all of the orifices 120 are at the same height.
According to the present application, the order of disposing the small hole array 12 and the microlens array 13 is not particularly limited as long as both are disposed adjacently. In one embodiment of the present application, the microdisplay array 11, the aperture array 12, and the microlens array 13 are arranged in this order along the image light propagation direction. In another embodiment of the present application, the microdisplay array 11, the microlens array 13, and the aperture array 12 are arranged in this order along the image light propagation direction. Hereinafter, description will be made only in the case where image light passes through the aperture array 12 and the microlens array 13 in this order.
According to the present application, the mode of adjacently disposing the small hole array 12 and the microlens array 13 is not particularly limited as long as both can be fixed at the corresponding positions. On the basis, the specific positions of the two can be further set through optimization design, so that clear imaging is realized. As a specific embodiment, as shown in fig. 4(a), the micro-lenses 130 may be attached (e.g., pasted) at the edges of the corresponding apertures 120. As an alternative embodiment, as shown in fig. 4(B), the microlenses 130 can be placed in a correspondingly shaped transparent cavity (e.g., a plastic shell) and the cavity can be placed entirely adjacent to the array of apertures 12. Alternatively, as shown in fig. 4(C), the microlenses 130 may also be embedded in the corresponding apertures 120 from upstream or downstream in the image light propagation direction.
According to the present application, when the position of one of the aperture array 12 and the microlens array 13 relative to the microdisplay array 11 is changed, the other should be understood to move along with it; this will not be described again below.
In one embodiment of the present application, the microdisplay array 11 is positioned at or near the focal plane of the microlens array 13 such that the image light emitted by the microdisplay 110 is modulated by its corresponding microlens 130 into a parallel or nearly parallel beam.
In another embodiment of the present application, the microdisplay array 11 is located inside the focal plane of the microlens array 13 (closer than one focal length), so that the image light from a microdisplay 110 is modulated by its corresponding microlens 130 into a beam that is somewhat divergent relative to a parallel beam.
In yet another embodiment of the present application, the microdisplay array 11 is located beyond the focal plane of the microlens array 13 (farther than one focal length), so that the image light emitted by a microdisplay 110 is modulated by its corresponding microlens 130 into a beam that is somewhat convergent relative to a parallel beam.
In one embodiment of the present application, the microdisplays 110 are spaced apart from each other by a spacing region, such as a non-transmissive region, as shown in FIG. 3.
FIG. 5 shows the front structure of a microdisplay array 11 according to one embodiment of the application, which comprises 9 microdisplays 110 arranged in a 3 × 3 rectangle. It will be appreciated by those skilled in the art that the arrangement shown in FIG. 5 is by way of example only. According to the present application, the microdisplays 110 in the microdisplay array 11 may adopt various common arrangements, including a one-dimensional arrangement (for example, all the microdisplays 110 in a single row) as well as a two-dimensional arrangement; the array may be regular (as shown in FIG. 3 and FIG. 5) or irregular, and a person skilled in the art can choose the arrangement freely according to the actual scene and requirements without creative effort. In the present application a rectangular arrangement is preferred. In an a × b rectangular arrangement (where a and b are the numbers of microdisplays in the y-axis and x-axis directions of FIG. 3, respectively), a and b may be the same or different. When the microdisplays 110 are arranged in an a × b rectangle, the image source S is correspondingly disassembled into a × b regions (i.e. sub-image sources Si), each of which is displayed by one of the a × b microdisplays 110 as part of the image source S. With a suitable arrangement, the sub-image source Si displayed by each microdisplay 110 forms a corresponding sub-image Fi at a predetermined position on the retina, and these sub-images Fi compose the complete image F corresponding to the image source S.
FIG. 6 illustrates the process of image source disassembly and combined imaging according to an embodiment of the present application. FIG. 6(A) shows the 9 sub-image sources S1~S9 into which the image source S is disassembled; FIG. 6(B) shows the image F formed on the retina by the optical imaging array assembly of the present application, which is in fact composed of the 9 sub-images F1~F9 (the dashed lines are drawn for illustration only).
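The 3 × 3 disassembly and recombination of FIG. 6 can be mimicked with a few lines of array code (illustrative only; the 9 × 9 array stands in for the image source S):

```python
import numpy as np

src = np.arange(9 * 9).reshape(9, 9)                 # stand-in for the image source S
sub_sources = [tile for band in np.array_split(src, 3, axis=0)
               for tile in np.array_split(band, 3, axis=1)]   # S1 ... S9
# Recombining the nine tiles reproduces the original image, mirroring how the
# nine sub-images F1 ... F9 tile the retina to compose the image F.
recombined = np.block([[sub_sources[r * 3 + c] for c in range(3)] for r in range(3)])
assert np.array_equal(recombined, src)
```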
The microdisplay array 11 and the microdisplays 110 it contains can be implemented in a variety of ways according to the present application. In one embodiment of the present application, the microdisplay array 11 is formed by mounting a plurality of independent microdisplays on a substrate. When the sub-image sources Si into which the image source S is disassembled keep the same number, positions and sizes, microdisplays 110 of the same number, positions and sizes can simply be mounted on the substrate. In most cases, however, in order to accommodate different disassembly schemes of the image source (changes in the number, positions and/or sizes of the Si), the area available for display in the microdisplay array 11 needs to be enlarged, so that the size of each microdisplay 110 is not smaller than the sub-image source Si to be displayed and the pitch of the microdisplays 110 is not larger than the minimum pitch of the sub-image sources Si to be displayed. For this purpose, in a preferred embodiment of the present application, the microdisplay array 11 is implemented with a microdisplay device that is entirely covered with pixels, a designated part of whose area is made active while the remaining area is inactive: the microdisplay array 11 controls the pixels in the designated regions to emit light and the pixels in the other regions not to emit light according to the received instruction, where a light-emitting region corresponds to a microdisplay and a non-emitting region corresponds to a spacing region.
Thus, the description of parameters such as number, position and size of a microdisplay in this application is generally understood to be a description of the corresponding parameters of the sub-image source to which the microdisplay corresponds. Hereinafter, unless otherwise specified, descriptions regarding "microdisplay" and "sub-image source" are not separately distinguished, and for example, the terms "microdisplay size" and "sub-image source size", "microdisplay pitch" and "sub-image source pitch", and "number of microdisplays" and "number of sub-image sources", etc. may be used interchangeably, as will be flexibly understood by those skilled in the art.
The optical process of imaging the optical imaging array assembly of the present application on the retina will be described below. Those skilled in the art will appreciate that the optical imaging process in the z-y coordinate system and the z-x coordinate system shown in FIG. 3 is the same or similar. Hereinafter, for convenience of description, description and limitation will be made only under a two-dimensional plane (z-y coordinate system). In this case, expressions such as "in the z-axis direction" and "in the y-axis direction" may sometimes be omitted for the sake of brevity, which should be flexibly understood by those skilled in the art.
In addition, since the optical imaging principles of the optical imaging array assembly of the present application are analogous in the z-y coordinate system and the z-x coordinate system shown in FIG. 3, where no coordinate system is specified either of the y-axis direction and the x-axis direction in FIG. 3 may be referred to non-specifically as the "first direction", with the other then being the "second direction", which should be understood appropriately by those skilled in the art.
FIG. 7 illustrates the optical path by which an optical imaging array assembly according to one embodiment of the present application forms an image from an image source. As shown in FIG. 7, the optical imaging array assembly AM comprises a microdisplay array 11, an aperture array 12 and a microlens array 13 arranged in this order along the image light propagation direction, wherein the microdisplay array 11 comprises microdisplays 111, 112 and 113 respectively displaying the sub-image sources S1, S2 and S3, the aperture array 12 comprises apertures 121, 122 and 123, and the microlens array 13 comprises microlenses 131, 132 and 133, all in one-to-one correspondence. Taking the microdisplay 111 as an example, one of its pixels first emits a light beam L1; the beam L1 passes through the aperture 121 and is blocked into a beam L2 of reduced size; the beam L2 is then incident on the microlens 131 and is modulated into a parallel or approximately parallel beam L3; the beam L3 then enters the pupil and is refracted onto the K1 region of the retina, where it forms a corresponding image point. In this way, all the pixels of the microdisplay 111 together form a sub-image F1 in the K1 region of the retina. Similarly, the image light emitted by the microdisplay 112 displaying the sub-image source S2 is blocked by the aperture 122 and modulated by the microlens 132, forming a sub-image F2 in the K2 region of the retina; and the image light emitted by the microdisplay 113 displaying the sub-image source S3 is blocked by the aperture 123 and modulated by the microlens 133, forming a sub-image F3 in the K3 region of the retina.
Based on the same principles, the imaging process in the z-x coordinate system is similar to that described above in the z-y coordinate system. Therefore, the optical imaging array assembly can realize the optical process of disassembling the image source and recombining the image on the retina along the y-axis direction and the x-axis direction.
It should be noted here that the embodiment shown in FIG. 7 has the microdisplay array 11 at or near the focal plane of the microlens array 13, so as to suit, for example, a person with normal vision. The present application is not limited to this. According to the present application, the optical imaging array assembly AM may also be arranged with the microdisplay array 11 inside/beyond the focal plane of the microlens array 13, so that the beam L2 in FIG. 7 is modulated by the microlens 131 into a beam that diverges/converges to some extent relative to a parallel beam, and is therefore suited to, for example, a person with myopia/hyperopia. These embodiments fall within the scope of the present application and follow the same principle as the embodiment shown in FIG. 7. Hereinafter, again for the sake of brevity, the description is given only for a person with normal vision; the embodiments for other vision levels can be obtained correspondingly by changing the spacing between the microdisplay array and the microlens array and the degree of divergence/convergence of the modulated beam, as described above.
It will be appreciated by those skilled in the art that in the optical imaging process described above, the boundaries of the individual sub-images formed on the retina need to be suitably interfaced with one another in order to be able to display an image that corresponds exactly to the image source. If the pitch of the sub-images is too small, an overlapping image area may be viewed; if the pitch of the sub-images is too large, an image-free area may be viewed. For this reason, the structure and parameters of the optical imaging array assembly of the present application need to be properly set.
From the perspective of the field of view (FOV), the eyeball viewing a clear and complete image can be understood as the light emitted from the image source forming a continuous distribution of incidence angles within the FOV range with the pupil as the vertex. In the present application, the image source is disassembled into a plurality of sub-image sources, and the light emitted by each sub-image source enters the pupil through the corresponding aperture and microlens, accounting for a part of the total range of incidence angles received by the pupil. That is, according to the present application, each sub-image source disassembled from the image source forms one angular component within the field-of-view range with the pupil as the vertex, and the sum of these angular components constitutes the entire field of view. In this case, it is further required that the image light emitted by all the sub-image sources forms a continuous distribution of incidence angles within the field-of-view range with the pupil as the vertex.
Specifically, within the field-of-view range with the pupil as the vertex, adjacent angular components should butt against each other exactly: there should be neither blank angular components (which would make the sub-image pitch too large, so that an image-free region is seen) nor overlapping angular components (which would make the sub-image pitch too small, so that an overlapping image region is seen). Therefore, in order to view an image that corresponds exactly to the image source, the boundary ray of each angular component of the field of view should coincide with the boundary ray of the adjacent angular component.
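This butting condition can be written as a simple check on the per-channel angular components, each given by its lower and upper boundary-ray angles measured from the eye axis (illustrative sketch; the angles are hypothetical and in degrees):

```python
def field_of_view_is_continuous(components, tol_deg: float = 1e-6) -> bool:
    """components: one (lower, upper) angular pair per light channel, with the
    pupil centre as vertex.  True when adjacent components butt exactly,
    i.e. there is neither a blank nor an overlapping angular region."""
    ordered = sorted(components)
    return all(abs(ordered[i][1] - ordered[i + 1][0]) <= tol_deg
               for i in range(len(ordered) - 1))


# Three channels as in FIG. 8 (hypothetical numbers):
print(field_of_view_is_continuous([(-15.0, -5.0), (-5.0, 5.0), (5.0, 15.0)]))  # True
print(field_of_view_is_continuous([(-15.0, -6.0), (-5.0, 5.0), (5.0, 15.0)]))  # False: 1 degree gap
```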
FIG. 8 illustrates the decomposition of the field of view for imaging on the retina by an optical imaging array assembly according to one embodiment of the present application. As shown in FIG. 8, the optical imaging array assembly AM comprises a microdisplay array 11, an aperture array 12 and a microlens array 13 arranged in this order along the image light propagation direction, the aperture array 12 and the microlens array 13 being arranged adjacently. The microdisplay array 11 comprises microdisplays 111, 112 and 113, the aperture array 12 comprises apertures 121, 122 and 123, and the microlens array 13 comprises microlenses 131, 132 and 133, all in one-to-one correspondence. The image source S is disassembled into 3 sub-image sources S1, S2 and S3, which are displayed in the microdisplays 111, 112 and 113 respectively.
Referring to fig. 8, the image light from the microdisplay 111 passes through the aperture 121 and modulated by the microlens 131 to enter the pupil, where the boundary light rays are L11And L12. From boundary ray L11And L12The angle defined between the main direction of the image light and the eye axis AO is alpha1And boundary ray L11And L12Respectively having an angle with the eye axis AO of
Figure BDA0002468815810000181
And
Figure BDA0002468815810000182
here and in the following, the term "eye axis" may be generally understood as the main or central direction of the line of sight of a human eye when viewing an object.
Also plotted in FIG. 8 is ray L in dashed line11', which passes through the center of the pupil (corresponding to the optical center of the single lens) and is associated with the light ray L11Parallel light rays. L is11' the intersection of the extended line passing through the center of the pupil in the image light propagation direction and the retina is A1At this time due to L11And L11' is a parallel ray, which will converge at a point after passing through the pupil, namely A1And (4) point. Similarly, a ray L drawn with a dotted line12' passing through the center of the pupil and associated with the light ray L12Parallel to the retina at an intersection A of an extension line passing through the pupil center in the image light propagation direction and the retina2Thus, the light L12Through the pupil and converge at A2And (4) point.
Thus, the image light emitted by the microdisplay 111 and bounded by the rays L11 and L12 is imaged equivalently to the light beam bounded by L11' and L12', whose apex is the pupil center, and its angular component within the field angle range with the pupil as the vertex spans from α₁ − θ₁/2 to α₁ + θ₁/2. Moreover, the incidence region on the retina of the image light bounded by L11 and L12 lies between points A1 and A2, where it forms the sub-image F1 corresponding to the sub-image source S1.
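The equivalent-ray construction above reduces retinal imaging to a simple angle-to-position mapping: parallel rays entering the eye at a given angle converge where the ray through the pupil center meets the retina. The short sketch below illustrates this mapping; the pupil-center-to-retina distance and the angles used are assumed illustrative values, since the text does not specify them.

```python
import math

def retinal_offset(angle_deg: float, eye_length_mm: float = 17.0) -> float:
    """Offset on the retina (mm, measured from the point on the eye axis) of the
    convergence point of a bundle of parallel rays entering the eye at angle_deg
    to the eye axis, located via the equivalent ray through the pupil center.
    eye_length_mm is an assumed pupil-center-to-retina distance."""
    return eye_length_mm * math.tan(math.radians(angle_deg))

# Boundary rays of microdisplay 111 at alpha1 +/- theta1/2 land at A1 and A2
# (alpha1 and theta1 below are illustrative values, not values from the text):
alpha1, theta1 = 10.0, 10.0
A1 = retinal_offset(alpha1 + theta1 / 2)
A2 = retinal_offset(alpha1 - theta1 / 2)
print(f"sub-image F1 occupies the retina from {A2:.2f} mm to {A1:.2f} mm off-axis")
```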
Similarly to the above, the image light emitted by the microdisplay 112 is blocked by the aperture 122, modulated by the microlens 132, and enters the pupil; its boundary rays are L21 and L22. The main direction of the image light bounded by L21 and L22 makes an angle α₀ with the eye axis AO, and the boundary rays L21 and L22 make angles with the eye axis AO of α₀ + θ₀/2 and α₀ − θ₀/2, respectively, where θ₀ is the angle subtended between L21 and L22. The rays L21' and L22', drawn as dashed lines, pass through the pupil center and are parallel to the rays L21 and L22, respectively; their extensions through the pupil center along the image light propagation direction intersect the retina at points B1 and B2, so the rays L21 and L22 converge at points B1 and B2, respectively, after passing through the pupil. Thus, the image light emitted by the microdisplay 112 and bounded by L21 and L22 is imaged equivalently to the light beam bounded by L21' and L22', whose apex is the pupil center, and its angular component within the field angle range with the pupil as the vertex spans from α₀ − θ₀/2 to α₀ + θ₀/2. Likewise, the incidence region on the retina of the image light bounded by L21 and L22 lies between points B1 and B2, where it forms the sub-image F2 corresponding to the sub-image source S2.
Similarly, the image light emitted by the microdisplay 113 is blocked by the aperture 123, modulated by the microlens 133, and enters the pupil; its boundary rays are L31 and L32. The main direction of the image light bounded by L31 and L32 makes an angle α₋₁ with the eye axis AO, and the boundary rays L31 and L32 make angles with the eye axis AO of α₋₁ + θ₋₁/2 and α₋₁ − θ₋₁/2, respectively, where θ₋₁ is the angle subtended between L31 and L32. The rays L31' and L32', drawn as dashed lines, pass through the pupil center and are parallel to the rays L31 and L32, respectively; their extensions through the pupil center along the image light propagation direction intersect the retina at points C1 and C2, so the rays L31 and L32 converge at points C1 and C2, respectively, after passing through the pupil. Thus, the image light emitted by the microdisplay 113 and bounded by L31 and L32 is imaged equivalently to the light beam bounded by L31' and L32', whose apex is the pupil center, and its angular component within the field angle range with the pupil as the vertex spans from α₋₁ − θ₋₁/2 to α₋₁ + θ₋₁/2. Likewise, the incidence region on the retina of the image light bounded by L31 and L32 lies between points C1 and C2, where it forms the sub-image F3 corresponding to the sub-image source S3.
As can be seen from fig. 8, the equivalent rays L11', L12', L21', L22', L31' and L32' all pass through the pupil center and together make up the field angle range with the pupil as the vertex. Clearly, in order for the image light emitted by all the sub-image sources disassembled from the image source to form a continuous distribution of incident angles over this field angle range, with adjacent angular components butting exactly, the system should be adjusted so that the equivalent rays L12' and L21' coincide and the equivalent rays L22' and L31' coincide, whereby point A2 coincides with point B1 and point B2 coincides with point C1 on the retina. In this case, the incident beams formed by the image light of the microdisplays 111, 112 and 113 after being blocked by the corresponding apertures and modulated by the corresponding microlenses are together equivalent to a single incident beam with the pupil as its vertex and the equivalent rays L11' and L32' as its boundaries, carrying a continuous distribution of incident angles; and the sub-images F1, F2 and F3 formed in the region between points A1 and C2 on the retina together constitute an image corresponding to the sub-image sources S1, S2 and S3 of the image source, containing neither an image-free region nor a ghost region.
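To make the exact-butting requirement concrete, the following sketch checks a set of angular components, each described by its central angle α and angular width θ, for gaps (image-free bands) or overlaps (ghost bands); the three components and their values are hypothetical, standing in for the channels of fig. 8.

```python
def stitching_gaps(components):
    """components: list of (alpha_deg, theta_deg) per light channel, ordered by
    decreasing alpha. Returns the angular gap between each pair of adjacent
    components: 0 means exact butting, > 0 a blank (image-free) band,
    < 0 an overlapping (ghost) band."""
    gaps = []
    for (a_hi, t_hi), (a_lo, t_lo) in zip(components, components[1:]):
        lower_edge_of_upper = a_hi - t_hi / 2
        upper_edge_of_lower = a_lo + t_lo / 2
        gaps.append(lower_edge_of_upper - upper_edge_of_lower)
    return gaps

# Three channels with assumed 10-degree components centered at +10, 0 and -10
# degrees butt exactly: both gaps are zero.
print(stitching_gaps([(10.0, 10.0), (0.0, 10.0), (-10.0, 10.0)]))  # [0.0, 0.0]
```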
In the embodiment shown in fig. 8, the optical imaging array assembly of the present application includes 3 microdisplays with corresponding apertures and microlenses and has a symmetric structure. However, those skilled in the art will appreciate that this structure is merely a simplified illustration for ease of description. On this basis, the number of microdisplays (and corresponding apertures and microlenses) is not limited to 3; a greater or smaller number may be included. Moreover, the optical imaging array assembly of the present application need not be symmetric in structure; for example, the sizes and pitches of the individual microdisplays and apertures may be the same or different. Of course, from viewpoints such as structural regularity and processing efficiency, it may be preferable to set the sizes and pitches of the individual microdisplays and apertures to be the same. In any case, simple modifications and local adjustments based on the basic structure described above fall within the scope of the present application.
From the above description of image source disassembly and combined imaging, those skilled in the art will understand that, in order to realize the desired optical path, that is, to have the image light emitted by all the sub-image sources disassembled from the image source form a continuous distribution of incident angles over the field angle range with the pupil as the vertex, the structure and parameters of the optical imaging array assembly of the present application must be set appropriately.
FIG. 9 shows structural parameters of an optical imaging array assembly according to one embodiment of the present application. As shown in fig. 9, the optical imaging array module AM includes a microdisplay array 11, an aperture array 12, and a microlens array 13, which are sequentially arranged along the image light propagation direction, wherein the microdisplay array 11 includes microdisplays 111, 112, and 113, the aperture array 12 includes apertures 121, 122, and 123, and the microlens array 13 includes microlenses 131, 132, and 133, which correspond one to one and each form a light channel.
The meanings of the structural parameters indicated in fig. 9 are as follows: D1 denotes the length of a microdisplay along the y-axis (hereinafter, the microdisplay size); P1 denotes the distance between the midpoints of adjacent microdisplays along the y-axis (hereinafter, the microdisplay pitch); P0 denotes the distance between the midpoints of adjacent apertures along the y-axis (hereinafter, the aperture pitch); LDA denotes the distance between the microdisplay array and the aperture array (between a microdisplay and its aperture) along the z-axis (hereinafter, the microdisplay-aperture spacing); L0 denotes the distance between the aperture array and the pupil center along the z-axis (hereinafter, the aperture-pupil distance); DE denotes the pupil diameter; and DC denotes the distance between the pupil center and another pupil center along the y-axis (hereinafter, the crosstalk safe distance).
The number of microdisplays (and corresponding apertures and microlenses) in the y-axis direction is defined as N. Here, since the microdisplays, apertures, and microlenses are in one-to-one correspondence in this application and are equal in number, the expression "number N of microdisplays (and corresponding apertures and microlenses)" will sometimes be abbreviated, for brevity, as "number N of microdisplays", "number N of apertures", "number N of microlenses", and the like, all of which have the same or equivalent meaning.
Also, as previously mentioned, this application does not particularly distinguish between "microdisplay" and "sub-image source". Thus, D1 can also be understood as the length of a sub-image source along the y-axis (the sub-image source size for short); P1 can also be understood as the distance between the midpoints of adjacent sub-image sources along the y-axis (the sub-image source pitch for short); and N can also be understood as the number of sub-image sources in the y-axis direction.
Further, as described previously, in the present application the aperture array and the microlens array are disposed adjacently without any restriction on their order, and when the position of one of the two is changed, the other naturally changes position with it. Therefore, the present application does not particularly distinguish between the spacing of the microdisplay array from the aperture array and from the microlens array, and LDA can also be understood as the distance between the microdisplay array and the microlens array (between a microdisplay and its microlens) along the z-axis (the microdisplay-microlens spacing for short).
First, in setting the structure and parameters of the optical imaging array assembly of the present application, care should be taken to prevent undesirable optical path problems such as crosstalk. Referring to fig. 9 and taking the microdisplay 112 as an example: according to the present application, the microdisplay 112 forms a light channel with the aperture 122 and the microlens 132, and the image light of the sub-image source displayed by this microdisplay is blocked by the aperture 122, modulated by the microlens 132, enters the pupil, and forms a sub-image on the retina. In practice, because the image light emitted by the microdisplay 112 is divergent, it may be incident not only through the aperture 122 and the microlens 132 but also through other apertures and microlenses, for example forming an undesired optical path through the aperture 123 and the microlens 133, as illustrated in fig. 9. When image light emitted by the same microdisplay is formed into multiple incident beams by multiple apertures and microlenses and these beams enter the same pupil simultaneously, crosstalk occurs, causing superposed images. To prevent this problem, it must be ensured that, of the image light emitted by the microdisplay 112, only the beam formed by the aperture 122 and the microlens 132 can enter the pupil, while the beams formed by the other apertures and microlenses cannot.
Regarding the aperture-pupil distance L0, the present application places no particular limit on its numerical range. However, when the optical imaging array assembly of the present application is applied to, for example, the near-eye display field, an excessively large L0 is of no practical significance. For practical purposes, in one embodiment of the present application, L0 is in the range of 5 to 30 mm, preferably in the range of 10 to 20 mm.
Regarding the pupil diameter DE, the present application places no particular limit on its numerical range. Broadly speaking, the diameter DE of a person's pupil can be in the range of 2 to 5 mm. In one embodiment of the present application, representing a fairly typical case, DE is about 3 mm.
Regarding the microdisplay-aperture spacing LDA, in theory any change in it can always be balanced by a corresponding adjustment of other parameters (for example the microdisplay size D1 and the microdisplay pitch P1): when LDA increases or decreases, D1 and P1 can be increased or decreased accordingly to maintain the imaging. However, on the one hand, LDA is a parameter that directly affects the thickness of the optical module, so the smaller it is the better, and D1 would then also be set correspondingly as small as possible; but this is limited by the single-pixel size R0 of the display, because when D1 becomes too small the resolution of the sub-image source drops and the imaging quality deteriorates. On the other hand, if LDA is set too large, the thickness of the optical module increases accordingly, resulting in an oversized product. In view of this, for practical reasons, in one embodiment of the present application, LDA is in the range of 0.5 to 30 mm, preferably in the range of 1.4 to 13 mm.
Regarding the microdisplay size D1, it should be set within a reasonable numerical range. With the other conditions given, if D1 is too large, either the incident beam formed by the image light of the microdisplay, after being blocked by the corresponding aperture and modulated by the microlens, cannot be completely received by the pupil, or the microdisplay-aperture spacing LDA has to be increased excessively and the overall thickness of the optical module becomes too large; whereas if D1 is too small, as described above, the reduced resolution of the sub-image source degrades the imaging quality. Further, from the basic geometrical relationship shown in fig. 9, the following relationship can be obtained:

D1 / DE = LDA / L0        (1)

In formula (1), DE and L0 themselves have relatively definite numerical ranges; combined with the limits on LDA described above, the numerical range of D1 can be defined accordingly. In one embodiment of the present application, D1 is in the range of 0.5 to 5 mm, preferably in the range of 1 to 3 mm.
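For orientation only, the following one-line helper evaluates relation (1) at the ends of the LDA and L0 ranges given above, with DE taken as the typical 3 mm; it is a numerical illustration, not a design rule beyond what the text states.

```python
def microdisplay_size(d_e_mm: float, l_da_mm: float, l_0_mm: float) -> float:
    """Microdisplay size D1 from relation (1): D1 / DE = LDA / L0."""
    return d_e_mm * l_da_mm / l_0_mm

# DE ~ 3 mm with the LDA and L0 ranges given in this description:
print(microdisplay_size(3.0, 1.4, 20.0))   # ~0.21 mm (small LDA, large L0)
print(microdisplay_size(3.0, 13.0, 10.0))  # ~3.9 mm  (large LDA, small L0)
```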
In the optical imaging array assembly of the present application, as described above, the microdisplay array includes a total of at least 2 microdisplays. Under this premise, the number of microdisplays included in the microdisplay array along either of the y-axis and x-axis directions in fig. 3 is at least 1, i.e. the lower limit of the numerical range of the number N of microdisplays is 1. On this basis, for a given image source, the number N of microdisplays is qualitatively in inverse proportion to the microdisplay size D1. Therefore, if N is set too small, D1 becomes too large, so that, as described above, the pupil cannot receive the entire incident beam or the optical module becomes too thick, adversely affecting the appearance, weight, and so on of the product. Conversely, if N is set too large, D1 becomes too small, the resolution of the sub-image source becomes too low, and the imaging quality suffers. In practical applications, too low a resolution is clearly undesirable and of limited value. For practical purposes, in one embodiment of the present application, 640 × 480 is taken as the lowest resolution of a sub-image source and is taken into account when determining the value range of N. Thus, in one embodiment of the present application, under the condition that the total number of microdisplays is at least 2, the number N of microdisplays included in the microdisplay array in the first direction or the second direction is in the range of 1 to 10, preferably in the range of 2 to 5.
Regarding the crosstalk safe distance DC, it can be understood as the minimum beam spacing at which the incident beams formed by image light from the same microdisplay via adjacent apertures and microlenses do not simultaneously strike the same pupil, i.e. as a measure of the absence of crosstalk. Fig. 9 shows the actual eyeball E, which views the image formed from the image source with the optical imaging array assembly of the present application, and an imaginary eyeball E' located near the actual eyeball E in the y-axis direction. Taking the microdisplay 112 as an example, part of the image light it emits enters the pupil of the actual eyeball E via the aperture 122 and the microlens 132, while another part enters the pupil of the imaginary eyeball E' via the aperture 123 and the microlens 133. DC can therefore be equated with the minimum distance between the pupil center of the actual eyeball E and the pupil center of the imaginary eyeball E'. In this case, those skilled in the art will understand that, in theory, the above condition is satisfied as long as DC is not smaller than DE. In practical applications, however, setting DC somewhat larger than DE may be necessary to prevent optical path overlap problems such as crosstalk more reliably.
Therefore, on the premise that DC is not smaller than DE, the present application in principle does not limit the lower end of the numerical range of DC. For practical reasons, in one embodiment of the present application, the lower limit of the numerical range of DC is selected from 5 mm, 5.5 mm, 6 mm, 6.5 mm and 7 mm, preferably 6 mm.
Expressed differently, the present application in principle does not restrict the upper limit of the numerical range of DE/DC, as long as it is ≤ 1. For practical reasons, in one embodiment of the present application, DE/DC is 0.6 or less, preferably 0.5 or less.
On the other hand, too small a value of DE/DC is also unsuitable. From the basic geometrical relationships shown in fig. 9, the following relationships can also be obtained:

P1 = P0 · (L0 + LDA) / L0        (2)

DC = P0 · (L0 + LDA) / LDA        (3)

Combining equation (2) and equation (3) with equation (1) yields the following relationship:

DE / DC = D1 / P1        (4)
as can be seen from formula (4), DE/DCAnd D1/P1Are equal in value. D1/P1As a ratio of microdisplay size to microdisplay pitch, it can reflect the ratio of light-emitting to non-light-emitting regions in a microdisplay array, and thus can be a measure of the luminous efficiency of a microdisplay array. In general, it is advantageous for the luminous efficiency of the microdisplay array to be as high as possible within a reasonable range, while a too low luminous efficiency is undesirable. At the same time, this also explains DE/DCThe reason why the value of (b) is not set too small is not suitable.
Thus, the present application in principle does not limit the lower end of the numerical range of D1/P1. For practical reasons, however, in one embodiment of the present application, D1/P1 is 0.2 or more, preferably 0.4 or more.
Accordingly, in one embodiment of the present application, DE/DC is 0.2 or more, preferably 0.4 or more.
Accordingly, in one embodiment of the present application, D1/P1 is 0.6 or less, preferably 0.5 or less.
Regarding the aperture pitch P0, it should be set within a reasonable numerical range. With the other conditions given, if P0 is too small, optical path overlap problems such as crosstalk arise relatively easily; whereas if P0 is too large, the pitch of the sub-images formed on the retina by the sub-image sources may become too large, so that image-free regions are seen. Further, according to formula (3), since L0 itself has a relatively definite numerical range, combined with the limits on LDA and DC described above, the numerical range of P0 can be defined accordingly. In one embodiment of the present application, P0 is in the range of 1 to 5 mm, preferably in the range of 2 to 4 mm.
In addition, the diameter of the apertures should also be set within a reasonable numerical range. With the other conditions given, if the aperture diameter is too small, the light intensity of the beam formed by the image light of the microdisplay after being blocked by the aperture is too low, the image becomes too dark, and viewing is impaired; whereas if the aperture diameter is too large, the effect of the aperture in reducing the beam size of the image light emitted by the microdisplay is weakened, and the size (for example the thickness) of the optical module increases because the aperture and focal length of the microlens must increase correspondingly. According to the present application, the aperture diameter should in principle be chosen as small as possible while the brightness of the image is ensured. In view of the brightness levels of currently commercially available microdisplay devices, for practical reasons, in one embodiment of the present application, the diameter of the apertures is in the range of 0.2 to 3 mm, preferably in the range of 0.5 to 2 mm.
Although, for ease of understanding, the principal structural parameters of the optical imaging array assembly of the present application have been discussed above one by one, those skilled in the art will appreciate that these structural parameters interact and constrain one another as a whole system, and that determining and adjusting their value ranges involves overall coordination of the entire system rather than being achievable through limited variation of individual parameters alone.
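As an illustration of this coupling, the sketch below derives D1, P1 and DC from DE, L0, LDA and P0 using relations (1) to (3), and confirms that relation (4) then holds automatically. The numerical inputs are arbitrary example values chosen inside the ranges stated above, not values taken from the disclosure.

```python
def derive_parameters(d_e: float, l_0: float, l_da: float, p_0: float) -> dict:
    """Derive the remaining structural quantities (all lengths in mm) from the
    geometric relations (1)-(3): microdisplay size D1, microdisplay pitch P1 and
    crosstalk safe distance DC, plus the two ratios compared in relation (4)."""
    d_1 = d_e * l_da / l_0                 # (1)
    p_1 = p_0 * (l_0 + l_da) / l_0         # (2)
    d_c = p_0 * (l_0 + l_da) / l_da        # (3)
    return {"D1": d_1, "P1": p_1, "DC": d_c,
            "DE/DC": d_e / d_c, "D1/P1": d_1 / p_1}

params = derive_parameters(d_e=3.0, l_0=15.0, l_da=5.0, p_0=3.0)
print(params)   # D1 = 1.0, P1 = 4.0, DC = 12.0, both ratios = 0.25
# Relation (4) holds by construction: DE/DC == D1/P1.
assert abs(params["DE/DC"] - params["D1/P1"]) < 1e-9
```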
Further, as previously described, according to the present application, each microdisplay in the optical imaging array assembly forms a light channel with its corresponding aperture and microlens; however, if image light emitted by a microdisplay is incident on an aperture and a microlens that are not aligned with that microdisplay, optical path overlap problems such as crosstalk may occur. To eliminate or at least mitigate such problems, in addition to the structure and parameter settings described above, in one embodiment of the present application physical boundaries may be formed between the individual light channels. For example, spacers may be provided between the microdisplay array and the aperture/microlens array, in which through holes corresponding to the individual light channels are formed in advance to delimit and separate the light channels. Such spacers can be made, for example, by 3D printing techniques and can be blackened as needed to form light-absorbing tubes.
The structure and optical imaging process of the optical imaging array assembly according to the first aspect of the present application are described above. In the following, a variable focus optical imaging array system according to the second aspect of the present application will be described.
According to the present application, by disassembling the image source and recombining the image by means of the aperture and microlens array structure, the resulting optical imaging array assembly not only improves brightness while ensuring imaging sharpness under near-eye display, but also significantly reduces the size of the optical module. In particular, provided the imaging condition of the present application is satisfied, that is, provided the image light emitted by all the sub-image sources disassembled from the image source forms a continuous distribution of incident angles over the field angle range with the pupil as the vertex, setting the spacing between the microdisplay array and the microlens array in the optical imaging array assembly to be smaller than or larger than the focal length of the microlens array causes the image light entering the pupil to be correspondingly diverged or converged, thereby improving the viewing experience of a myopic or hyperopic person, respectively. Thus, for a person with a given level of vision, a clear image can be observed with an optical imaging array assembly having a specific structure and specific parameters. On this basis, if the spacing between the microdisplay array and the microlens array in the optical imaging array assembly is made variable, so that the degree of divergence or convergence of the image light entering the pupil, relative to a parallel beam, can be adjusted continuously, the needs of users with different levels of vision can be met by one device, which is of great significance for improving the applicability and convenience of the product.
In view of this, a second aspect of the present application relates to a variable-focus optical imaging array system. By making the spacing between the microdisplay array and the microlens array in the optical imaging array assembly adjustable and changing the disassembly of the image source accordingly, the variable-focus optical imaging array system is suitable for users with different levels of vision, improving applicability and convenience in use.
A variable focus optical imaging array system according to a second aspect of the present application comprises an optical imaging array assembly according to the first aspect of the present application, and a control unit and an image processing unit coupled to the optical imaging array assembly, wherein the control unit is configured to change a pitch of a microdisplay array and a microlens array in the optical imaging array assembly; the image processing unit is configured to adjust display in the microdisplay according to a distance between the microdisplay array and the microlens array in the optical imaging array assembly, for example, change a disassembling mode of an image source according to a change of the distance between the microdisplay array and the microlens array, and display an obtained sub-image source in the corresponding microdisplay.
In the variable focus optical imaging array system of the present application, the change of the pitch of the microdisplay array and the microlens array is achieved by the control unit adjusting the relative positions of the microdisplay array and the microlens array. In one embodiment of the application, a control unit is coupled to the microdisplay array to change its spacing from the microlens array by controlling the microdisplay array to move. In another embodiment of the present application, a control unit is coupled to the microlens array to change its spacing from the microdisplay array by controlling the movement of the microlens array. In yet another embodiment of the present application, a control unit is coupled to the microdisplay array and the microlens array, respectively, to change the spacing of the microdisplay array and the microlens array by controlling their movement.
According to the present application, there is no particular limitation on the manner in which the microdisplay array and/or the microlens array are moved by the control unit, as long as the spacing between the microdisplay array and the microlens array can be made to vary. For example, movement of the microdisplay array and/or the microlens array may be accomplished by means such as mechanical, electrical, etc. In one embodiment of the application, the control unit comprises a screw in threaded connection with the microdisplay array and/or the microlens array, the microdisplay array and/or the microlens array being moved by rotation of the screw. In another embodiment of the present application, the control unit comprises a micro-motor coupled to the microdisplay array and/or the microlens array, by which the microdisplay array and/or the microlens array is driven to move.
In a particular embodiment of the present application, the control unit comprises a controller and an actuation means. The controller is coupled to the actuation means, and a user can control the actuation means by operating the controller; the actuation means corresponds, for example, to the screw threaded to the microdisplay array and/or the microlens array, or to the micro-motor coupled to the microdisplay array and/or the microlens array, as described above.
It will be appreciated by those skilled in the art that the above embodiments are only some exemplary ways of moving the microdisplay array and/or the microlens array by the control unit according to the present application, and that those skilled in the art will be able to envision other ways of adjusting the relative positions of the microdisplay array and the microlens array, and that the variable focus optical imaging array system thus obtained falls within the scope of the present application.
As described in the first aspect of the present application, when the spacing between the microdisplay array and the microlens array changes, the structural parameters of the optical imaging array assembly (for example the microdisplay size D1 and the microdisplay pitch P1) must be adjusted correspondingly to ensure that the imaging condition of the present application is still met, i.e. that the image light from all the sub-image sources disassembled from the image source forms a continuous distribution of incident angles within the field angle range with the pupil as the vertex. In other words, according to the present application, when the microdisplay array and/or the microlens array are moved by the control unit, the image source is disassembled anew according to the new spacing between the microdisplay array and the microlens array, forming sub-image sources whose position and size match the new spacing. In the variable-focus optical imaging array system of the present application, this process is carried out by the image processing unit.
Optionally, the image processing unit comprises a measurement module configured to measure the spacing between the microdisplay array and the microlens array. The present application does not particularly limit the structure or form of the measurement module, which may be implemented in any manner known in the art. In one embodiment of the present application, the measurement module is configured to emit a light beam from one of the microdisplay array and the microlens array toward the other, receive the beam reflected by the latter, and determine the spacing between the two arrays by detecting the time from emission to reception of the beam or the phase difference between the emitted and reflected light; the light beam may be, for example, laser light or infrared light.
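For the time-of-flight variant of such a measurement, the conversion from round-trip time to spacing is elementary, as sketched below; in practice the sub-nanosecond times involved are more readily resolved via the phase-difference method also mentioned above, and the numbers here are illustrative only.

```python
C_MM_PER_NS = 299.792458  # speed of light in mm per nanosecond

def spacing_from_round_trip(t_emit_ns: float, t_receive_ns: float) -> float:
    """Spacing LDA between the microdisplay array and the microlens array from
    the round-trip time of a light pulse emitted at one array and reflected by
    the other (distance = speed of light * time / 2)."""
    return C_MM_PER_NS * (t_receive_ns - t_emit_ns) / 2.0

# A round trip of about 0.02 ns corresponds to a spacing of roughly 3 mm.
print(spacing_from_round_trip(0.0, 0.02))  # ~3.0 mm
```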
Optionally, the image processing unit comprises a computing module configured to convert pitch data of the microdisplay array and the microlens array into an array defining a display area in a microdisplay of the microdisplay array. In one embodiment of the application, the calculation module is configured to receive the microdisplay array to microlens array spacing data from the measurement module and computationally convert it into an array of positions and sizes of a region to be displayed in the microdisplay of the microdisplay array, such as center point coordinates and edge line coordinates of the region.
Optionally, the image processing unit comprises a disassembly module configured to disassemble the image source into sub-image sources according to the array defining the display areas in the microdisplays of the microdisplay array. In one embodiment of the present application, the disassembly module is configured to receive, from the computing module, the center point coordinates and edge line coordinates of the regions to be displayed in the microdisplays of the microdisplay array, and thereby disassemble the image source into a plurality of portions with predetermined intervals, each portion containing the color information and/or brightness information of the corresponding pixels, thus forming a plurality of sub-image sources. In addition, in one embodiment of the present application, the disassembly module in the image processing unit is coupled to the microdisplay array, and the sub-image sources obtained through the above process are displayed in the microdisplays of the microdisplay array.
Fig. 10 shows the structure of an image processing unit according to one embodiment of the present application, and the manner in which the image processing unit is connected to an optical imaging array assembly. As shown in fig. 10, the optical imaging array module AM includes a microdisplay array 11, an aperture array 12, and a microlens array 13 arranged in this order along the image light propagation direction, with the aperture array 12 and the microlens array 13 arranged adjacently. The image processing unit 15 comprises a measurement module 151, a calculation module 152 and a disassembly module 153. The measurement module 151 measures the spacing LDA between the microdisplay array 11 and the microlens array 13 and sends it to the calculation module 152; the calculation module 152 converts the received spacing data into an array defining the display areas in the microdisplays of the microdisplay array 11 and sends it to the disassembly module 153; the disassembly module 153 is coupled to the microdisplay array 11, disassembles the image source S into sub-image sources Si according to the received array defining the display areas, and displays them in the microdisplays of the microdisplay array 11.
In a specific embodiment of the present application, the process by which the image processing unit disassembles the image source into sub-image sources according to the spacing between the microdisplay array and the microlens array is described below, taking as an example an image processing unit comprising a measurement module, a calculation module and a disassembly module, and a microdisplay array comprising 9 microdisplays.
First, the measurement module measures the spacing between the microdisplay array and the microlens array and sends the spacing data LDA1 to the calculation module. The calculation module converts the received spacing data LDA1, according to a predetermined algorithm, into coordinate values (x1, y1), (x2, y2) ... (xn, yn) corresponding to positions on the microdisplay array, where the coordinate values are divided into 9 groups that respectively define the boundaries of the 9 regions to be displayed in the 9 microdisplays of the microdisplay array, and sends them to the disassembly module. The boundaries of the 9 regions are such that, for the spacing data LDA1, the sub-images formed on the retina by the 9 regions through the aperture array, the microlens array and the pupil of the human eye can be stitched into a complete image, with neither image-free regions nor ghost regions between the sub-images. The disassembly module determines the size and position of each region to be displayed from the received coordinate values, disassembles the image source into 9 corresponding parts, and adjusts the resolution and spacing of the disassembled parts according to the determined size and position information, thereby obtaining 9 sub-image sources.
When the spacing between the microdisplay array and the microlens array changes, the measurement module sends new spacing data LDA2 to the calculation module. The calculation module converts the received spacing data LDA2, according to the predetermined algorithm, into new coordinate values (x1', y1'), (x2', y2') ... (xn', yn'), which are again divided into 9 groups but define region boundaries different from before, and sends them to the disassembly module. The disassembly module re-determines the size and position of each region to be displayed from the received new coordinate values and adjusts the resolution and spacing of the disassembled parts of the image source accordingly, thereby obtaining 9 sub-image sources anew.
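A minimal sketch of such a recomputation is given below: for a given spacing LDA it derives the size and pitch of the regions to be displayed from relations (1) and (2) and returns the center and size of each region. DE, L0 and P0 are assumed example values, and the actual conversion algorithm of the calculation module is not disclosed in this form.

```python
def display_regions(l_da: float, n: int = 9,
                    d_e: float = 3.0, l_0: float = 15.0, p_0: float = 3.0):
    """Return (center_y, size) in mm for the region to be displayed in each of
    the n microdisplays at microdisplay-microlens spacing l_da, using relations
    (1) and (2); d_e, l_0 and p_0 are assumed example values."""
    size = d_e * l_da / l_0            # (1): sub-image-source size
    pitch = p_0 * (l_0 + l_da) / l_0   # (2): sub-image-source pitch
    centers = [(i - (n - 1) / 2) * pitch for i in range(n)]
    return [(c, size) for c in centers]

# Reducing the spacing shrinks both the size and the pitch of the regions,
# which is the behavior described for the myopia case of fig. 11(B) below.
for l_da in (6.0, 5.0):
    print(l_da, display_regions(l_da)[:2])
```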
Fig. 11 shows a variable focus optical imaging array system AS according to an embodiment of the present application, comprising an optical imaging array assembly AM and a control unit 14 and an image processing unit 15 coupled to the optical imaging array assembly AM.
The optical imaging array module AM includes a microdisplay array 11, an aperture array 12, and a microlens array 13 arranged in this order along the image light propagation direction, wherein the aperture array 12 and the microlens array 13 are arranged adjacently. The microdisplay array 11 includes microdisplays 111, 112, and 113, the aperture array 12 includes apertures 121, 122, and 123, and the microlens array 13 includes microlenses 131, 132, and 133, which are in one-to-one correspondence.
The control unit 14 is coupled to the microdisplay array 11 in the optical imaging array assembly AM and can move the microdisplay array 11 along the z-axis direction to change the spacing LDA between the microdisplay array 11 and the microlens array 13.
The image processing unit 15 can measure the spacing LDA between the microdisplay array 11 and the microlens array 13 and determine the disassembly of the image source S accordingly, obtaining 3 sub-image sources S1, S2 and S3. The image processing unit 15 is coupled to the microdisplay array 11 in the optical imaging array module AM and displays the sub-image sources S1, S2 and S3 in the microdisplays 111, 112, and 113, respectively, of the microdisplay array 11.
For the variable-focus optical imaging array system AS shown in fig. 11, fig. 11(A) shows its imaging process when used by a person with normal vision. As described above, for a person with normal vision, parallel light beams from the outside are refracted at the pupil and then converge to a point on the retina, forming a clear image. Referring to fig. 11(A), under use by a person with normal vision, the microdisplay array 11 is adjusted by the control unit 14 to move along the z-axis direction to position Pe, which lies in the focal plane of the microlens array 13 (i.e. at exactly 1 focal length, = f). The image light emitted by each pixel of the sub-image source Si displayed in a microdisplay is therefore blocked by the corresponding aperture and modulated by the corresponding microlens into a parallel beam, which is refracted at the pupil of the eyeball EN and converges to a point on the retina. With appropriate settings, the imaging of the sub-image sources S1, S2 and S3 satisfies the imaging condition of the present application, i.e. the image light emitted by all the sub-image sources disassembled from the image source forms a continuous distribution of incident angles over the field angle range with the pupil as the vertex, so that a person with normal vision can view, under near-eye display, a clear image corresponding to the image source S.
For the variable-focus optical imaging array system AS shown in fig. 11, fig. 11(B) shows its imaging process when used by a myopic person. As described above, for a myopic person, parallel light beams from the outside are refracted at the pupil and converge in front of the retina, then pass through the convergence point and spread onto the retina, forming a blurred light spot. By modulating the incident parallel beam into a beam that diverges to a certain degree, the convergence point of the light after refraction at the pupil can be shifted backward compared with parallel incidence and, with suitable modulation, can be shifted onto the retina, forming a clear image. Referring to fig. 11(B), in use by a myopic person, the microdisplay array 11 is moved by the control unit 14 from position Pe in fig. 11(A) gradually toward the microlens array 13 along the z-axis direction. In this process the microdisplay array 11 lies inside the focal plane of the microlens array 13 (i.e. at less than 1 focal length, < f), so the image light emitted by each pixel of the sub-image source Si displayed in a microdisplay is blocked by the corresponding aperture and modulated by the microlens into a beam that diverges to a certain degree relative to a parallel beam. When the microdisplay array 11 is moved to a specific position Pe', the beam, diverging to a corresponding degree, can converge to a point on the retina after refraction at the pupil of the eyeball ES. As can be seen by comparison with fig. 11(A), when the microdisplay array 11 is moved from position Pe (drawn with dotted lines) to Pe', the size and pitch of the sub-image sources S1, S2 and S3 need to be reduced in order to ensure that each imaging optical path still satisfies the imaging condition of the present application. To this end, the image processing unit 15 first measures the spacing LDA between the position Pe' of the microdisplay array 11 and the microlens array 13, then disassembles the image source S according to the corresponding sub-image-source size and pitch, and displays the newly formed sub-image sources S1, S2 and S3 in the microdisplays 111, 112 and 113, respectively. The variable-focus optical imaging array system shown in fig. 11 thus enables a myopic person to view a clear image corresponding to the image source S under near-eye display.
For the variable-focus optical imaging array system AS shown in fig. 11, fig. 11(C) shows its imaging process when used by a hyperopic person. As described above, for a hyperopic person, parallel light beams from the outside are refracted at the pupil and would converge behind the retina, so the light strikes the retina before reaching its convergence point and forms a blurred light spot. By modulating the incident parallel beam into a beam that converges to a certain degree, the convergence point of the light after refraction at the pupil can be shifted forward compared with parallel incidence and, with suitable modulation, can be shifted onto the retina, forming a clear image. Referring to fig. 11(C), in use by a hyperopic person, the microdisplay array 11 is moved by the control unit 14 from position Pe in fig. 11(A) gradually away from the microlens array 13 along the z-axis direction. In this process the microdisplay array 11 lies beyond the focal plane of the microlens array 13 (i.e. at more than 1 focal length, > f), so the image light emitted by each pixel of the sub-image source Si displayed in a microdisplay is blocked by the corresponding aperture and modulated by the microlens into a beam that converges to a certain degree relative to a parallel beam. When the microdisplay array 11 is moved to a specific position Pe'', the beam, converging to a corresponding degree, converges to a point on the retina after refraction at the pupil of the eyeball EL. As can be seen by comparison with fig. 11(A), when the microdisplay array 11 is moved from position Pe (drawn with dotted lines) to Pe'', the size and pitch of the sub-image sources S1, S2 and S3 need to be increased in order to ensure that each imaging optical path still satisfies the imaging condition of the present application. To this end, the image processing unit 15 first measures the spacing LDA between the position Pe'' of the microdisplay array 11 and the microlens array 13, then disassembles the image source S according to the corresponding sub-image-source size and pitch, and displays the newly formed sub-image sources S1, S2 and S3 in the microdisplays 111, 112 and 113, respectively. The variable-focus optical imaging array system shown in fig. 11 thus enables a hyperopic person to view a clear image corresponding to the image source S under near-eye display.
Fig. 12 illustrates a variable-focus optical imaging array system AS according to another embodiment of the present application, which, similarly to that shown in fig. 11, includes an optical imaging array assembly AM and a control unit 14 and an image processing unit 15 coupled to the optical imaging array assembly AM. The image processing unit 15 is again coupled to the microdisplay array 11 in the optical imaging array module AM; it determines the disassembly of the image source S by measuring the spacing between the microdisplay array 11 and the microlens array 13, and displays the 3 resulting sub-image sources S1, S2 and S3 in the microdisplays 111, 112 and 113, respectively, of the microdisplay array 11. However, unlike the variable-focus optical imaging array system AS shown in fig. 11, the control unit 14 is coupled to the microlens array 13 in the optical imaging array assembly AM and moves the microlens array 13 along the z-axis direction, changing the spacing LDA between the microlens array 13 and the microdisplay array 11.
For the variable-focus optical imaging array system AS shown in fig. 12, fig. 12(A) shows its imaging process when used by a person with normal vision. Similarly to fig. 11, referring to fig. 12(A), under use by a person with normal vision, the microlens array 13 is adjusted by the control unit 14 to move along the z-axis direction to position Pf, such that the microdisplay array 11 lies in the focal plane of the microlens array 13 (i.e. at exactly 1 focal length, = f). The image light emitted by each pixel of the sub-image source Si displayed in a microdisplay is then blocked by the corresponding aperture and modulated by the microlens into a parallel beam, which is refracted at the pupil of the eyeball EN and converges to a point on the retina. Provided the imaging condition of the present application is met, the variable-focus optical imaging array system shown in fig. 12 enables a person with normal vision to view, under near-eye display, a clear image corresponding to the image source S.
For the variable-focus optical imaging array system AS shown in fig. 12, fig. 12(B) shows its imaging process when used by a myopic person. Similarly to fig. 11, referring to fig. 12(B), in use by a myopic person, the microlens array 13 is moved by the control unit 14 from position Pf in fig. 12(A) gradually toward the microdisplay array 11 along the z-axis direction. In this process the microdisplay array 11 lies inside the focal plane of the microlens array 13 (i.e. at less than 1 focal length, < f), so the image light emitted by each pixel of the sub-image source Si displayed in a microdisplay is blocked by the corresponding aperture and modulated by the microlens into a beam that diverges to a certain degree relative to a parallel beam; and when the microlens array 13 is moved to a specific position Pf', the beam, diverging to a corresponding degree, can converge to a point on the retina after refraction at the pupil of the eyeball ES. To ensure that each imaging optical path still satisfies the imaging condition of the present application, the image processing unit 15 first measures the spacing LDA between the position Pf' of the microlens array 13 and the microdisplay array 11, then disassembles the image source S according to the corresponding sub-image-source size and pitch to form sub-image sources S1, S2 and S3 of reduced size and pitch compared with fig. 12(A) (drawn with dotted lines), and displays them in the microdisplays 111, 112 and 113, respectively. The variable-focus optical imaging array system shown in fig. 12 thus enables a myopic person to view a clear image corresponding to the image source S under near-eye display.
Similarly, the variable-focus optical imaging array system shown in fig. 12 enables a hyperopic person to view a clear image corresponding to the image source S under near-eye display; the process and principle can be understood by analogy with the above and are not repeated here.
Thus, when using the variable focus optical imaging array system of the present application, a user can adapt to his or her near or far vision condition by continuously adjusting the spacing between the microdisplay array and the microlens array until the sharpest image is seen.
As can be seen from the above description of figs. 11 and 12, persons with different levels of vision can obtain the desired viewing effect by adjusting the spacing between the microdisplay array and the microlens array in the variable-focus optical imaging array system of the present application. In theory, for a person at any level of vision (including high myopia or hyperopia), a clear image could be viewed by setting the microdisplay-microlens spacing LDA small enough or large enough. However, as described qualitatively in the first aspect of the present application, when the microdisplay-microlens spacing LDA is too small, the correspondingly reduced size of the sub-image sources Si results in too low a resolution; and when the microdisplay-microlens spacing LDA is too large, the correspondingly increased thickness of the optical module results in an oversized product.
In the variable-focus optical imaging array system of the present application, for a given microlens focal length f, the microdisplay-microlens spacing LDA can be varied within the range f ± Δf to suit persons with different levels of vision. For example, relative to use by a person with normal vision with LDA at or near f, persons with different degrees of myopia or hyperopia can determine a suitable LDA value by adjusting within the ranges f − Δf to f and f to f + Δf, respectively.
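Under a first-order thin-lens approximation (a sketch only, not part of the disclosure), the spacing LDA maps to an equivalent spectacle power as follows; the focal length, spacings and the aperture-pupil distance L0 used below are assumed example values, and a real system would also account for lens thickness and eye relief.

```python
def equivalent_power_diopters(f_mm: float, l_da_mm: float, l_0_mm: float = 15.0) -> float:
    """Approximate spectacle-equivalent power of the beams reaching the pupil for
    a display at distance l_da_mm in front of a thin microlens of focal length
    f_mm, with the pupil l_0_mm behind the lens. Negative values mean diverging
    light (suited to myopia), positive values converging light (hyperopia),
    zero a collimated beam (normal vision)."""
    if abs(l_da_mm - f_mm) < 1e-9:
        return 0.0
    s_image = 1.0 / (1.0 / f_mm - 1.0 / l_da_mm)  # signed image distance from the lens (mm)
    return 1000.0 / (s_image - l_0_mm)            # vergence at the pupil, in diopters

f = 5.0
for l_da in (5.0, 4.9, 5.1):
    print(l_da, round(equivalent_power_diopters(f, l_da), 2))
# 5.0 -> 0.0 D, 4.9 -> about -3.8 D (myopia), 5.1 -> about +4.2 D (hyperopia)
```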
Optionally, f is in the range of 1 to 20mm, wherein the upper limit of f is selected from the group consisting of 20mm, 19mm, 18mm, 17mm, 16mm, 15mm, 14mm, 13mm, 12mm, 11mm, 10mm, 9mm, 8mm, 7mm, 6mm, 5mm, 4mm, 3mm, and 2mm and the lower limit is selected from the group consisting of 1mm, 2mm, 3mm, 4mm, 5mm, 6mm, 7mm, 8mm, 9mm, 10mm, 11mm, 12mm, 13mm, 14mm, 15mm, 16mm, 17mm, 18mm, and 19 mm. In one embodiment of the present application, f is in the range of 2 to 10 mm.
Alternatively, Δ f ranges from 0< Δ f ≦ 0.5f, where Δ f has an upper limit selected from 0.5f, 0.45f, 0.4f, 0.35f, 0.3f, 0.25f, 0.2f, 0.15f, 0.1f, 0.05f, and 0.03f and a lower limit selected from 0.01f, 0.03f, 0.05f, 0.1f, 0.15f, 0.2f, 0.25f, 0.3f, 0.35f, 0.4f, and 0.45 f. In one embodiment of the present application, Δ f is in the range of 0.05f ≦ Δ f ≦ 0.3 f.
Alternatively, LDA is in the range of 0.5 to 30 mm. In one embodiment of the present application, LDA is in the range of 1.4 to 13 mm.
Therefore, by making the spacing between the microdisplay array and the microlens array in the optical imaging array assembly variable and disassembling the image source in a predetermined manner, the variable-focus optical imaging array system of the present application achieves continuous adjustment of the degree of divergence or convergence, relative to a parallel beam, of the image light entering the pupil. It can thus simultaneously meet the needs of users with different levels of vision (for example normal vision and various degrees of myopia and hyperopia) and offers improved applicability and convenience.
A third aspect of the present application relates to a near-eye display device. The near-eye display device comprises a variable focus optical imaging array system as described above.
In one embodiment of the present application, the near-eye display device comprises two of the variable focus optical imaging array systems for displaying respective images for the left and right eyes of a user. Preferably, the two variable focus optical imaging array systems perform the imaging process independently for the level of vision of the user's left and right eyes respectively. In this case, even if the level of vision of the left and right eyes of the user is different, a clear image can be viewed by the variable focus optical imaging array system of the present application.
A fourth aspect of the present application relates to an optical system image projection method. The optical system image projection method includes the steps of:
S1: providing a microdisplay array, and an aperture array and a microlens array located downstream of the microdisplay array along the image light propagation direction and arranged adjacently, wherein the microdisplay array, the aperture array and the microlens array respectively comprise a plurality of microdisplays, apertures and microlenses of equal number and identical arrangement, the microdisplays, apertures and microlenses corresponding one to one and respectively forming light channels;
S2: monitoring the spacing between the microdisplay array and the microlens array;
S3: disassembling an image source into a plurality of sub-image sources according to the spacing between the microdisplay array and the microlens array, wherein each sub-image source forms a part of the image source and is displayed in the corresponding microdisplay;
S4: causing the image light emitted by each microdisplay displaying the corresponding sub-image source to enter the pupil after being blocked by the corresponding aperture and modulated by the corresponding microlens, and to form a sub-image at a predetermined region of the retina;
S5: causing the sub-images to constitute an image corresponding to the image source.
Optionally, the optical system image projection method further includes the steps of:
S6: changing the spacing between the microdisplay array and the microlens array;
S7: when a change in the spacing between the microdisplay array and the microlens array is detected, changing the disassembly of the image source so that the sub-images formed by the plurality of newly obtained sub-image sources again constitute an image corresponding to the image source.
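As a non-authoritative illustration of how steps S1 to S7 fit together, the following sketch shows a minimal control loop; all callables passed in (measure_spacing, regions_for, disassemble, show) are hypothetical stand-ins for the measurement, calculation, disassembly and display functions described above.

```python
from typing import Callable, List, Tuple

def projection_loop(image_source: object,
                    measure_spacing: Callable[[], float],
                    regions_for: Callable[[float], List[Tuple[float, float]]],
                    disassemble: Callable[[object, List[Tuple[float, float]]], list],
                    show: Callable[[list], None]) -> None:
    """Monitor the microdisplay-microlens spacing (S2/S7) and, whenever it
    changes (S6), recompute the display regions, re-disassemble the image source
    (S3/S7) and push the new sub-image sources to the microdisplays (S4, S5)."""
    last_spacing = None
    while True:
        spacing = measure_spacing()
        if spacing != last_spacing:
            regions = regions_for(spacing)
            sub_sources = disassemble(image_source, regions)
            show(sub_sources)
            last_spacing = spacing
```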
In one embodiment of the present application, the steps S2 and S7 each include:
emitting a light beam from one of the microdisplay array and the microlens array toward the other, receiving the light beam reflected by the latter, and measuring the spacing between the microdisplay array and the microlens array by detecting the time from emission to reception of the light beam or the phase difference between the emitted light and the reflected light, wherein the light beam may be, for example, laser light or infrared light.
In one embodiment of the present application, the steps S3 and S7 each include:
converting the spacing data of the microdisplay array and the microlens array into an array defining a display area in the microdisplay of the microdisplay array;
disassembling the image source, according to the array defining the display areas in the microdisplays of the microdisplay array, into a plurality of portions with predetermined intervals, each portion containing the color information and/or brightness information of the corresponding pixels, thereby forming a plurality of sub-image sources.
In one embodiment of the present application, the step S6 includes:
the microdisplay array and/or microlens array is moved by rotation of a screw in threaded connection with the microdisplay array and/or microlens array, or by driving of a micro-motor coupled to the microdisplay array and/or microlens array.
In one embodiment of the application, the optical system image projection method is implemented by a variable focus optical imaging array system as described above or a near-eye display device comprising a variable focus optical imaging array system as described above.
The above description covers only preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be understood by those skilled in the art that the scope of the invention in the present application is not limited to the specific combinations of the above-mentioned features, but also covers other embodiments formed by arbitrary combinations of the above-mentioned features or their equivalents without departing from the inventive concept, for example embodiments in which the above-mentioned features are replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (28)

1. A variable focus optical imaging array system, comprising:
the optical imaging array assembly comprises a micro display array, an aperture array and a micro lens array, wherein the aperture array and the micro lens array are positioned at the downstream of the micro display array along the image light transmission direction, the aperture array and the micro lens array are adjacently arranged, the micro display array, the aperture array and the micro lens array respectively comprise a plurality of micro displays, apertures and micro lenses which are the same in number, the micro displays, the apertures and the micro lenses are in one-to-one correspondence to form corresponding light channels, and the micro displays can emit image light;
a control unit coupled with the optical imaging array assembly and configured to change the spacing between the microdisplay array and the microlens array; and
an image processing unit coupled with the optical imaging array assembly and configured to adjust the display in the microdisplays according to the spacing between the microdisplay array and the microlens array, so that the image light emitted by each microdisplay, after being blocked by the corresponding aperture and modulated by the corresponding microlens, enters the pupil and is imaged at a predetermined region of the retina.
2. The variable focus optical imaging array system according to claim 1, wherein the plurality of microdisplays in the microdisplay array respectively display a plurality of sub-image sources into which an image source is disassembled, wherein the image light emitted by each microdisplay displaying its corresponding sub-image source is blocked by the corresponding aperture, modulated by the corresponding microlens and then enters the pupil, forming sub-images at predetermined regions of the retina, the sub-images constituting an image corresponding to the image source.
3. The variable focus optical imaging array system of claim 2, wherein the image processing unit is configured to change the way of disassembling the image source in accordance with a change in the spacing between the microdisplay array and the microlens array, and to display the re-obtained plurality of sub-image sources in the corresponding microdisplays, so that the sub-images formed by the sub-image sources still constitute the image corresponding to the image source.
4. The variable focus optical imaging array system of claim 3, wherein the variable focus optical imaging array system is configured such that image light emitted by all sub-image sources disassembled from image sources forms a continuous distribution of incident angles over a range of field angles with the pupil as a vertex.
5. The variable focus optical imaging array system of any of claims 1-4, wherein the control unit is coupled with the microdisplay array and changes the spacing between the microdisplay array and the microlens array by controlling the microdisplay array to move;
or the control unit is coupled with the micro-lens array and changes the distance between the micro-lens array and the micro-display array by controlling the micro-lens array to move;
or the control unit is respectively coupled with the microdisplay array and the microlens array, and the distance between the microdisplay array and the microlens array is changed by controlling the microdisplay array and the microlens array to move.
6. The variable focus optical imaging array system of claim 5, wherein the control unit comprises a screw in threaded connection with a microdisplay array and/or a microlens array, the microdisplay array and/or microlens array being moved by rotation of the screw;
or the control unit comprises a micro motor coupled with the micro display array and/or the micro lens array, and the micro motor drives the micro display array and/or the micro lens array to move.
7. The variable focus optical imaging array system of any of claims 1-4, wherein said image processing unit comprises a measurement module, a calculation module, and a disassembly module, wherein,
the measurement module is configured to measure a spacing between the microdisplay array and the microlens array;
the computing module is configured to convert pitch data of the microdisplay array and microlens array into an array defining a display area in a microdisplay of a microdisplay array;
the disassembly module is configured to disassemble an image source into sub-image sources according to the array defining the display areas in the microdisplays of the microdisplay array.
8. The variable focus optical imaging array system of claim 7, wherein the measurement module is configured to emit a light beam by one of a microdisplay array and a microlens array to the other and to receive the light beam reflected by the latter, the separation of the microdisplay array and microlens array being measured by detecting the time from emission to reception or the phase difference of the emitted light and the reflected light, wherein the light beam is selected from laser or infrared light.
9. The variable focus optical imaging array system of claim 7, wherein the calculation module is configured to receive pitch data of a microdisplay array and a microlens array from a measurement module and to convert it computationally into an array of positions and sizes of regions to be displayed in the microdisplays of the microdisplay array.
10. The variable focus optical imaging array system of claim 7, wherein the disassembly module is configured to receive an array of positions and sizes of regions to be displayed in microdisplays of a microdisplay array from a computing module, thereby disassembling the image source into a plurality of portions having predetermined intervals, wherein each portion contains color information and/or brightness information of a corresponding pixel, thereby forming a plurality of sub-image sources.
11. The variable focus optical imaging array system of claim 7, wherein said disassembly module is coupled to a microdisplay array, and a sub-image source disassembled via an image source is displayed in a microdisplay of said microdisplay array.
12. The variable focus optical imaging array system of any of claims 1 to 4, wherein the distance between the microdisplay array and the microlens array in a direction perpendicular to both can be varied within a range of f ± Δf, where f denotes the focal length of the microlenses in the microlens array and is in the range of 1-20 mm, and Δf satisfies 0 < Δf < 0.5f.
13. The variable focus optical imaging array system of claim 12, wherein the distance between the microdisplay array and microlens array in a direction perpendicular to both is in the range of 0.5-30 mm.
14. The variable focus optical imaging array system of any of claims 1 to 4, wherein in the optical imaging array assembly the array of apertures is located upstream or downstream of the array of microlenses in the direction of image light propagation.
15. The variable focus optical imaging array system according to any of claims 1-4, wherein in the optical imaging array assembly, a microdisplay array is positioned at or near the focal plane of a microlens array, such that image light emitted by the microdisplay, after being blocked by a corresponding aperture and modulated by the microlens, forms a parallel or nearly parallel light beam;
or the microdisplay array is positioned inside the focal plane of the microlens array (closer to the microlens array than its focal length), so that the image light emitted by the microdisplay, after being blocked by the corresponding aperture and modulated by the microlens, forms a light beam that is divergent relative to a parallel light beam;
or the microdisplay array is positioned outside the focal plane of the microlens array (farther from the microlens array than its focal length), so that the image light emitted by the microdisplay, after being blocked by the corresponding aperture and modulated by the microlens, forms a light beam that is convergent relative to a parallel light beam.
16. The variable focus optical imaging array system of any of claims 1 to 4, wherein in the optical imaging array assembly, the microdisplay array comprises a single microdisplay device whose surface is entirely covered with pixels, the microdisplay device being configured, upon command, to cause the pixels in a designated area to emit light while the pixels in other areas do not emit light, wherein each light-emitting area corresponds to one microdisplay.
17. The variable focus optical imaging array system of any of claims 1-4, wherein in the optical imaging array assembly a plurality of microdisplays in an array of microdisplays are arranged in a rectangle along a first direction and a second direction perpendicular to the first direction.
18. The variable focus optical imaging array system of claim 17, wherein in the optical imaging array assembly the distance between the array of apertures and the center of the pupil along a direction perpendicular to the array of apertures is in the range of 5-30 mm;
in the optical imaging array assembly, the total number of microdisplays in the microdisplay array is greater than or equal to 2, and the number of microdisplays in the microdisplay array along the first direction or the second direction is in the range of 1-10.
19. The variable focus optical imaging array system of claim 17, wherein in the optical imaging array assembly, the ratio of the diameter of a pupil to the crosstalk safe distance is in the range of 0.2-0.6, wherein the crosstalk safe distance represents the distance from the center of one pupil to the center of another pupil along the first or second direction;
in the optical imaging array assembly, the ratio of the length of a microdisplay to the midpoint spacing of adjacent microdisplays in the first or second direction is in the range of 0.2-0.6.
20. The variable focus optical imaging array system of any of claims 1-4, wherein in the optical imaging array assembly, physical boundaries are provided between optical channels to mitigate cross-talk.
21. A near-eye display device comprising the variable focus optical imaging array system of any of claims 1-20.
22. The near-eye display device of claim 21, wherein the near-eye display device comprises two of the variable focus optical imaging array systems for displaying respective images for a user's left and right eyes, respectively;
preferably, the two variable focus optical imaging array systems perform the imaging process independently according to the vision levels of the user's left and right eyes, respectively.
23. An optical system image projection method comprising the steps of:
s1: providing a micro display array, and an aperture array and a micro lens array which are positioned at the downstream of the micro display array along the image light transmission direction, wherein the aperture array and the micro lens array are adjacently arranged, the micro display array, the aperture array and the micro lens array respectively comprise a plurality of micro displays, apertures and micro lenses which are the same in number, and the micro displays, the apertures and the micro lenses are in one-to-one correspondence to form corresponding light channels;
s2: monitoring the distance between the micro display array and the micro lens array;
s3: disassembling an image source into a plurality of sub-image sources according to the distance between the microdisplay array and the microlens array, and displaying one sub-image source in each microdisplay;
s4: causing the image light emitted by each microdisplay displaying its corresponding sub-image source to enter the pupil after being blocked by the corresponding aperture and modulated by the corresponding microlens, forming sub-images at predetermined regions of the retina;
s5: causing the sub-images to constitute an image corresponding to the image source.
24. The method of claim 23, further comprising the steps of:
s6: changing the distance between the micro display array and the micro lens array;
s7: when a change in the distance between the microdisplay array and the microlens array is detected, changing the way in which the image source is disassembled, and obtaining the plurality of sub-image sources again so that the sub-images they form still constitute the image corresponding to the image source.
25. The method of claim 24, wherein the steps S2 and S7 each include:
emitting a light beam by one of the microdisplay array and the microlens array toward the other and receiving the light beam reflected by the latter, and measuring the distance between the microdisplay array and the microlens array by detecting the time from emission to reception or the phase difference between the emitted light and the reflected light, wherein the light beam is laser light or infrared light.
26. The method of claim 24, wherein the steps S3 and S7 each include:
converting the spacing data of the microdisplay array and the microlens array into an array defining the display areas in the microdisplays of the microdisplay array;
according to the array defining the display areas in the microdisplays of the microdisplay array, the image source is disassembled into a plurality of portions with predetermined intervals, each portion containing the color information and/or brightness information of the corresponding pixels, thereby forming a plurality of sub-image sources.
27. The method according to claim 24, wherein the step S6 includes:
the microdisplay array and/or microlens array is moved by rotation of a screw in threaded connection with the microdisplay array and/or microlens array, or by driving of a micro-motor coupled to the microdisplay array and/or microlens array.
28. The method of claim 23, wherein the method is implemented by the variable focus optical imaging array system of any of claims 1-20 or the near-eye display device of claim 21 or 22.
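The numerical ranges in claim 12 and the beam behaviour recited in claim 15 follow from ordinary thin-lens geometry. The sketch below only illustrates that relationship and is not part of the claims; the function names and sample values are assumptions.

```python
def beam_character(spacing_mm: float, focal_length_mm: float, tol: float = 1e-6) -> str:
    """Classify the beam leaving a microlens for a microdisplay placed
    spacing_mm away, in the thin-lens approximation."""
    if abs(spacing_mm - focal_length_mm) < tol:
        return "parallel"                      # microdisplay at the focal plane
    if spacing_mm < focal_length_mm:
        return "divergent"                     # inside the focal plane
    return "convergent"                        # outside the focal plane


def within_claimed_range(spacing_mm: float, f_mm: float, delta_f_mm: float) -> bool:
    """Check claim 12: f in 1-20 mm, 0 < delta_f < 0.5 f, spacing in f +/- delta_f."""
    return (1.0 <= f_mm <= 20.0
            and 0.0 < delta_f_mm < 0.5 * f_mm
            and f_mm - delta_f_mm <= spacing_mm <= f_mm + delta_f_mm)


print(beam_character(2.0, 2.0))             # parallel
print(beam_character(1.6, 2.0))             # divergent
print(within_claimed_range(2.3, 2.0, 0.5))  # True
```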
CN202010341951.0A 2020-04-27 2020-04-27 Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method Pending CN113640987A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010341951.0A CN113640987A (en) 2020-04-27 2020-04-27 Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010341951.0A CN113640987A (en) 2020-04-27 2020-04-27 Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method

Publications (1)

Publication Number Publication Date
CN113640987A true CN113640987A (en) 2021-11-12

Family

ID=78414913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010341951.0A Pending CN113640987A (en) 2020-04-27 2020-04-27 Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method

Country Status (1)

Country Link
CN (1) CN113640987A (en)

Similar Documents

Publication Publication Date Title
US20230400693A1 (en) Augmented reality display comprising eyepiece having a transparent emissive display
US11604353B2 (en) Multi-resolution display assembly for head-mounted display systems
US20210144361A1 (en) Near Eye Wavefront Emulating Display
CN107430277B (en) Advanced refractive optics for immersive virtual reality
US11086131B2 (en) Near-eye display and near-eye display system
JP2009535889A (en) 3D projection system
WO2011017485A9 (en) 3d autostereoscopic display with true depth perception
US11841511B2 (en) Wearable display systems with nanowire LED micro-displays
CN114365027A (en) System and method for displaying object with depth of field
US11860370B2 (en) Augmented and virtual reality display systems with correlated in-coupling and out-coupling optical regions for efficient light utilization
KR20140133330A (en) System for stereoscopic display
CN113640988A (en) Optical imaging array system, near-to-eye display device and optical system image projection method
KR20090038843A (en) A stereo projection system
CN113640987A (en) Variable-focus optical imaging array system, near-to-eye display device and optical system image projection method
US20220121027A1 (en) Display system having 1-dimensional pixel array with scanning mirror
CN113640986A (en) Combined optical imaging array system, near-to-eye display device and optical system image projection method
CN113960789A (en) Optical imaging array system, near-to-eye display device and image projection method
US10908418B1 (en) Naked eye 3D head-up display device with reflective diffuser sheet
TW202136861A (en) System and method for displaying an object with depths
WO2022025768A1 (en) A display screen adapted to correct for presbyopia
NO20200867A1 (en) A Display Screen Adapted to Correct for Presbyopia
CN109633904A (en) Retinal projection&#39;s display methods, system and device

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20230419

Address after: 100094 Room 701-2, Floor 7, Building 1, Yard 1, No. 81, Beiqing Road, Haidian District, Beijing

Applicant after: Beijing Yilian Technology Co.,Ltd.

Address before: 102200 Zhonghai Shanghu family a5-2-102, Changping District, Beijing

Applicant before: Jiang Jing

SE01 Entry into force of request for substantive examination