CN109856796A - Image source module, waveguide, near-eye display system and control method thereof - Google Patents

Image source module, waveguide, near-eye display system and control method thereof

Info

Publication number
CN109856796A
Authority
CN
China
Prior art keywords
image
source module
image source
scanner
light
Prior art date
Legal status
Pending
Application number
CN201811382075.5A
Other languages
Chinese (zh)
Inventor
姚长呈
Current Assignee
Chengdu Idealsee Technology Co Ltd
Original Assignee
Chengdu Idealsee Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Idealsee Technology Co Ltd filed Critical Chengdu Idealsee Technology Co Ltd
Priority to CN201811382075.5A
Publication of CN109856796A

Abstract

The embodiment of the present application discloses an image source module, a waveguide, a near-eye display system and a control method thereof. An eye tracking module monitors the rotation angle of the user's eyes in real time to determine the region of the image corresponding to the user's field of view, and generates an adjustment signal that is sent to the image source module and/or the waveguide to adjust the outgoing light beam. When the adjusted light beam is imaged, only the image within the field of view of the human eye is displayed, so that system consumption is reduced and interference of the displayed content with the user is also reduced.

Description

Image source module, waveguide, near-to-eye display system and control method thereof
Technical Field
The application relates to the technical field of laser scanning display, in particular to an image source module, a waveguide, a near-to-eye display system and a control method thereof.
Background
Nowadays, with the rapid development of display technologies such as augmented reality (AR) and virtual reality (VR), near-eye display devices such as head-mounted displays (HMD) have also become hot spots in the display industry.
Existing near-eye display devices can realize image display with a relatively large field of view by means of laser scanning, but in some scenes an image with a large field of view may cause problems. For example, if a user wears an AR device while walking, an image with an excessively large field of view can block too much of the user's sight and thus create a safety hazard. As another example, when an AR device analyzes an object, an image with a large field of view can display rich object information; however, in a display environment at such a short distance, information outside the focus range of the field of view cannot be seen clearly by the human eye, so part of the information displayed by the AR device may not be fully utilized by the user while placing a redundant burden on the system.
Disclosure of Invention
An object of the present application is to provide an image source module, a waveguide, a near-eye display system and a control method thereof, for solving the problem of image display in near-eye display.
The embodiment of the application provides an image source module, including: a rotating holder, a first mechanical arm, a second mechanical arm, an image light source and a scanner which are wholly or partially packaged in a shell.
One end of the first mechanical arm is connected with the rotating part of the rotating holder, the other end of the first mechanical arm is rotatably connected with one end of the second mechanical arm, the free end of the second mechanical arm is directly or indirectly connected with the scanner, and the scanner performs scanning output based on the light beam which is output by the image light source and contains image information;
under the drive of the rotating part of the rotating holder, the first mechanical arm and the second mechanical arm, the emergent angle and/or the emergent position of the light beam output by scanning can be adjusted.
Furthermore, the image source module further comprises a collimation element arranged on the emergent light path of the scanner, and the collimation element is tightly embedded with the light outlet of the shell so as to collimate the light beams scanned and output by the scanner.
Furthermore, the rotating holder is fixed on the inner wall of the non-light-emitting side of the shell, and the free end of the second mechanical arm is directly connected with the scanner so as to adjust the angle and/or position of the scanning output of the scanner in the shell.
Furthermore, the rotating holder is fixed on an external supporting structure, the scanner and the collimation element are fixedly arranged in the shell, and the free end of the second mechanical arm is connected to the outer wall of the non-light-emitting side of the shell to drive the shell and the scanner and the collimation element therein to move integrally.
Further, the scanner is a fiber scanner; the image light source is disposed inside the housing or outside the housing.
The embodiment of the present application further provides a near-eye display system, which comprises the image source module described above, a waveguide and an eye tracking module, wherein,
the image source module generates light beams containing image information, scans and outputs the light beams to the waveguide, and adjusts the output light beams according to the adjusting signals sent by the eye movement tracking module;
the waveguide expands the light beams input by the image source module in a first direction and a second direction and outputs the light beams, and the output light beams are adjusted according to the adjusting signals sent by the eye movement tracking module;
the eye tracking module monitors the rotation angle of the eyes of the user in real time so as to determine the corresponding area of the visual field of the user on the image, generate an adjusting signal and send the adjusting signal to the image source module and/or the waveguide, and adjust the emergent light beam.
Further, the waveguide includes: an incoupling part, an expanding part, and an outcoupling part disposed on the waveguide, wherein,
the coupling-in part is arranged on the surface of the waveguide; the extension component is arranged on an emergent light path of the coupling-in component and extends along the direction of the emergent light path, and the emergent light direction of the extension component is perpendicular to the extension direction of the extension component; the light-in side of the coupling-out component is parallel and opposite to the extending direction of the extension component, the coupling-out component extends in the light-out direction of the extension component, and the coupling-out component emits light towards one side of human eyes.
Further, the coupling-out part is a grating or a lens with adjustable optical parameters;
wherein the optical parameters include: at least one of reflectivity, refractive index, exit angle and exit position.
The embodiment of the present application further provides a control method for the near-to-eye display system, where the method includes:
the eye tracking module monitors the rotation angle of human eyes and determines the focusing area of the human eyes according to the rotation angle;
and generating an adjusting signal based on the determined focusing area, sending the adjusting signal to an image source module and/or a waveguide, and adjusting the light beam output by the near-eye display system to form an image on the focusing area.
Further, generating an adjusting signal based on the determined focusing area, sending the adjusting signal to an image source module and/or a waveguide, and adjusting the light beam output by the near-eye display system includes:
and generating an adjusting signal based on the determined focusing area, sending the adjusting signal to an image source module, and adjusting the light beam output by the image source module.
Further, generating an adjusting signal based on the determined focusing area, sending the adjusting signal to the image source module, and adjusting the light beam output by the image source module includes:
the eye tracking module generates an adjusting signal based on the determined focusing area and sends the adjusting signal to the image source module; under the action of the adjusting signal, the image light source in the image source module outputs a light beam corresponding to the image in the focusing area, and the rotating holder, the first mechanical arm and the second mechanical arm in the image source module adjust the angle and/or the area of the scanner's scanning output.
Further, generating an adjusting signal based on the determined focusing area, sending the adjusting signal to the image source module, and adjusting the light beam output by the image source module includes:
the eye movement tracking module generates an adjusting signal based on the determined focusing area and sends it to the image source module; under the action of the adjusting signal, the image light source in the image source module outputs a light beam corresponding to the image in the focusing area, and the scanner in the image source module performs a scanning translation so that the scanned and output light beam is translated in the two-dimensional directions to the position corresponding to the focusing area.
Further, generating an adjusting signal based on the determined focusing area, sending the adjusting signal to the image source module, and adjusting the light beam output by the image source module includes:
the eye tracking module generates an adjusting signal based on the determined focusing area and sends the adjusting signal to the image source module; under the action of the adjusting signal, the image light source in the image source module outputs a light beam corresponding to the image in the focusing area, the scanner in the image source module performs a scanning translation, and the rotating holder, the first mechanical arm and the second mechanical arm in the image source module adjust the angle and/or the area of the scanner's scanning output.
Further, generating an adjusting signal based on the determined focusing area, sending the adjusting signal to an image source module and/or a waveguide, and adjusting the light beam output by the near-eye display system includes:
and generating an adjusting signal based on the determined focusing area, sending the adjusting signal to the waveguide, and adjusting the light beam output by the waveguide.
Further, generating an adjusting signal to be sent to the waveguide based on the determined focusing area, and adjusting the light beam output by the waveguide, including:
the eye tracking module generates an adjusting signal based on the determined focusing area and sends the adjusting signal to the waveguide, and the waveguide controls the coupling-out component to change the emergent angle of the emergent light beam according to the adjusting signal, so that the image formed by the emergent light beam in the human eye is located in the focusing area.
Further, generating an adjusting signal to be sent to an image source module and/or a waveguide based on the determined focusing area so as to adjust the light beam output by the near-eye display system, including:
the eye tracking module generates adjusting signals based on the determined focusing area and respectively sends the adjusting signals to the image source module and the waveguide, the image source module adjusts the angle and/or area of light beams output by the image light source and the scanning output of the scanner according to the adjusting signals, and the waveguide adjusts the emergent angle of the coupling-out component according to the adjusting signals.
The embodiment of the application further provides a near-to-eye display device, the near-to-eye display device is used as an augmented reality display device and at least comprises one set of near-to-eye display system, light beams emitted by the coupling-out part of the waveguide in the near-to-eye display system can enter human eyes, and external environment light penetrates through the waveguide to enter the human eyes.
The embodiment of the present application further provides another near-eye display device, where the near-eye display device is used as a virtual reality display device, and the near-eye display device includes two sets of the near-eye display systems described above, where a light beam emitted from an outcoupling member of a waveguide in a first set of the near-eye display systems enters a left eye, and a light beam emitted from an outcoupling member of a waveguide in a second set of the near-eye display systems enters a right eye.
By adopting the technical scheme in the embodiment of the application, the following technical effects can be realized:
the light beam emitted by the near-eye display system acts on human eyes or a display medium (such as a lens) so that a user can watch a corresponding image, thereby realizing near-eye display. Meanwhile, under the monitoring of the eye movement tracking module 40, the near-eye display system can adjust the emergent light beams in real time according to the rotation direction of the eyes of the user, and the adjusted light beams can only display images in the visual field of the eyes when displaying images, so that the interference of the displayed content on the user can be reduced while the system consumption is saved.
In addition, in an actual display scene, if the relative position of the image content to be displayed is fixed and the range exceeds the visual field of human eyes, the near-to-eye display system can only display the image in the visual field of the human eyes according to the rotation of the human eyes; if the image content to be displayed does not exceed the visual field of human eyes and the relative position is not fixed, the display position of the image content can be adjusted by the near-eye display system according to the rotation of the human eyes, and the image content is displayed along with the visual field of the human eyes.
In addition, the image source module can only output images in a focusing area of human eyes, so that the working frequency and the swing amplitude of the scanner can be reduced, and the power consumption and the element loss of the system can be reduced to a certain degree.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a schematic diagram of a laser scanning display principle provided by an embodiment of the present application;
FIG. 2 is a simplified diagram of a human eye viewing field acting on an image according to an embodiment of the present disclosure;
fig. 3a is a schematic structural diagram of a near-eye display system according to an embodiment of the present disclosure;
fig. 3b is a schematic diagram illustrating a connection relationship between the image source module 20 and the waveguide 30 in the near-eye display system according to the embodiment of the present application;
fig. 4a is a schematic structural diagram of a first image source module according to an embodiment of the present disclosure;
fig. 4b is a schematic structural diagram of a second image source module according to an embodiment of the present application;
fig. 4c is a schematic structural diagram of a third image source module according to an embodiment of the present application;
fig. 4d is a schematic structural diagram of a fourth image source module according to an embodiment of the present application;
fig. 4e is a schematic structural diagram of a fifth image source module according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a waveguide 30 provided in an embodiment of the present application;
fig. 6 is a flowchart of a control method based on a near-eye display system according to an embodiment of the present application;
fig. 7 is a schematic optical path diagram under a control method provided in an embodiment of the present application;
fig. 8 is a schematic diagram illustrating adjustment of scan output of an image source module according to a control method provided in an embodiment of the present application;
FIG. 9 is a schematic view of a human eye's field of view when actually viewed;
fig. 10a is a schematic diagram of a near-eye display device provided by an embodiment of the present application;
fig. 10b is a schematic diagram of another near-eye display device provided in an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
For ease of understanding, the basic principle of laser scanning imaging in existing near-eye display devices is first explained. As shown in fig. 1, which is a schematic diagram, fig. 1 includes: a laser light source 101, a scanner 102 and a human eye retina 103.
When an image is displayed, the laser emitted by the laser light source is output by the scanner and acts on a certain pixel position, thereby scanning that pixel position; under the control of the scanner, the laser beam then moves on to scan the next pixel position. In other words, the laser beam lights up each pixel position in a certain order with the corresponding color, gray scale or brightness. Within one frame of time the laser beam traverses every pixel position at a high enough speed, and because of the "visual persistence" characteristic of human vision, the human eye cannot detect the movement of the laser beam between pixel positions but instead sees a complete image (in fig. 1, the user sees an image whose content is "Hi"). Of course, the content shown in fig. 1 is only a simple illustration of the basic principle of laser scanning imaging in near-eye display, given to facilitate understanding of the technical solutions in the embodiments of the present application, and should not be taken as a limitation of the present application.
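To make the scanning timing concrete, the following minimal sketch (for illustration only; the frame size, refresh rate and the set_beam driver interface are assumptions, not part of the disclosure) traverses every pixel position once within a frame period, which is what persistence of vision fuses into a complete image:

```python
# Minimal sketch of the laser-scanning display timing. Illustrative assumptions:
# a 640x480 frame refreshed at 60 Hz, and a hypothetical set_beam() driver call.
FRAME_W, FRAME_H, REFRESH_HZ = 640, 480, 60
DWELL_S = 1.0 / (REFRESH_HZ * FRAME_W * FRAME_H)  # time spent at each pixel position


def scan_one_frame(frame, set_beam):
    """Traverse every pixel position once within one frame period.

    frame:    2D list of (r, g, b) values carrying the image information.
    set_beam: callable(x, y, rgb, dwell) standing in for driving the scanner and
              laser so the beam lights pixel (x, y) with rgb for `dwell` seconds.
    """
    for y in range(FRAME_H):        # row by row ...
        for x in range(FRAME_W):    # ... pixel position by pixel position
            set_beam(x, y, frame[y][x], DWELL_S)
    # The whole traversal fits inside ~1/60 s, so persistence of vision makes the
    # eye perceive a complete image instead of a moving laser spot.
```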
However, in a near-eye display scene the field of view of the human eye is limited. Specifically, referring to fig. 2, assume that the image in the near-eye display is as shown in fig. 2; in fact, at such a short distance the field of view of the human eye cannot cover the entire image. Taking the areas a1 and a2 as examples, if the human eye focuses on the area a1, its field of view can cover that area, meaning that the image content in the area a1 can be viewed, but due to the limited field of view the image content in other areas is difficult or even impossible to see; if the user wishes to view the image content in the area a2, the focus position of the human eye needs to be shifted from the area a1 to the area a2. Of course, the above description is only intended to illustrate the way the human eye views an image in a near-eye display scene so as to facilitate understanding of the subsequent aspects of the present application, and should not be construed as limiting the present application.
Based on the foregoing, embodiments of the present application provide a near-eye display system, as shown in fig. 3 a. The near-eye display system includes: image source module 20, waveguide 30, and eye-tracking module 40. Wherein,
image source module 20 generates a laser beam containing image information for scanning output to waveguide 30 and adjusts the output laser beam according to the adjustment signal sent by eye tracking module 40. For convenience of description, the laser beam may be simply referred to as a beam in the embodiments of the present application, and the laser beam and the beam represent the same concept in the embodiments of the present application unless otherwise specified.
The waveguide 30 expands the laser beam input by the image source module 20 in the first direction and the second direction, the expanded beam is output from the waveguide 30, and the waveguide 30 adjusts the output beam according to the adjustment signal sent by the eye tracking module 40. As shown in fig. 3, the first direction and the second direction respectively represent the propagation directions of the light beam in the waveguide 30, and when the user actually uses the display device corresponding to the near-eye display system, the first direction can be regarded as a vertical direction in the visual field plane of the human eye, and the second direction can be regarded as a horizontal direction in the visual field plane of the human eye, so the first direction and the second direction can be also referred to as: a vertical direction and a horizontal direction. It is to be understood that the terms "first" and "second" are used herein for distinguishing and should not be construed as limiting in sequence.
The eye tracking module 40 monitors the rotation angle of the user's eyes in real time to determine the corresponding area of the user's field of view on the image, and generates corresponding adjustment signals to be sent to the image source module 20 and/or the waveguide 30 to adjust the outgoing light beam.
The light beam emitted by the near-eye display system acts on human eyes or a display medium (such as a lens) so that a user can watch a corresponding image, thereby realizing near-eye display. Meanwhile, under the monitoring of the eye movement tracking module 40, the emergent light beam can be adjusted by the near-eye display system according to the rotation direction of the eyes of the user, and the adjusted light beam can only display the image in the visual field of the eyes when the image is displayed, so that the interference of the displayed content on the user can be reduced while the system consumption is saved.
It should be noted that, in an actual display scene, if the relative position of the image content to be displayed is fixed and the range of the image content exceeds the field of vision of human eyes (for example, the image shown in fig. 2), the near-eye display system may display only the image (partial image) in the field of vision of human eyes according to the rotation of human eyes; if the image content to be displayed does not exceed the visual field of human eyes and the relative position is not fixed, the display position of the image content can be adjusted by the near-eye display system according to the rotation of the human eyes, and the image content is displayed along with the visual field of the human eyes.
In some embodiments, image source module 20 and waveguide 30 may be a unitary structure, with image source module 20 being secured in a designated position on waveguide 30.
In other embodiments of the present application, image source module 20 and waveguide 30 are detachable. Specifically, referring to fig. 3b, a positioning portion 2001 is disposed on a side of the image source module 20 facing the waveguide 30, and the positioning portion 2001 is engaged with a positioning engaging portion 3001 on the waveguide 30, so that the image source module 20 is fixedly mounted on a designated position of the waveguide 30. The positioning portion 2001 may be a snap, a locking member, and the like, and correspondingly, the positioning engagement portion 3001 on the waveguide 30 may be a snap groove, a locking groove, and the like. Of course, the specific structures of the positioning portion 2001 and the positioning matching portion 3001 shown in fig. 3b can be interchanged, and the connection manner after the interchange is also the scope covered by the embodiments of the present application.
The eye tracking module 40 is typically a separate module, and in some embodiments, it may be integrated with the waveguide 30, and of course, will be specifically configured according to the needs of the application, and should not be construed as limiting the present application.
For further understanding of the present application, the modules of the near-eye display system described above in the present application are described in detail below with different embodiments.
Fig. 4a is a schematic structural diagram of an image source module 20 according to an embodiment of the present disclosure. The image source module 20 may include: a rotating pan-tilt head 201, a first mechanical arm 202, a second mechanical arm 203, an image light source 204, and a scanner 205 enclosed in a housing 206. Wherein,
one end of the first mechanical arm 202 is connected to the rotating part of the rotating holder 201, and the rotating part of the rotating holder 201 can drive the first mechanical arm 202 to rotate. The other end of the first mechanical arm 202 is rotatably connected with one end of the second mechanical arm 203, the free end of the second mechanical arm 203 is directly connected with the scanner 205, and the scanner 205 performs scanning output based on the light beam containing the image information output by the image light source 204.
In an embodiment of the present application, the rotating platform 201 is fixed on the inner wall of one side of the housing 206 by a fixing part such as a base (generally, the side where the rotating platform 201 is fixed is opposite to the side where the laser light is emitted), however, in practical application, the position where the rotating platform 201 is fixed is not limited to the position shown in fig. 4a, and may be located on the inner wall of the other side of the housing 206 as shown in fig. 4 b. It is easy to understand that, in order not to affect the exit of the light beam from the image source module 20, the rotating platform 201 is not usually fixed on the side where the light beam exits.
The rotating part of the rotating platform 201 can adopt a universal rotating shaft, and the first mechanical arm 202 and the second mechanical arm 203 can also be connected through a universal rotating shaft. Therefore, driven by the rotating part of the rotating platform 201, the first mechanical arm 202 and the second mechanical arm 203, the scanner 205 can rotate flexibly at any angle and to any position inside the housing 206 to adjust the exit angle and/or the exit position of the scanned and output light beam. It is easy to understand that the lengths of the first mechanical arm 202 and the second mechanical arm 203 should allow the scanner 205 to rotate within the housing 206 without scratching or impacting the inner wall of the housing 206; the lengths of the first mechanical arm 202 and the second mechanical arm 203 are not specifically limited herein.
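As a purely illustrative aid (the patent gives no kinematic formulas; the planar model, link lengths and angle convention below are assumptions), the following sketch shows how a base rotation and two arm joints jointly determine where the scanner sits and which way the scanned-out beam points, i.e., how the exit position and exit angle can be adjusted:

```python
import math

# Planar, purely illustrative model of the pan-tilt base rotation plus the two
# mechanical arms. Link lengths and the joint-angle convention are assumptions.
def scanner_pose(base_angle, joint1, joint2, l1=0.010, l2=0.008):
    """Return (x, y, heading) of the scanner for the given joint angles in radians.

    base_angle: rotation of the rotating part of the pan-tilt head 201.
    joint1:     rotation of the second arm 203 relative to the first arm 202.
    joint2:     rotation of the scanner 205 relative to the second arm 203.
    l1, l2:     arm lengths in metres (placeholder values).
    """
    a1 = base_angle
    x1, y1 = l1 * math.cos(a1), l1 * math.sin(a1)            # end of first arm
    a2 = a1 + joint1
    x2, y2 = x1 + l2 * math.cos(a2), y1 + l2 * math.sin(a2)  # free end / scanner
    heading = a2 + joint2                                     # beam exit direction
    return x2, y2, heading

# Example: a small base rotation shifts both the exit position and the exit angle.
print(scanner_pose(math.radians(5), math.radians(-10), math.radians(3)))
```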
In one possible embodiment, the driving manner of the rotating platform 201, the first mechanical arm 202 and the second mechanical arm 203 may be electrically driven.
The image light source 204 may transmit the laser beam output by the image light source 204 to the scanner 205 through an optical fiber, and in an embodiment, the image light source 204 may further include an image signal control unit and a corresponding laser (both not shown in the figure), and the image signal control unit receives an external image signal to perform processing such as decoding, and may control the laser beam output by the laser according to the processing result. The type of laser may be specifically an atomic laser, an ion laser, a semiconductor laser, or the like. Meanwhile, in order to ensure the display effect, any one or a combination of red (R), green (G), and blue (B) monochromatic lasers are generally used, or a white laser (it should be understood that the white laser can be separated into the foregoing RGB monochromatic lasers by corresponding optical devices), and of course, the laser of the corresponding color and the corresponding type can be specifically selected according to the needs of the practical application. The image light source 204 may further include optical elements such as a beam combiner, which may be configured according to the actual application requirement, and will not be described herein again.
The scanner 205 includes a scan driver 2051 and a fiber optic cantilever 2052 formed of an optical fiber extending at an exit end of the scan driver 2051. When the scan driver 2051 operates, it receives the adjustment signal sent by the eye tracking module 40, and adjusts the scanning frequency, scanning amplitude, etc., so that the fiber suspension arm 2052 swings at the corresponding frequency and swing amplitude, thereby implementing fiber scanning. Generally, the scanner 205 is embodied as a two-dimensional scanner. In some embodiments of the present application, the scan driver 2051 may be implemented by piezoelectric ceramics, that is, the piezoelectric ceramics drive the fiber cantilever 2052 to swing with a corresponding frequency and a corresponding amplitude under a corresponding voltage, so as to implement fiber scanning. Of course, the detailed fiber scanning is not described in detail herein.
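The following sketch is an illustrative guess at what "adjusting the scanning frequency and scanning amplitude" of a two-axis piezoelectric drive could look like; the waveform shapes, frequencies and amplitudes are assumptions and are not taken from the disclosure:

```python
import math

# Illustrative two-axis drive signal for the piezoelectric scan driver 2051.
# The raster-style waveform, frequencies and amplitudes are assumptions; the
# disclosure only states that scanning frequency and amplitude are adjustable.
def drive_voltages(t, fast_hz=15000.0, slow_hz=60.0, amp_x=1.0, amp_y=1.0):
    """Return (Vx, Vy) at time t in seconds for a simple raster-like scan."""
    vx = amp_x * math.sin(2 * math.pi * fast_hz * t)   # fast horizontal swing
    vy = amp_y * (2 * ((t * slow_hz) % 1.0) - 1.0)     # slow vertical ramp
    return vx, vy

# Lowering amp_x / amp_y shrinks the swing of the fiber cantilever 2052 (smaller
# scan area); lowering fast_hz / slow_hz lowers the working frequency.
print(drive_voltages(0.001))
```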
The housing 206 may take the form of a solid structure such as a cylinder, cube, etc., with a space formed therein to enclose some or all of the other components in the image source module 20. To ensure that the light beam exits, the side of the housing 206 is not closed and the light beam exits the open side. It is easy to understand that the housing 206 can play a role of fixing and supporting the components such as the rotating pan-tilt 201, the image light source 204, etc. on one hand, and can play a role of protecting the components therein on the other hand. The specific packaging process and manufacturing process of the housing 206 will not be described in detail herein.
In addition to the structure of the image source module 20 shown in fig. 4a or 4b, in different embodiments of the present application, an image source module 20 with different structures is further provided, specifically:
on the basis of the structure of the image source module 20 shown in fig. 4a to 4b, a collimating element 207 may be further disposed, referring to fig. 4c, the collimating element 207 may be specifically disposed on the light-emitting path of the scanner and tightly embedded with the light-emitting port on the side of the housing 206 outputting the laser beam, the collimating element 207 may collimate the laser beam output by the scanner 205, and the collimated laser beam is input into the waveguide 30 in the form of parallel light. It will be readily appreciated that the size and shape of the collimating element 207 should match the light exit of the housing 206. In general, the collimating element 207 may be a collimating mirror/collimating lens set, which will be determined according to the requirements of the practical application, and is not to be construed as limiting the present application.
In some embodiments of the present application, some components of the image source module 20 are enclosed within the housing 206, and another component can be disposed outside the housing 206. Referring to fig. 4d, the rotating platform 201, the first mechanical arm 202, the second mechanical arm 203 and the image light source 204 are disposed outside the housing 206, and the scanner 205 and the collimating element 207 are fixedly disposed inside the housing 206. The scanner 205 is fixed on the inner wall of the opposite side of the light-emitting direction of the housing 206, the rotating platform 201 is fixedly connected with an external structure (fig. 4d does not show the external structure), and the free end of the second mechanical arm 203 is fixedly connected to the outer wall of the opposite side of the light-emitting direction of the housing 206 (i.e. can be regarded as being indirectly connected with the scanner 205) so as to drive the housing 206 and the scanner 205 and the collimating element 207 therein to move together. The position of the scanner 205 with respect to the collimating element 207 is not changed, which simplifies the design of the collimating element 207 and stabilizes the image output. Of course, in practical applications, the free end of the second mechanical arm 203 may also be fixedly connected to the side wall of the housing 206, and also may move the housing 206 and the scanner 205 and the collimating element 207 therein together.
Referring to fig. 4e, in an embodiment of the present application, a main control unit 208 may be disposed in the image source module 20, and the main control unit 208 may be used as a central controller in the image source module 20 and connected to each controlled element in the image source module 20 (fig. 4e is simply illustrated by using a connection line, which should not be construed as limiting the present application). The main control unit 208 can receive external image signals and process (e.g., perform decoding process) the image signals so as to control the laser beam emitted by the image light source 204; and also receives the adjustment control signal of the eye tracking module 40 to adjust the rotation of the rotational platform 201, the first robot 202 and the second robot 203; the scanning frequency and swing of the scanner 205 may also be adjusted. Of course, the position of the main control unit 208 shown in fig. 4e is only one possible way, and in practical applications, the main control unit 208 can also be disposed inside the housing 206, and it is easy to understand that the main control unit 208 is disposed and fixed in a way that ensures stable operation, and the embodiment of the present invention also falls within the scope covered by the present application.
It should be noted that, in the different structures of the image source module 20, the image source 204 may be disposed inside the housing 206 or outside the housing 206, and the disposed position does not affect the adjustment of the exit angle and/or the exit position of the light beam emitted from the image source module 20, so the different disposed positions of the image source 204 are not to be construed as limitations of the present application.
As a possible implementation manner, the rotating platform 201, the first mechanical arm 202 and the second mechanical arm 203 may adopt a hollow structure of a communicating type, wherein a conducting wire, an optical fiber, etc. may be disposed to be connected to the scanner 205, that is, the optical fiber at the emitting end of the image light source 204 and the signal line of the eye tracking module 40 may penetrate through the rotating platform 201, the first mechanical arm 202 and the second mechanical arm 203 and be connected to the scanner 205, so as to provide the scanner 205 with, for example, a scanning driving signal and/or an image light beam, etc., which is, of course, only one possible implementation manner of the technical solution of the present application and should not be construed as a limitation of the present application.
The above is a description of image source module 20 in the embodiments of the present application, and waveguide 30 is now described.
Referring to fig. 5, a schematic diagram of a waveguide 30 in the embodiment of the present application is shown. The waveguide 30 includes a coupling-in member 301, an expanding member 302, and a coupling-out member 303 provided on the waveguide 30.
The waveguide 30 may adopt a spatial light modulator. The coupling-in component 301 is located on the surface of the waveguide 30 and opposite to the light outlet of the image source module 20, so that the light beam output by the image source module 20 can be coupled into the waveguide 30; under the action of the coupling-in component 301, the light beam entering the waveguide 30 is further input to the expansion component 302.
The coupling-in component 301 is not limited to the position shown in fig. 5 and may also be located on a side face of the waveguide 30, which can be configured according to the requirements of the practical application.
The expanding member 302 and the outcoupling member 303 expand the light beam entering the waveguide 30 in the first direction and the second direction, respectively. Specifically, the extension member 302 is disposed on the light exit path of the incoupling member 301 and extends in the direction of the light exit path, and the light exit direction of the extension member 302 is perpendicular to the extension direction of the extension member 302. The light incident side of the coupling-out member 303 is parallel to and opposite to the extending direction of the extending member 302, the coupling-out member 303 extends in the light emitting direction of the extending member 302, and the coupling-out member 303 emits light toward the human eye side.
The coupling-in member 301 may specifically employ a grating or a light-transmitting thin layer. The expanding member 302 may specifically employ an arrayed reflective waveguide, a grating, or the like. The coupling-out component 303 may specifically employ a grating with adjustable optical parameters (e.g., an electrically controlled liquid crystal grating), wherein the optical parameters may include, but are not limited to: reflectivity, refractive index, exit angle, exit position, and the like. Under the action of a control signal, the coupling-out component 303 can further change the exit position and/or exit angle of the light beam emitted from the coupling-out component 303. The control signal may be the adjustment signal sent by the eye tracking module 40, or a control signal generated by a corresponding control unit according to the adjustment signal of the eye tracking module 40; the specific form of the control signal may be determined according to the needs of the actual application and is not specifically limited herein.
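As a hypothetical stand-in (the field names, units and apply() interface below are invented for illustration and are not part of the disclosure), such an electrically tunable out-coupling element can be modeled as a small state object whose optical parameters are updated by an adjustment or control signal:

```python
from dataclasses import dataclass

# Hypothetical stand-in for an electrically tunable out-coupling grating; the
# field names, units and apply() interface are invented for illustration.
@dataclass
class OutCoupler:
    exit_angle_deg: float = 0.0   # direction in which light leaves the waveguide
    exit_offset_mm: float = 0.0   # position along the waveguide where light exits

    def apply(self, signal):
        """Update the tunable optical parameters from an adjustment signal (a dict)."""
        self.exit_angle_deg = signal.get("exit_angle_deg", self.exit_angle_deg)
        self.exit_offset_mm = signal.get("exit_offset_mm", self.exit_offset_mm)

# Example: steer the out-coupled beam so the image lands in the current focusing
# area without changing what the image source module scans out.
coupler = OutCoupler()
coupler.apply({"exit_angle_deg": -3.5})
print(coupler)
```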
It should be noted that, in practical applications, the size and shape of each component in the waveguide 30 are not limited to the state shown in fig. 5, and are only an embodiment given for explaining the waveguide 30, and therefore should not be construed as limiting the present application.
For the eye tracking module 40, the conventional tracking technology and corresponding algorithm can be adopted, and will not be described in detail herein.
On the basis of the above, an embodiment of the present application further provides a control method based on the near-eye display system, as shown in fig. 6, the method specifically includes the following steps:
step S601: the eye tracking module 40 monitors the rotation angle of the human eye and determines the focusing area of the human eye according to the rotation angle.
In combination with the foregoing, it is readily understood that in a near-eye display scene the field of view of the human eye is the region of the image that the eye can effectively see, i.e., the focusing area; an image beyond this region may appear blurred to the human eye or even be invisible. On this basis, a corresponding adjusting signal can be generated for adjusting the generated light beam. It should be noted that the process in which the eye tracking module 40 monitors the focusing area of the human eye and generates the adjusting signal can be implemented with an existing monitoring algorithm or model, and will not be described in detail herein.
Step S603: an adjustment signal is generated based on the determined focal region and sent to image source module 20 and/or waveguide 30 to adjust the light beam generated by the near-eye display system to image on the focal region.
In the embodiment of the present application, under the action of the adjustment signal, the image source module 20 and/or the waveguide 30 may adjust the light beam, so that the near-eye display system only displays the image in the focus area of the field of view of the human eye.
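The two steps S601 and S603 can be summarized in the following illustrative sketch; the gaze-to-region mapping, the signal format and the adjust() interfaces are placeholders assumed for illustration, since the patent leaves the monitoring algorithm and signaling details to conventional implementations:

```python
# Illustrative control loop for steps S601/S603. The gaze-angle-to-region mapping,
# the signal format and the adjust() interfaces are hypothetical placeholders.
def focus_region_from_gaze(yaw_deg, pitch_deg, fov_deg=20.0):
    """Map monitored eye rotation angles to a normalized rectangular focusing area,
    assuming a fixed effective field of view (an illustrative assumption)."""
    cx, cy = 0.5 + yaw_deg / 90.0, 0.5 + pitch_deg / 90.0
    half = fov_deg / 90.0 / 2.0
    return (cx - half, cy - half, cx + half, cy + half)


def control_step(eye_tracker, image_source, waveguide, mode="both"):
    yaw, pitch = eye_tracker.rotation_angles()    # S601: monitor eye rotation
    region = focus_region_from_gaze(yaw, pitch)   # S601: determine focusing area
    signal = {"region": region}                   # S603: build adjustment signal
    if mode in ("source", "both"):
        image_source.adjust(signal)               # adjust the scanned-out beam
    if mode in ("waveguide", "both"):
        waveguide.adjust(signal)                  # adjust the out-coupled beam


# Example: a 10-degree rightward eye rotation shifts the focusing area rightward.
print(focus_region_from_gaze(10.0, 0.0))
```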
Based on the above method, different adjustment control modes can be specifically adopted in practical application, which is described in detail below.
The first method is as follows:
in this manner, eye tracking module 40 only sends the adjustment signal to waveguide 30 and not to image source module 20, that is, image source module 20 still scans out a complete image without changing the exit angle and the exit position of the scanned out light beam. The waveguide 30 receives the adjusting signal sent by the eye tracking module 40, so as to adjust the exit angle of the coupling-out component 303 in the waveguide 30, that is, the waveguide 30 controls the coupling-out component 303 to change the exit angle of the exiting light beam according to the adjusting signal, so that the image formed by the exiting light beam in the human eye is located in the focusing area.
Assume that the eye tracking module 40 determines that the focusing area of the user's eye is the area a1 and accordingly generates an adjusting signal that acts on the coupling-out member 303 of the waveguide 30. Under the action of the adjusting signal, the exit angle of the light beam output by the coupling-out member 303 is changed, as shown in fig. 7, so that after the light beam acts on the human eye, the image formed in the human eye is located in the area a1.
The second method comprises the following steps:
In this manner, the eye tracking module 40 only sends an adjustment signal to the image source module 20, but not to the waveguide 30. Specifically, under the action of the adjustment signal, the image light source 204 in the image source module 20 only outputs the light beam for the image in the focusing area of the human eye, and during scanning output the area and/or angle of the light beam scanned and output by the scanner 205 is adjusted (that is, according to the adjustment signal, the image source module 20 only scans and outputs the light beam for the image in the focusing area of the human eye). Because the angle and/or position at which the adjusted, scanned-out light beam enters the waveguide 30 are different, when the light beam is transmitted in the waveguide 30 and output from the waveguide 30 it likewise acts only on the focusing area of the human eye.
Referring to fig. 2 and 8, assume that the human eye focuses on the area a1 of the image. Under the action of the adjustment signal output by the eye tracking module 40, the light beam output by the image light source 204 in the image source module 20 corresponds only to the image of the area a1; at the same time, the rotating pan-tilt head 201, the first mechanical arm 202 and the second mechanical arm 203 cooperate to adjust the scanning area and/or scanning angle of the light beam scanned and output by the scanner 205, and after being transmitted in the waveguide 30 and output from the waveguide 30, the adjusted light beam also acts on the area a1; that is, the image viewed by the user is the image in the area a1. In this process, since the image source module 20 only outputs the image in the focusing area of the human eye, the working frequency of the scanner 205 and the swing of the fiber cantilever 2052 are both reduced, which can reduce the system power consumption and device loss.
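A minimal sketch of this second manner, assuming normalized region coordinates and a proportional scaling rule (both assumptions made for illustration): only the sub-image inside the focusing area is output, and the scanner's swing is reduced accordingly:

```python
# Sketch of this manner: only the image content inside the focusing area is output,
# and the scanner swing is reduced to cover just that area. Region coordinates are
# normalized to [0, 1]; the proportional scaling rule is an assumption.
def crop_to_focus(frame, region):
    """Return the sub-image that the image light source actually has to output."""
    x0, y0, x1, y1 = region
    h, w = len(frame), len(frame[0])
    return [row[int(x0 * w):int(x1 * w)] for row in frame[int(y0 * h):int(y1 * h)]]


def reduced_scan_amplitude(full_amp_x, full_amp_y, region):
    """Shrink the scanner's two-axis swing in proportion to the focusing-area size."""
    x0, y0, x1, y1 = region
    return full_amp_x * (x1 - x0), full_amp_y * (y1 - y0)


# Example: a focusing area covering the central quarter of the image.
frame = [[(0, 0, 0)] * 100 for _ in range(100)]
sub = crop_to_focus(frame, (0.25, 0.25, 0.75, 0.75))
print(len(sub), len(sub[0]), reduced_scan_amplitude(1.0, 1.0, (0.25, 0.25, 0.75, 0.75)))
```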
Mode III
In this manner, since the scanner 205 in some embodiments of the present application is a two-dimensional scanner, applying a DC component to the two-dimensional scanner during operation enables a scanning translation (i.e., the scanning area is translated). Specifically, the eye tracking module 40 sends the generated adjustment signal to the scanner 205 in the image source module 20; considering that the scan driver 2051 in the scanner 205 may be implemented with piezoelectric ceramics, the adjustment signal may be a DC voltage component, so that under the action of the DC voltage component the scan driver 2051 controls the fiber cantilever 2052 to translate in the corresponding direction. In general, a two-dimensional scanner scans in two directions, X and Y; to achieve a translation in the X direction, a corresponding DC voltage component can be applied in the X direction, and similarly a corresponding DC voltage component can be applied in the Y direction to achieve a translation in the Y direction.
Referring again to fig. 2, assume that the scan driver 2051 is subjected to a DC component of the Y-direction voltage (i.e., the adjustment signal), which causes the fiber cantilever 2052 to translate in the Y direction. Since the scanned-out light beam is translated, its corresponding field of view is also translated, so that when the image is displayed the corresponding field of view is translated from the area a1 to the area a2. It should be noted that in this manner only the scanning beam of the scanner 205 is translated in the two-dimensional directions to realize image display in the focusing area of the human eye; the rotating pan-tilt head 201, the first mechanical arm 202 and the second mechanical arm 203 in the image source module 20, as well as the waveguide 30, are not affected by the adjustment signal (i.e., they do not rotate).
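A minimal sketch of this DC-offset idea, assuming a two-axis drive function like the one sketched earlier and arbitrary offset values (all illustrative assumptions):

```python
# Sketch of the DC-offset idea: adding a DC component to the piezoelectric drive
# translates the whole scan region (e.g. from area a1 toward a2) while the pan-tilt
# head, arms and waveguide stay still. The wrapper can be applied to any two-axis
# drive function such as the drive_voltages() sketched earlier; offsets are assumed.
def translate_scan(drive_fn, dc_x=0.0, dc_y=0.0):
    """Return a drive function whose scan centre is shifted by (dc_x, dc_y)."""
    def shifted(t):
        vx, vy = drive_fn(t)           # underlying AC scan waveform
        return vx + dc_x, vy + dc_y    # DC term shifts the fiber's rest position
    return shifted

# Example: translate only along Y (vertical) to move the scanned field of view.
shifted_drive = translate_scan(lambda t: (0.0, 0.0), dc_y=0.4)
print(shifted_drive(0.0))  # -> (0.0, 0.4)
```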
Mode IV
It is contemplated that in some practical situations, although applying a DC component to the scanner 205 can translate the scan, the range of translation is limited, and when the rotation angle of the human eye is too large it may be difficult to image within the field of view of the human eye. Therefore, in this manner, in addition to applying the DC component to the scanner 205, the rotating pan-tilt head 201, the first mechanical arm 202 and the second mechanical arm 203 in the image source module 20 are also controlled. Specifically, the eye tracking module 40 generates an adjustment signal based on the determined focusing area and sends it to the image source module 20; under the action of the adjustment signal, the image light source 204 in the image source module 20 outputs a light beam corresponding to the image in the focusing area, the scanner in the image source module 20 performs a scanning translation, and the rotating pan-tilt head 201, the first mechanical arm 202 and the second mechanical arm 203 in the image source module 20 adjust the angle and/or the area scanned and output by the scanner 205.
Mode five
In this manner, eye tracking module 40 transmits the generated adjustment signals to image source module 20 and waveguide 30, respectively. That is, image source module 20 and waveguide 30 will both respond according to the adjustment signal. Specifically, the image source module 20 only scans and outputs the image in the focusing area of the human eye according to the adjustment signal, and during scanning and outputting, the scanning area and the scanning angle of the scanner 205 are adjusted, and the adjusted light beam scanned and output is input to the waveguide 30. The waveguide 30 receives the adjustment signal sent by the eye tracking module 40, so as to adjust the exit angle of the light beam output by the coupling-out component 303 in the waveguide 30.
It is understood that the above specific control method can be used to adjust the imaging area. In practical applications, any one of the above manners may be adopted, or the above manners may be adopted in combination, and the specific manner may be determined according to the needs of practical applications, and is not specifically limited herein.
In addition, in an actual display scene, a user can adjust the size of the displayed area to adapt to different requirements.
It should be noted that, in the embodiment of the present application, the foregoing content of "dividing" the image is only a simple case adopted for convenience of describing the technical solution of the present application, and it should be understood that, in practical application, the solution in the embodiment of the present application is applicable to more complicated human eye rotation scenes. Specifically, the change in the field of view caused by the rotation of the human eye is not an area strictly divided by a broken line as shown in the foregoing, but as shown in fig. 9, the field of view of the human eye is generally cone-shaped, acting on the image plane to form a circular area, and the rotation of the human eye is arbitrary, and the corresponding field of view may correspond to an arbitrary area on the image. Of course, no limitation to the present application is intended thereby.
In actual image display, image content in a region other than the field of vision of human eyes may not be completely displayed, and display in a low-brightness state, a low-contrast state, a low-resolution state, or the like may be performed. Of course, the setting will be specifically performed according to the needs of practical applications, and therefore, the detailed description is not repeated herein.
In some practical scenes an object with a large longitudinal or transverse size may appear, and displaying only a local image may hinder the user's recognition of the object. In this case, a pre-image-processing technique can be used to identify the characteristics of the displayed virtual object and to adjust the image size of the display area (in the embodiment of the present application, adjustment of the image size can be achieved by adjusting the scan driving voltage, which is not described in detail herein), with the content displayed dynamically following the focusing area of the human eye to optimize the display effect; of course, the image light source also needs to be processed correspondingly in cooperation with a matching modulation algorithm.
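A minimal sketch of resizing the displayed area by scaling the scan drive amplitudes, assuming a simple linear relation between drive amplitude and image size (an assumption made for illustration):

```python
# Sketch of resizing the displayed area by scaling the scan drive amplitudes:
# a factor > 1 enlarges the scanned area, < 1 shrinks it. The linear relation
# between drive amplitude and image size is an assumption for illustration.
def scaled_amplitudes(amp_x, amp_y, scale):
    """Scale the two-axis drive amplitudes to resize the displayed area."""
    return amp_x * scale, amp_y * scale

# Example: shrink the display area to 70% so a tall or wide virtual object still
# fits the eye's focusing area (the image light source must be modulated to match).
print(scaled_amplitudes(1.0, 1.0, 0.7))
```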
In practical applications, the near-eye display system provided by the embodiment of the present application can be applied to a near-eye display device such as an AR device or a VR device.
Specifically, the near-eye display device in the embodiment of the present application includes at least one set of the near-eye display system described in the foregoing and can be controlled by using at least one control manner of the foregoing.
Referring to fig. 10a, the near-eye display device is mainly used as an augmented reality (AR) display device. In this case the near-eye display device may include only one set of the near-eye display system S1; the light beam emitted from the coupling-out component of the waveguide in the near-eye display system S1 can enter the human eye, and at the same time external ambient light can also enter the human eye through the waveguide, so that the user views the corresponding augmented reality image. Fig. 10a shows one possible form of the near-eye display device, namely one using integrally formed lenses (i.e., the left and right lenses are not separable). Of course, in practical applications the AR display device may also adopt a separate-lens structure (refer to fig. 10b); in this case the AR display device may still include one near-eye display system S1, whose emitted light beam acts on the left eye or the right eye. This can be determined according to the needs of the practical application and is not described redundantly herein.
Referring to fig. 10b, the near-eye display apparatus is mainly used as a virtual reality VR display apparatus, in this case, the near-eye display apparatus includes two sets of near-eye display systems, wherein the light beams emitted from the coupling-out part of the waveguide in the first set of near-eye display system S3 enter the left eye, and the light beams emitted from the coupling-out part of the waveguide in the second set of near-eye display system S5 enter the right eye. Of course, one possible form of a near-eye display device is shown in fig. 10b, i.e. using two separate lenses. Of course, in practical applications, the VR display device may also adopt an integrated lens structure (refer to fig. 10a), which may be determined according to the needs of practical applications, and will not be described in detail herein.
The embodiments in the present application are described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. Especially, as for the device, apparatus and medium type embodiments, since they are basically similar to the method embodiments, the description is simple, and the related points may refer to part of the description of the method embodiments, which is not repeated here.
Thus, particular embodiments of the present subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
The expressions "first", "second", "said first" or "said second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the respective components. The above description is only configured for the purpose of distinguishing elements from other elements. For example, the first user equipment and the second user equipment represent different user equipment, although both are user equipment. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
When an element (e.g., a first element) is referred to as being "operably or communicatively coupled" or "connected" (operably or communicatively) to "another element (e.g., a second element) or" connected "to another element (e.g., a second element), it is understood that the element is directly connected to the other element or the element is indirectly connected to the other element via yet another element (e.g., a third element). In contrast, it is understood that when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (a second element), no element (e.g., a third element) is interposed therebetween.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. An image source module, comprising: a rotating holder, a first mechanical arm, a second mechanical arm, an image light source and a scanner which are wholly or partially encapsulated in a housing,
one end of the first mechanical arm is connected with the rotating part of the rotating holder, the other end of the first mechanical arm is rotatably connected with one end of the second mechanical arm, the free end of the second mechanical arm is directly or indirectly connected with the scanner, and the scanner performs scanning output based on the light beam which is output by the image light source and contains image information;
under the drive of the rotating part of the rotating holder, the first mechanical arm and the second mechanical arm, the emergent angle and/or the emergent position of the light beam output by scanning can be adjusted.
2. The image source module as claimed in claim 1, further comprising a collimating element disposed on the exit light path of the scanner and fitted tightly to the light exit port of the housing, so as to collimate the light beam scanned out by the scanner.
3. The image source module as claimed in claim 2, wherein the rotating pan-tilt head is fixed to an inner wall of the non-light-emitting side of the housing, and the free end of the second mechanical arm is directly connected to the scanner, so as to adjust the angle and/or position of the scanner's scanning output within the housing.
4. The image source module as claimed in claim 2, wherein the rotating pan-tilt head is fixed to an external structure, the scanner and the collimating element are fixedly disposed in the housing, and the free end of the second mechanical arm is connected to an outer wall of the non-light-emitting side of the housing, so as to drive the housing, together with the scanner and the collimating element inside it, to move as a whole.
5. The image source module of any of claims 1-4, wherein the scanner is a fiber optic scanner;
the image light source is disposed inside the housing or outside the housing.
6. A near-eye display system, comprising: a waveguide, an eye-tracking module, and an image source module according to any one of claims 1-5.
7. The near-eye display system of claim 6, wherein the waveguide comprises: a coupling-in component, an expansion component and a coupling-out component disposed on the waveguide, wherein
the coupling-in component is arranged on the surface of the waveguide; the expansion component is arranged on the exit light path of the coupling-in component and extends along the direction of that light path, and the exit light direction of the expansion component is perpendicular to its direction of extension; the light-entry side of the coupling-out component is parallel to and faces the direction of extension of the expansion component, the coupling-out component extends along the exit light direction of the expansion component, and the coupling-out component emits light towards the human-eye side.
8. The near-eye display system of claim 7, wherein the coupling-out component is a grating with tunable optical parameters;
wherein the optical parameters include at least one of reflectivity, refractive index, light transmittance, exit angle and exit position.
9. A method of controlling a near-eye display system according to any one of claims 6-8, the method comprising:
monitoring, by the eye-tracking module, the rotation angle of the human eyes and determining the focus area of the human eyes according to the rotation angle;
and generating, based on the determined focus area, an adjustment signal to be sent to the image source module, and adjusting the light beam output by the image source module so as to form an image in the focus area.
10. The method of claim 9, wherein generating, based on the determined focus area, an adjustment signal to be sent to the image source module and adjusting the light beam output by the image source module comprises:
the eye-tracking module generates an adjustment signal based on the determined focus area and sends it to the image source module; under the action of the adjustment signal, the image light source in the image source module outputs a light beam corresponding to the image within the focus area, and the rotating pan-tilt head, the first mechanical arm and the second mechanical arm in the image source module adjust the angle and/or area of the scanner's scanning output.
11. The method of claim 9, wherein generating, based on the determined focus area, an adjustment signal to be sent to the image source module and adjusting the light beam output by the image source module comprises:
the eye-tracking module generates an adjustment signal based on the determined focus area and sends it to the image source module; under the action of the adjustment signal, the image light source in the image source module outputs a light beam corresponding to the image within the focus area, and the scanner in the image source module performs a scanning translation so that the scanned output light beam is translated in two dimensions to the position corresponding to the focus area.
12. The method of claim 9, wherein generating, based on the determined focus area, an adjustment signal to be sent to the image source module and adjusting the light beam output by the image source module comprises:
the eye-tracking module generates an adjustment signal based on the determined focus area and sends it to the image source module; under the action of the adjustment signal, the image light source in the image source module outputs a light beam corresponding to the image within the focus area, the scanner in the image source module performs a scanning translation, and the rotating pan-tilt head, the first mechanical arm and the second mechanical arm in the image source module adjust the angle and/or area of the scanner's scanning output.
CN201811382075.5A 2018-11-20 2018-11-20 Image source mould group, waveguide, near-eye display system and its control method Pending CN109856796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811382075.5A CN109856796A (en) 2018-11-20 2018-11-20 Image source mould group, waveguide, near-eye display system and its control method

Publications (1)

Publication Number Publication Date
CN109856796A true CN109856796A (en) 2019-06-07

Family

ID=66890085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811382075.5A Pending CN109856796A (en) 2018-11-20 2018-11-20 Image source mould group, waveguide, near-eye display system and its control method

Country Status (1)

Country Link
CN (1) CN109856796A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006308674A (en) * 2005-04-26 2006-11-09 Mitsubishi Electric Corp Image display device
US20170099478A1 (en) * 2015-10-04 2017-04-06 Thika Holdings Llc Eye gaze responsive virtual reality headset

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112130321A (en) * 2019-06-24 2020-12-25 成都理想境界科技有限公司 Waveguide module and near-to-eye display module and equipment based on waveguide
CN110456508A (en) * 2019-07-30 2019-11-15 成都理想境界科技有限公司 A kind of near-eye display system and intelligent glasses
US11275250B2 (en) 2019-11-19 2022-03-15 Apple Inc. Optical alignment for head-mountable device
WO2022222383A1 (en) * 2021-04-22 2022-10-27 歌尔股份有限公司 Display system, display glasses, and display system control method
US12099201B2 (en) 2021-04-22 2024-09-24 Goertek Inc. Display system, display glasses and display system control method
CN113933998A (en) * 2021-10-22 2022-01-14 小派科技(上海)有限责任公司 Optical module/system, display device, head-mounted display equipment and display system
CN114019679A (en) * 2021-10-22 2022-02-08 小派科技(上海)有限责任公司 Optical module/system, display device, head-mounted display equipment and display system
WO2023066387A1 (en) * 2021-10-22 2023-04-27 小派科技(上海)有限责任公司 Optical module and optical system, display apparatus, heat-mounted display device, and display system
CN114280786A (en) * 2021-12-24 2022-04-05 深圳珑璟光电科技有限公司 Optical waveguide element, construction method thereof and near-to-eye display device
CN114647082A (en) * 2022-04-02 2022-06-21 深圳市光舟半导体技术有限公司 Pupil expanding device, binocular display method and image display method
WO2023216794A1 (en) * 2022-05-13 2023-11-16 湖北星纪魅族科技有限公司 Display device adjusting apparatus and adjusting method

Similar Documents

Publication Publication Date Title
CN109856796A (en) Image source mould group, waveguide, near-eye display system and its control method
US10419731B2 (en) Virtual image generator
TWI588535B (en) Adjustable focal plane optical system
KR102139268B1 (en) Eye projection system
US6396461B1 (en) Personal display with vision tracking
US6388641B2 (en) Scanned beam display with adjustable accommodation
KR102461253B1 (en) Projection display apparatus including eye tracker
US5903397A (en) Display with multi-surface eyepiece
JP2018533062A (en) Wide-field head-mounted display
JPH11160650A (en) Picture display device
CN209842240U (en) Near-to-eye display equipment
CN109407317A (en) Waveguide, near-eye display system and its control method
CN114815468A (en) Multi-plane projection with laser beam scanning in augmented reality displays
CN109348210A (en) Image source mould group, near-eye display system, control method and near-eye display device
US20220050286A1 (en) Beam scanner with pic input and display based thereon
US7497574B2 (en) Retinal image display device
US20220269079A1 (en) Systems, devices, and methods for inputting light from a scanning projector into a waveguide
CN114488539A (en) Scanning display module and near-to-eye display equipment
JP2011070093A (en) Head-mounted display
US10690919B1 (en) Superluminous LED array for waveguide display
JP2010085621A (en) Image display device
US20240027748A1 (en) Scanning projector performing consecutive non-linear scan with multi-ridge light sources
EP1335234A2 (en) Image projecting device
US20220413207A1 (en) Cascaded eyebox expansion in extended reality image projection devices
EP4414769A1 (en) Folded beam two-dimensional (2d) beam scanner

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190607