WO2018001323A1 - Near-eye display system, virtual reality device and augmented reality device - Google Patents

Near-eye display system, virtual reality device and augmented reality device

Info

Publication number
WO2018001323A1
Authority
WO
WIPO (PCT)
Prior art keywords
scanning fiber
array
scanning
output
eye
Prior art date
Application number
PCT/CN2017/090839
Other languages
English (en)
French (fr)
Inventor
黄琴华
Original Assignee
成都理想境界科技有限公司
Priority date
Filing date
Publication date
Application filed by 成都理想境界科技有限公司
Publication of WO2018001323A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Definitions

  • the present invention relates to the field of visual technologies, and in particular, to a near-eye display system, a virtual reality device, and an augmented reality device.
  • Augmented Reality is a technology that uses virtual objects or information to enhance the reality of real scenes.
  • Augmented reality technology is usually based on an image of the real physical environment obtained by an image acquisition device such as a camera; the computer system recognizes and analyzes this image, and displays a virtual image generated from associated virtual content such as text, images or image models.
  • in this way, the user can obtain extended information such as annotations and descriptions of real objects in the real physical environment, or experience stereoscopic, highlighted and other enhanced visual effects of those real objects.
  • Existing augmented reality devices generally converge the light of a virtual image into the user's pupil through an optical lens, which imposes strict restrictions on the position of the human eye.
  • when the user's pupil position changes, for example because the user's eyeball rotates, or when two users with different interpupillary distances use the same augmented reality device, the user needs to adjust the interpupillary distance of the augmented reality device manually, or the augmented reality device performs the interpupillary adjustment automatically.
  • at present the accuracy of both kinds of adjustment is not high, which may cause the virtual image light to miss the human eye, so that the augmented reality device cannot deliver the virtual image to the user, or the delivered virtual image is of poor quality, and the user therefore cannot obtain a good augmented reality experience.
  • in other words, the augmented reality device imposes strict restrictions on the position of the human eye and cannot give the user a good augmented reality experience.
  • the present invention provides a near-eye display system, a virtual reality device, and an augmented reality device, which can automatically open the corresponding output channels according to the user's pupil position to output light beams, and control the output beams so that they all enter the human eye, thereby imposing no limitation on the observation position of the human eye and giving the user a good augmented reality experience.
  • a first aspect of the present application provides a near-eye display system, including a laser light source, a beam splitting component, a scanning fiber array, a pupil position detector, and a controller, wherein the beam splitting component includes M*N output channels.
  • M and N are integers not less than 2;
  • the laser light source is for outputting laser light modulated according to image information;
  • the light splitting component is for dividing laser light output by the laser light source into M*N light beams;
  • the pupil position detector is configured to obtain the pupil position of the user's pupil;
  • the controller is electrically connected to the beam splitting component and the pupil position detector, respectively, and is configured to acquire the relative position of the user's pupil and the scanning fiber array according to the acquired pupil position, and to control the turn-on and turn-off of each of the output channels according to the relative position and the display field of view gray level of the image information; the scanning fibers in the scanning fiber array are coupled to the M*N output channels, and are used to transmit the output beams output from the beam splitting component and project the output beams to the human eye.
  • the laser light source comprises a three-color laser light source, a collimating mirror group, a combiner, a coupler and a coupling optical fiber, wherein the three-color laser light source outputs a three-color laser;
  • the collimating mirror group is disposed on the outgoing optical path of the three-color laser light source and is used to collimate the three-color laser;
  • the combiner is disposed on the outgoing optical path of the collimating mirror group and is used to combine the laser beams emitted from the collimating mirror group;
  • the coupler is disposed on the outgoing optical path of the combiner and is used to couple the laser beam emitted from the combiner into the coupling optical fiber;
  • the coupling optical fiber is connected to the coupler and is used to transmit the laser light coupled in through the coupler.
  • the scanning fiber array comprises a horizontal scanning fiber bundle and a vertical scanning fiber bundle, wherein the horizontal scanning fiber bundle is used for expanding a horizontal outgoing beam, and the vertical scanning fiber bundle is used for expanding a vertical outgoing beam.
  • the horizontal scanning fiber bundle is a closely arranged or spaced scanning fiber bundle;
  • the vertical scanning fiber bundle is a closely arranged or spaced scanning fiber bundle, wherein close arrangement means that the interval between each two adjacent fibers is not greater than a preset distance, and spaced arrangement means that the interval between each two adjacent fibers is greater than the preset distance.
  • each of the scanning fibers is provided with a scanner, and the scanner is disposed on the scanning fiber and is used to deflect the scanning fiber, so that the beam emitted by the scanning fiber is deflected together with it.
  • the beam splitting component comprises a 1*M type first optical splitter, M 1*N type second optical splitters and M*N channel switches; the incident end of the first optical splitter is connected to the exit end of the laser light source, and the M second optical splitters are connected to the M output ends of the first optical splitter one by one; the M*N channel switches are in one-to-one correspondence with the M*N output channels and are used for controlling the turn-on and turn-off of the M*N output channels.
  • the beam splitting component comprises an M*N type optical splitter, and the M*N type optical splitter integrates M*N channel switches, and the M*N channel switches and The M*N output channels are in one-to-one correspondence for controlling the opening and closing of the M*N output channels.
  • the controller is electrically connected to the scanning fiber array, and is configured to divide the scanning fibers in the scanning fiber array into S non-interference regions according to a preset condition, and to control the S non-interference regions at the same time to display S field-of-view beams, where S is an integer not less than 2.
  • the controller is configured to divide the scanning fiber in the scanning fiber array into the S non-interference regions according to the size of the exit pupil diameter.
  • the near-eye display system further includes a concentrating lens array group, the concentrating lens array group includes a first concentrating lens array and a second concentrating lens array, and the first concentrating lens array is disposed on the scanning fiber array Adjacent to the side of the human eye, the second converging lens array is disposed on a side of the scanning fiber array that is away from the human eye.
  • the first converging lens array and the second converging lens array are both collimating lens arrays, and the first converging lens array and the second converging lens array form a 1:1 telescope system .
  • the first converging lens array and the second converging lens array are both electrically controlled liquid microlens arrays, and the first converging lens array and the second converging lens array form a 1:1 afocal system.
  • the near-eye display system further includes a concentrating lens array disposed on a side of the scanning fiber array near the human eye.
  • the near-eye display system further includes a dimming structure disposed on a side of the scanning fiber array away from the human eye.
  • the second aspect of the present application further provides a virtual reality device, comprising two sets of the near-eye display system according to the first aspect, wherein the first near-eye display system corresponds to the person's left eye and the second near-eye display system corresponds to the person's right eye.
  • the third aspect of the embodiments of the present application further provides an augmented reality device, comprising two sets of the near-eye display system according to the first aspect, wherein the first near-eye display system corresponds to the person's left eye and the second near-eye display system corresponds to the person's right eye, and ambient light enters the person's left eye through the converging lens array group of the first near-eye display system and enters the person's right eye through the converging lens array group of the second near-eye display system.
  • the controller is electrically connected to the beam splitting component and the pupil position detector respectively, so that the controller acquires the relative position of the user's pupil and the scanning fiber array according to the pupil position acquired by the pupil position detector, and then controls the turn-on and turn-off of each of the output channels according to the relative position and the display field of view gray level of the image information. Therefore, the corresponding output channels can be opened automatically according to the relative position of the user's pupil and the scanning fiber array to output light beams, so that the selected subset of the output channels is evenly distributed around the pupil position, which ensures that all of the output beams enter the human eye and that the human eye can receive all of the output beams at any viewing position.
  • in this way, the position of the human eye is not restricted and the user can have a good augmented reality experience; moreover, the user does not need to perform interpupillary adjustment of the augmented reality device, which avoids the situation in which the user cannot obtain a good experience because of inaccurate adjustment.
  • FIG. 1 is a schematic structural view of a near-eye display system according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural view of a laser light source according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a second optical splitter according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram showing a first structure of a scanning optical fiber according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a second structure of a scanning optical fiber according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a scanning optical fiber array and a collimating lens array group according to an embodiment of the present invention;
  • FIG. 7 is a schematic diagram of the optical path of a 1:1 telescope system according to an embodiment of the present invention;
  • FIG. 8 is a view showing an arrangement of scanning fibers of M rows and N columns in an embodiment of the present invention;
  • FIG. 9 is a schematic structural diagram of the relative position of a user's pupil and a transparent substrate according to an embodiment of the present invention;
  • FIG. 10 is a schematic structural diagram of a scanning optical fiber array and an electronically controlled liquid lens array group according to an embodiment of the present invention;
  • FIG. 11 is a schematic structural view of a scanning optical fiber array, an electronically controlled liquid lens array, and a dimming structure according to an embodiment of the present invention;
  • FIG. 12 is a diagram showing a distribution of a scanning fiber array into S non-interference regions according to an embodiment of the present invention.
  • 10 - laser light source, 101 - red laser source, 102 - green laser source, 103 - blue laser source, 104 - collimating mirror group, 1041 - collimating mirror, 1042 - collimating mirror, 1043 - collimating mirror, 105 - beam combiner, 1051 - dichroic mirror, 1052 - dichroic mirror, 1053 - dichroic mirror, 106 - coupler, 107 - coupling fiber, 20 - beam splitting component, 201 - 1*M type first optical splitter, 202 - M 1*N type second optical splitters, 2021 - second optical splitter, 203 - 1*N channel switches, 204 - N coupling-out fibers, 30 - scanning fiber array, 301 - scanning fiber, 302 - PZT piezoelectric ceramic, 303 - sleeve, 304 - fixing base, 305 - transparent substrate, 40 - controller, 50 - first collimating lens array, 51 - second collimating lens array
  • the invention provides a near-eye display system, a virtual reality device and an augmented reality device, which can automatically open the corresponding output channels according to the user's pupil position to output light beams and control the output beams so that they all enter the human eye, thereby imposing no limitation on the observation position of the human eye and giving the user a good augmented reality experience.
  • Embodiment 1:
  • a first aspect of an embodiment of the present invention provides a near-eye display system including a laser light source 10, a beam splitting assembly 20, a scanning fiber array 30, a pupil position detector, and a controller 40.
  • the beam splitting assembly 20 includes M*N output channels, and M and N are integers not less than 2.
  • the laser light source 10 is for outputting laser light modulated according to image information.
  • the laser light output from the laser light source 10 passes through the beam splitting assembly 20 and is divided into M*N light beams.
  • the pupil position detector is used to acquire the pupil position of the user's pupil.
  • the controller 40 is electrically connected to the beam splitting assembly 20 and the pupil position detector, respectively, for acquiring the relative position of the user pupil and the scanning fiber array 30 according to the obtained pupil position, and according to the relative position and the The display field of view gray scale controls the turn-on and turn-off of each output channel in the beam splitting assembly 20.
  • a scanning fiber in the scanning fiber array 30 is coupled to the M*N output channels for transmitting an output beam output from the beam splitting assembly 20 and projecting the output beam to the human eye.
  • the laser light source 10 can be a monochromatic laser source or a multi-color laser source.
  • when the laser light source 10 is a monochromatic laser light source, it is used to display a monochrome image.
  • when the laser light source 10 is a multi-color laser light source, it can be used to display both monochrome images and multi-color images.
  • the laser light source 10 may specifically be a three-color laser light source, such as an RGB laser light source or the like. The following is a specific example of a three-color laser source.
  • the display field of view gray level of the image information is the gray level of the pixel corresponding to the current display field of view.
  • the gray level of each pixel in the image corresponding to the image information may be acquired according to the image information, so that the display field of view gray level may be acquired. For example, if the current display field of view is 0° field of view, the pixel point gray level corresponding to the 0° field of view is obtained, for example, one of 0 to 255.
  • the laser light source 10 includes a red laser light source 101, a green laser light source 102, and a blue laser light source 103.
  • the red laser source 101 is used to emit a red laser
  • the green laser source 102 is used to emit a green laser
  • the blue laser source 103 is used to emit a blue laser.
  • the laser source 10 further includes a collimating mirror assembly 104, a combiner 105, a coupler 106, and a coupling fiber 107.
  • the collimating lens group 104 is disposed on the outgoing light path of the laser light source 10 for collimating the laser light emitted from the laser light source 10.
  • the collimating mirror group 104 includes a collimating mirror 1041, a collimating mirror 1042 and a collimating mirror 1043.
  • the collimating mirror 1041 is disposed on the outgoing optical path of the red laser source 101 for collimating the red laser, and collimating The mirror 1042 is disposed on the outgoing light path of the green laser light source 102 for collimating the green laser light, and the collimating mirror 1043 is disposed on the outgoing light path of the blue laser light source 103 for collimating the blue laser light.
  • the laser source 10 may also be composed of a red laser source 101, a green laser source 102, a blue laser source 103, a combiner 105, a coupler 106, and a coupling fiber 107, but does not include the collimator group 104.
  • the coupling fiber 107 can be an optical fiber such as a silica optical fiber.
  • the combiner 105 is disposed on the exiting optical path of the collimating lens group 104 for combining the laser light emitted by the collimating lens group 104.
  • the combiner 105 includes a dichroic mirror 1051, a dichroic mirror 1052, and a dichroic mirror 1053, wherein the dichroic mirror 1051 reflects red light and transmits green light, the dichroic mirror 1052 transmits green light, and the dichroic mirror 1053 transmits red light and green light and reflects blue light, thereby combining the laser beams emitted from the collimating mirror group 104 into a single beam; the details are not repeated here.
  • the coupler 106 is disposed on the exiting optical path of the combiner 105 for coupling the laser light exiting the combiner 105 into the coupling fiber 107.
  • the coupling fiber 107 is coupled to a coupler 106 for transmitting laser light coupled via the coupler 106.
  • the beam splitting assembly 20 can be an M*N type optical splitter that includes M*N output channels.
  • the M*N type optical splitter integrates M*N channel switches, and the M*N channel switches are in one-to-one correspondence with the M*N output channels and are used for controlling the turn-on and turn-off of the M*N output channels.
  • the beam splitting assembly 20 can also consist of an M*N type optical splitter and M*N separate channel switches.
  • the channel switch may be an optical switch, an optical attenuator, or the like. When the channel switch is an optical switch, it can control the turn-on and turn-off of the output channel. When the channel switch is an optical attenuator, it can not only control the turn-on and turn-off of the output channel, but also control the energy of the beam output by the output channel.
  • when the output channel is turned on, the beam is transmitted through the output channel to the scanning fiber array 30; when the output channel is turned off, the beam cannot be transmitted through the output channel to the scanning fiber array 30.
  • the following is an example of an optical switch.
  • when the channel switch is an optical attenuator, if the output energy of the corresponding output channel is controlled by the optical attenuator to be 0, the output channel can be regarded as turned off; if the output energy of the corresponding output channel is controlled to be greater than 0, the output channel is regarded as turned on.
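  • as an aside, the difference between the two kinds of channel switch can be sketched in code; the class names and interface below are illustrative only and are not part of the patent.

```python
# Illustrative sketch (not from the patent text) of the two channel-switch
# behaviours described above: an optical switch only opens or closes an
# output channel, while an optical attenuator also scales the output energy,
# with an attenuated energy of 0 treated as "channel off".

class OpticalSwitch:
    def __init__(self):
        self.on = False

    def set_state(self, on: bool):
        self.on = on

    def output_energy(self, input_energy: float) -> float:
        # Passes the full beam energy when on, nothing when off.
        return input_energy if self.on else 0.0


class OpticalAttenuator:
    def __init__(self):
        self.transmission = 0.0  # 0.0 .. 1.0

    def set_transmission(self, t: float):
        self.transmission = max(0.0, min(1.0, t))

    def output_energy(self, input_energy: float) -> float:
        # Scales the beam energy; a transmission of 0 is equivalent to "off".
        return input_energy * self.transmission

    @property
    def on(self) -> bool:
        return self.transmission > 0.0
```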
  • the beam splitting component 20 can also be a 1*M type first optical splitter 201, M 1*N type second optical splitter 202 and M*N channel switches.
  • the incident end of the first optical splitter 201 is connected to the exit end of the laser light source 10, that is, the incident end of the first optical splitter 201 is connected to the coupling optical fiber 107.
  • the M second optical splitters 202 are connected to the M output ends of the first optical splitter 201 one by one to form M*N output channels.
  • the M*N channel switches are in one-to-one correspondence with the M*N output channels and are used for controlling the turn-on and turn-off of the M*N output channels, so that the turn-on and turn-off of each output channel can be controlled independently through its corresponding channel switch.
  • Each output channel is a fiber, so that the M*N output channels are M*N fibers.
  • when dividing the laser light output from the laser light source 10 into M*N light beams, the beam splitting assembly 20 splits it into M*N beams of equal energy.
  • assuming that the maximum output energy of the red laser source 101 is ER, the maximum energy emitted from the exit end of each output channel after splitting by the beam splitting assembly 20 is ER/(M*N).
  • the gray level of the image is constrained by the scanning fibers, so the M*N scanning fibers can achieve M*N gray levels; if the gray level of the image corresponding to the image information is 8 bits, that is, there are 256 gray levels, the red-laser energy required per unit gray level is ER/256.
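  • to make the energy relationships above concrete, the short sketch below works through them numerically; the 30*30 channel count is taken from the later example and the absolute value of ER is an arbitrary placeholder.

```python
# Worked numbers for the energy split described above (illustrative values:
# a 30*30 output-channel array and an arbitrary unit for the maximum
# red-laser energy ER).
M, N = 30, 30          # output channels (rows * columns)
ER = 1.0               # maximum output energy of the red laser source (arbitrary unit)
GRAY_LEVELS = 256      # 8-bit image

# Maximum energy at the exit end of each output channel after the 1 -> M*N split.
per_channel_max = ER / (M * N)

# Red-laser energy corresponding to one unit of gray level.
per_gray_level = ER / GRAY_LEVELS

print(per_channel_max, per_gray_level)   # 0.001111..., 0.00390625
```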
  • a second optical splitter 2021 of the M second optical splitters 202 will be described as an example.
  • the incident end of the second optical splitter 2021 is coupled to an exit end of the first optical splitter 201.
  • the N exit ends of the second optical splitter 2021 are connected to the 1*N channel switches 203 one by one.
  • the output of the 1*N channel switch 203 can also be connected to the N coupled fiber 204 for connection to the scanning fiber in the scanning fiber array 30.
  • the 1*N channel switches 203 are used to control the turn-on and turn-off of the N output terminals of the second optical splitter 2021, thereby controlling the turn-on and turn-off of the N output channels of the second optical splitter 2021.
  • the scanning fiber array 30 includes M*N scanning fibers.
  • the light beams output by the M*N output channels are coupled into the M*N scanning fibers, deflected by the scanning fibers, and the deflected beams are projected onto the human eye.
  • the scanning fiber array 30 can constitute a scanning fiber optic panel. Further, the scanning fiber array 30 may include a horizontal scanning fiber bundle and a vertical scanning fiber bundle.
  • the horizontal scanning fiber bundle is used to expand a horizontal exit beam; the vertical scanning fiber bundle is used to expand a vertical exit beam.
  • in this way, large display fields of view such as 120°, 130° and 140° can be achieved, so that the display field of view matches the field of view of the human eye.
  • the horizontal scanning fiber bundles are closely arranged or spacedly arranged scanning fiber bundles.
  • the vertical scanning fiber bundles are closely arranged or spacedly arranged scanning fiber bundles.
  • the meaning of the tight arrangement is that the interval between each adjacent two bundles of fibers is not greater than a preset distance, and the interval arrangement means that the interval between each adjacent two bundles of fibers is greater than a preset distance.
  • the preset distance is set according to actual conditions.
  • the preset distance may be a value of not less than 25 micrometers (um), for example, 25 um, 30 um, and 35 um, etc., which is not specifically limited herein.
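  • a trivial helper expressing this definition (the 25 um default below is just one of the example values given above):

```python
# Classify a fibre-bundle arrangement by the spacing between adjacent fibres:
# "closely arranged" if the pitch does not exceed the preset distance,
# otherwise "spaced" (definition from the text; 25 um is an example value).
def arrangement(pitch_um: float, preset_um: float = 25.0) -> str:
    return "closely arranged" if pitch_um <= preset_um else "spaced"

print(arrangement(20.0))   # closely arranged
print(arrangement(35.0))   # spaced
```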
  • each of the scanning fibers includes a scanner, and the scanner is disposed on the scanning fiber for deflecting the scanning fiber, so that the beam emitted by the scanning fiber is also deflected, thereby realizing expansion in the horizontal and vertical directions.
  • the scanner may specifically be a two-dimensional scanner such as a PZT piezoelectric ceramic or the like.
  • the scanning fiber is deflected in the horizontal direction and the vertical direction (two-dimensional scanning) under the driving of the PZT piezoelectric ceramic, and the laser light output from the laser light source 10 is processed into an image beam, thereby realizing the purpose of transmitting the virtual image to the eyes of the user.
  • a scanning fiber 301 in the scanning fiber array 30 includes a PZT piezoelectric ceramic 302.
  • the scanning fiber 301 is disposed in the sleeve 303.
  • the PZT piezoelectric ceramic 302 is fixed in the sleeve 303 through a fixing base 304, and the PZT piezoelectric ceramic 302 is disposed on the scanning optical fiber 301.
  • Both ends of the holder 304 are connected to the inner wall of the sleeve 303 such that the holder 304 is fixed in the sleeve 303.
  • the fixing seat 304 can also be connected to the inner wall of the sleeve 303 only at one end, and the fixing seat 304 is also fixed in the sleeve 303, as shown in FIG.
  • the exit end face of the scanning fiber in the scanning fiber array 30 may be a plane or a curved surface.
  • the scanning fiber can be an optical fiber such as a silica optical fiber, with which a beam with a small beam waist and a large numerical aperture can be obtained.
  • when the exit end face of the scanning fiber is a concave curved surface with a certain curvature, the concave surface converges the light beam, so that the maximum scanning angle of each scanning fiber is reduced, which allows the scanning frequency of the fibers in the scanning fiber array 30 to be increased.
  • when the exit end face of the scanning fiber is a convex curved surface with a certain curvature, the convex surface diverges the light beam, so that the maximum scanning angle of each scanning fiber is increased, which reduces the achievable scanning frequency of the fibers in the scanning fiber array 30.
  • the scanning fiber array 30 may be packaged in the transparent substrate 305.
  • the outer layer of the bare scanning fiber 301 is coated with a very thin transparent protective glue, that is, a coating layer.
  • the gap between each adjacent two scanning fibers is filled with a material having the same or similar refractive index as the coating layer.
  • the transparent substrate 305 is a substrate having a transparency greater than a preset transparency.
  • the preset transparency has a value ranging from 75% to 100%, that is, any value between 75% and 100%, for example, 75%, 85%, 100%, and the like.
  • the near-eye display system further includes a condenser lens array group.
  • the concentrating lens array group includes a first concentrating lens array and a second concentrating lens array.
  • the first converging lens array is disposed on a side of the scanning fiber array 30 near the human eye
  • the second converging lens array is disposed on a side of the scanning fiber array 30 away from the human eye.
  • the first concentrating lens array and the second concentrating lens array may both be collimating lens arrays.
  • a first collimating lens array 50 is disposed on the side of the scanning fiber array 30 close to the human eye, and a second collimating lens array 51 is disposed on the side of the scanning fiber array 30 away from the human eye; the first collimating lens array 50 and the second collimating lens array 51 constitute a 1:1 telescope system. Since the scanning fiber array 30 is encapsulated in the transparent substrate 305, the ambient light enters the human eye through the 1:1 telescope system, and because it passes through a 1:1 telescope system it is neither magnified nor reduced, so the user can perceive the external environment realistically.
  • the optical path principle of the 1:1 telescope system is shown in FIG. 7.
  • the first collimating lens array 50 is disposed on the outgoing optical path of the scanning fiber array 30 for collimating the light beam emitted from the scanning optical fiber array 30.
  • the following is specifically described by taking one of the first collimating lens arrays 50 as an example.
  • a collimating lens 501 is further disposed on the outgoing optical path of the scanning optical fiber 301.
  • the collimating lens 501 is used to collimate the cone beam that the PZT piezoelectric ceramic 302 scans out so that it can be projected into the human eye in an approximately parallel manner.
  • the pupil position detector may be disposed on a side surface of the transparent substrate 305, or may be disposed under the transparent substrate 305.
  • the pupil position detector may specifically be a position detecting sensor for acquiring the pupil position of the user's pupil in real time and transmitting the pupil position to the controller 40.
  • the controller 40 acquires the pupil position acquired by the pupil position detector in real time.
  • the relative position of the user pupil and the scanning fiber array 30 is obtained according to the pupil position and the array position of the scanning fiber array 30 stored in advance.
  • the relative position of the user pupil to the center of the scanning fiber array 30 is obtained.
  • the relative position may also be a position of the user's pupil relative to the center of the transparent substrate 305.
  • the array locations can be stored in external storage hardware, at which point controller 40 reads the array locations from the external storage hardware.
  • the external storage hardware may be, for example, a storage device such as a memory card, a hard disk, or a USB device.
  • the array location may also be stored in a storage space in the controller 40, which is not specifically limited herein.
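  • a minimal sketch of this step, assuming the pupil position and the stored array centre are expressed in the same millimetre coordinate system (names are illustrative, not from the patent):

```python
# Relative position used for channel selection: the pupil position reported
# by the pupil position detector minus the pre-stored centre of the scanning
# fibre array (or of the transparent substrate). Coordinates in mm.
def relative_pupil_position(pupil_xy, array_center_xy):
    return (pupil_xy[0] - array_center_xy[0],
            pupil_xy[1] - array_center_xy[1])

# Example: pupil 3 mm to the left of the array centre.
print(relative_pupil_position((-3.0, 0.0), (0.0, 0.0)))   # (-3.0, 0.0)
```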
  • the controller 40 may be a single chip microcomputer, a processing chip, a control circuit, or the like. Further, the near-eye display system is applied to a single eye, and when applied to both eyes, two sets of the near-eye display systems are required.
  • the controller 40 controls the turn-on and turn-off of each output channel in the beam splitting assembly 20 according to the relative position and the display field of view gray level of the image information. Specifically, the controller 40 selects K*F output channels from the M*N output channels according to the relative position and the display field of view gray level of the image information and turns them on, while the unselected output channels are turned off. Since the M*N scanning fibers in the scanning fiber array 30 are coupled to the M*N output channels, when the K*F output channels are turned on, the output beams they output are transmitted to the corresponding K*F scanning fibers, and the K*F scanning fibers then scan the output beams and project them to the human eye, where K and F are both positive integers.
  • the K*F output channels are selected according to the relative position and the display field of view gray level of the image information, and the relative position is the position of the user's pupil relative to the center of the scanning fiber array 30, so the K*F output channels are selected according to the pupil position in such a way that they are evenly distributed around the pupil position. In this way, it can be ensured that the output beams of the selected K*F output channels all enter the human eye through the scanning fibers, so that the human eye can receive all the output beams at any viewing position; the position observed by the human eye is therefore not restricted, giving the user a good augmented reality experience.
  • specifically, the controller 40 may use an interpolation method to select the K*F output channels from the M*N output channels according to the relative position and the display field of view gray level of the image information, so that the selected K*F output channels are evenly distributed around the pupil position; this ensures that the output beams of the selected K*F output channels all enter the human eye through the scanning fibers, and the human eye can receive all the output beams at any viewing position.
  • the interpolation method may be, for example, a nearest neighbor interpolation method or the like.
  • the controller 40 may first determine, according to the display field of view gray level of the image information, the number of output channels that need to be opened among the M*N output channels, and then select that number of output channels, i.e. the K*F output channels, by the nearest neighbor interpolation method according to the relative position.
  • each display field of view corresponds to all of the M*N output channels.
  • by default, the near-eye display system assumes that the user's pupil 60 is centered on the central axis 306 of the transparent substrate 305. Suppose the image corresponding to the image information is 8-bit, that is, there are 256 gray levels, the output channels form a 30*30 array, and the currently displayed zero field of view has a gray value of 160; the corresponding number of output channels to be turned on is then 30*30*160/256 = 562.5 ≈ 563.
  • in practice, the center of the user's pupil 60 may not be located on the central axis 306 of the transparent substrate 305; for example, as shown in FIG. 9, the user's pupil 60 is offset to the left by 3 millimeters (mm).
  • 563 channels are then selected uniformly from the 30*30 channels by using the nearest neighbor interpolation method.
  • the rows and columns of the output channels obtained by the nearest neighbor interpolation method are as shown in Table 2 below:
  • since the M*N output channels correspond to the M rows and N columns of scanning fibers in the scanning fiber array 30, the first row in Tables 1 and 2 also represents the row numbers of the scanning fibers in the fiber scanning array 30, and the second row in Tables 1 and 2 represents the column numbers of the scanning fibers in the fiber scanning array 30. Therefore, the row and column numbers of the corresponding scanning fibers can be determined from the row and column numbers of the opened output channels. As shown in Tables 1 and 2, when the user's pupil 60 is shifted to the left by 3 mm, the scanning fibers corresponding to the opened output channels are also shifted to the left, which ensures that the output beams of the selected K*F output channels can all enter the human eye through the scanning fibers, as illustrated in the sketch below.
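  • the selection step can be sketched as follows. The patent selects output-channel rows and columns by nearest neighbor interpolation (Tables 1 and 2); the sketch below uses a simpler stand-in that picks the required number of channels closest to the pupil centre, which likewise shifts the opened block together with the pupil. The channel pitch and the rounding rule are illustrative assumptions.

```python
import math

M, N = 30, 30              # channel grid (rows, columns), as in the example
PITCH_MM = 0.5             # assumed centre-to-centre channel pitch (illustrative)

def channels_to_open(gray: int, levels: int = 256) -> int:
    """Number of output channels to turn on for the current field-of-view
    gray level, e.g. gray 160 -> ceil(30*30 * 160/256) = 563."""
    return math.ceil(M * N * gray / levels)

def select_channels(gray: int, pupil_offset_mm=(0.0, 0.0)):
    """Return a set of (row, col) channel indices centred on the pupil.

    Simplified stand-in for the nearest-neighbour interpolation described in
    the text: take the required number of channels closest to the pupil
    position, so the opened block moves with the pupil."""
    count = min(channels_to_open(gray), M * N)
    centre_row = (M - 1) / 2 + pupil_offset_mm[1] / PITCH_MM
    centre_col = (N - 1) / 2 + pupil_offset_mm[0] / PITCH_MM
    all_channels = [(r, c) for r in range(M) for c in range(N)]
    all_channels.sort(key=lambda rc: (rc[0] - centre_row) ** 2 +
                                     (rc[1] - centre_col) ** 2)
    return set(all_channels[:count])

# Pupil shifted 3 mm to the left, gray value 160: 563 channels are opened,
# and the opened block shifts to the left together with the pupil.
opened = select_channels(160, pupil_offset_mm=(-3.0, 0.0))
print(len(opened))   # 563
```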
  • when the channel switch is an optical switch, the optical switches corresponding to the K*F output channels are turned on and the optical switches corresponding to the unselected output channels are turned off, so that the K*F output channels output the output beams.
  • the unselected output channels are off and no beams are transmitted to the corresponding scanning fiber for output.
  • when the channel switch is an optical attenuator, the output energy of each of the K*F output channels is adjusted by its optical attenuator, so that the difference between the total output energy of the K*F output channels and the energy required to display the field-of-view gray level is not greater than a preset threshold.
  • the preset threshold is set according to actual conditions; for example, a value of not more than 20*ER/(M*N) may be taken, so that the total output energy of the K*F output channels is the same as, or only slightly different from, the energy required to display the field-of-view gray level.
  • at the same time, the output energy of the unselected output channels is controlled to be 0 by the optical attenuators. In this way, the image display effect can be effectively improved.
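  • a hedged sketch of this attenuator-based energy control: the energy required for the current gray level is spread over the opened channels, capped at the per-channel maximum, and the residual mismatch is checked against the example threshold 20*ER/(M*N); the distribution strategy itself is an assumption, not the patent's prescribed method.

```python
# Attenuator-based energy control (illustrative): distribute the required
# display energy over the K*F opened channels and check the mismatch
# against the preset threshold; unselected channels would be driven to 0.
M, N = 30, 30
ER = 1.0                        # maximum red-laser energy (arbitrary unit)
PER_CHANNEL_MAX = ER / (M * N)  # maximum energy per output channel
THRESHOLD = 20 * ER / (M * N)   # example preset threshold from the text

def set_channel_energies(required_energy, opened_channels):
    per_channel = min(required_energy / len(opened_channels), PER_CHANNEL_MAX)
    energies = {ch: per_channel for ch in opened_channels}
    mismatch = abs(per_channel * len(opened_channels) - required_energy)
    return energies, mismatch <= THRESHOLD

# Example: gray value 160 -> required energy 160/256 * ER, spread over 563 channels.
energies, within_threshold = set_channel_energies(160 / 256 * ER, list(range(563)))
print(sum(energies.values()), within_threshold)   # ~0.625 True
```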
  • the first converging lens array and the second converging lens array may also be electrically controlled liquid microlens arrays.
  • the electrically controlled liquid microlens array can be, for example, an electrically controlled liquid crystal microlens array.
  • a first electronically controlled liquid crystal microlens array 52 is disposed on a side of the scanning fiber array 30 near the human eye
  • a second electronically controlled liquid crystal microlens array 53 is disposed on a side of the scanning fiber array 30 away from the human eye.
  • the first electrically controlled liquid crystal microlens array 52 and the second electrically controlled liquid crystal microlens array 53 constitute a 1:1 afocal system.
  • since the scanning fiber array 30 is encapsulated in the transparent substrate 305, the ambient light enters the human eye through the 1:1 afocal system; because it passes through a 1:1 afocal system, the outside world is neither magnified nor reduced, so the user can perceive the external environment realistically.
  • the first electrically controlled liquid crystal microlens array 52 is disposed on the outgoing optical path of the scanning fiber array 30 for collimating the light beam emitted from the scanning optical fiber array 30.
  • when no voltage is applied, the first electrically controlled liquid crystal microlens array 52 and the second electrically controlled liquid crystal microlens array 53 do not operate and have no light-converging or light-diverging function, that is, they do not deflect or otherwise modulate the ambient light. In this way, the ambient light can pass through the second electrically controlled liquid crystal microlens array 53 and the transparent substrate 305 and then enter the human eye through the first electrically controlled liquid crystal microlens array 52, so that the actual external environment can be observed.
  • the near-eye display system may further include a single converging lens array disposed on the side of the scanning fiber array 30 close to the human eye; the converging lens array is disposed on the outgoing light path of the scanning fiber array 30 and is used to collimate the light beam exiting the scanning fiber array 30.
  • the concentrating lens array may be a collimating lens array or an electrically controlled liquid microlens array. The following is specifically described by taking the converging lens array as an electrically controlled liquid crystal microlens array as an example.
  • a first electrically controlled liquid crystal microlens array 52 is disposed on the side of the scanning fiber array 30 near the human eye for collimating the light beam emitted from the scanning fiber array 30 to display a virtual image.
  • when the near-eye display system including the first electronically controlled liquid crystal microlens array 52 is used for augmented reality display, a dimming structure 54 needs to be disposed on the side of the scanning fiber array 30 away from the human eye.
  • the dimming structure 54 may specifically be a polymer dispersed liquid crystal (PDLC) film layer with an optical switch.
  • the virtual image and the real environment are displayed in a time-division manner. Assume that the refresh rate required by the human eye is 30 Hz; the time period corresponding to the refresh rate is divided into two segments. One segment is used to display the virtual image, and during this segment the optical switch of the PDLC film layer is turned off, making the PDLC film layer opaque. The other segment is used to observe the actual external environment.
  • during this second segment, the optical switch of the PDLC film layer is turned on, applying a voltage to the PDLC film layer to make it transparent, so that the ambient light can pass through the PDLC film layer and the transparent substrate 305.
  • at the same time, no voltage is applied to the first electrically controlled liquid crystal microlens array 52. Since the array does not operate without an applied voltage, it has no light-converging or light-diverging function, that is, it neither deflects nor otherwise modulates the external ambient light. In this way, external ambient light can pass through the PDLC film layer and the transparent substrate 305 and then enter the human eye through the first electronically controlled liquid crystal microlens array 52, so that the actual external environment can be observed.
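  • this time-division scheme can be summarised in sketch form; the hardware-control functions below are placeholders (not APIs from the patent), and the 50/50 split of the frame is only one possible choice, as the later discussion of three or more segments shows.

```python
import time

# Minimal sketch of the time-division display: within each 1/30 s frame, one
# sub-period displays the virtual image (PDLC switched off -> opaque, LC
# microlens array driven so it collimates the scanned beams) and the other
# sub-period shows the real environment (PDLC switched on -> transparent,
# LC microlens array left undriven so it has no optical effect).

FRAME_PERIOD_S = 1.0 / 30          # refresh period required by the human eye
VIRTUAL_FRACTION = 0.5             # fraction of the frame used for the virtual image

def set_pdlc(transparent: bool): ...             # placeholder hardware call
def drive_lc_microlens(active: bool): ...        # placeholder hardware call
def scan_virtual_image(duration_s: float): ...   # placeholder fibre-scanning call

def run_one_frame():
    # Sub-period 1: virtual image. PDLC opaque, microlens array collimating.
    set_pdlc(transparent=False)
    drive_lc_microlens(active=True)
    scan_virtual_image(FRAME_PERIOD_S * VIRTUAL_FRACTION)

    # Sub-period 2: see-through. PDLC transparent, microlens array inactive,
    # so ambient light passes through to the eye unmodified.
    set_pdlc(transparent=True)
    drive_lc_microlens(active=False)
    time.sleep(FRAME_PERIOD_S * (1 - VIRTUAL_FRACTION))
```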
  • in this embodiment, the controller is electrically connected to the beam splitting component and the pupil position detector respectively, so that the controller obtains the relative position of the user's pupil and the scanning fiber array according to the pupil position acquired by the pupil position detector, and then controls the turn-on and turn-off of each of the output channels according to the relative position and the display field of view gray level of the image information. Therefore, the corresponding output channels can be opened automatically according to the relative position of the user's pupil and the scanning fiber array to output light beams, so that the selected subset of output channels is evenly distributed around the pupil position, ensuring that all of the output beams enter the human eye and that the human eye can receive all of the output beams at any viewing position.
  • in this way, the position of the human eye is not restricted and the user can have a good augmented reality experience; moreover, the user does not need to perform interpupillary adjustment of the augmented reality device, which avoids the situation in which the user cannot obtain a good experience because of inaccurate adjustment.
  • Embodiment 2:
  • a first aspect of an embodiment of the present invention further provides a near-eye display system including a laser light source 10, a beam splitting assembly 20, a scanning fiber array 30, a pupil position detector, and a controller 40.
  • the beam splitting assembly 20 includes M*N output channels, and M and N are integers not less than 2.
  • the laser light source 10 is for outputting laser light modulated according to image information.
  • the laser light output from the laser light source 10 passes through the beam splitting assembly 20 and is divided into M*N light beams.
  • the pupil position detector is used to acquire the pupil position of the user's pupil.
  • the controller 40 is electrically connected to the beam splitting component 20 and the pupil position detector, respectively, and is configured to acquire the relative position of the user's pupil and the scanning fiber array 30 according to the obtained pupil position, and then to control the turn-on and turn-off of each of the output channels in the beam splitting assembly 20 according to the relative position and the display field of view gray level of the image information.
  • a scanning fiber in the scanning fiber array 30 is coupled to the M*N output channels for transmitting an output beam output from the beam splitting assembly 20 and projecting the output beam to the human eye.
  • the corresponding relationship between the scanning fiber in the scanning fiber array 30 and the S non-interference regions may be pre-stored in the controller 40.
  • the correspondence relationship includes area field information corresponding to each non-interference area, wherein one non-interference area corresponds to one area field of view, and S is an integer not less than 2.
  • the corresponding relationship is pre-stored in the controller 40, and when the scanning fibers in the scanning fiber array 30 output the output beams, the controller 40 controls, according to the correspondence relationship, the scanning fibers in the scanning fiber array 30 to emit the output beams so as to form S field-of-view beams and project them to the human eye.
  • the near-eye display system can thus display S field-of-view beams, i.e., S pixels, at each moment, whereas in the prior art only one pixel can be displayed at a time. In this way, the switching frequency of the channel switch can be effectively reduced, and since the number of switching operations per unit time decreases, the energy utilization rate also increases.
  • according to the corresponding relationship, the controller 40 may control the K*F scanning fibers corresponding to the K*F output channels to emit the output beams so as to form S field-of-view beams, and project the S field-of-view beams to the human eye.
  • the controller 40 may divide the scanning fibers in the scanning fiber array 30 into S non-interference regions according to a preset condition, thereby obtaining the corresponding relationship.
  • the corresponding relationship is stored in the storage space of the controller 40 or stored in external storage hardware.
  • the controller 40 needs to read the correspondence from the external storage hardware.
  • the external storage hardware may be, for example, a storage device such as a memory card, a hard disk, or a USB device.
  • the non-interference region indicates that the region does not overlap with any other region.
  • the scanning fiber array 30 can control the S non-interference regions at the same time to display S field-of-view beams, that is, S field-of-view beams can be displayed at each moment, each of which corresponds to one pixel.
  • the preset condition may be a preset division manner; for example, the scanning fibers may be divided into S = H*J non-interference regions according to the number of scanning fibers in the scanning fiber array 30, where H and J are integers not less than 2 and may be the same or different.
  • the preset division manner may also be based on the display field of view of the near-eye display system: the larger the field of view, the larger the value of S; the smaller the field of view, the smaller the value of S. This is not specifically limited in this application.
  • the preset division manner may also be to directly set the value of S, and then divide the scanning fiber in the scanning fiber array 30 into S non-interference regions.
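  • one simple division of this kind is sketched below, assuming rectangular, non-overlapping blocks of fibres; the 30*30 array divided into 3*3 regions of 100 fibres each matches the example discussed later.

```python
# Illustrative division of the M*N scanning fibres into S = H*J
# non-interference regions (rectangular, non-overlapping blocks).
def divide_into_regions(M: int, N: int, H: int, J: int):
    """Return a dict region_index -> list of (row, col) fibre indices,
    where the M*N array is cut into H*J non-overlapping blocks."""
    regions = {}
    rows_per = M // H
    cols_per = N // J
    for h in range(H):
        for j in range(J):
            idx = h * J + j
            regions[idx] = [(r, c)
                            for r in range(h * rows_per, (h + 1) * rows_per if h < H - 1 else M)
                            for c in range(j * cols_per, (j + 1) * cols_per if j < J - 1 else N)]
    return regions

# Example matching the text: a 30*30 fibre array divided into 3*3 = 9
# non-interference regions of 10*10 = 100 fibres each.
regions = divide_into_regions(30, 30, 3, 3)
print(len(regions), len(regions[0]))   # 9 100
```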
  • the controller 40 may further divide the scanning fiber in the scanning fiber array 30 into the S non-interference regions according to the size of the exit pupil diameter and the display field of view of the near-eye display system.
  • specifically, let B1 denote the exit pupil diameter in the horizontal direction, B2 the exit pupil diameter in the vertical direction, C1 the display field of view of the near-eye display system in the horizontal direction, C2 the display field of view of the near-eye display system in the vertical direction, and L the distance from the human eye to the scanning fiber array 30; based on these quantities, the M*N scanning fibers (i.e., the M*N output channels) in the scanning fiber array 30 can be divided into S non-interference regions.
  • for example, the horizontal direction of the scanning fiber array 30 is divided into three non-overlapping regions and the vertical direction is also divided into three non-overlapping regions, thereby obtaining nine non-interference regions.
  • the nine non-interference regions are denoted A1, A2, A3, A4, A5, A6, A7, A8 and A9, and they do not overlap one another.
  • A1 displays a field of view of -20° to -7° in the horizontal direction and 7° to 20° in the vertical direction, so the area field of view displayed by A1 is {(-20°~-7°), (7°~20°)}.
  • similarly, the area field of view displayed by A2 is {(-7°~7°), (7°~20°)}; by A3 is {(7°~20°), (7°~20°)}; by A4 is {(-20°~-7°), (-7°~7°)}; by A5 is {(-7°~7°), (-7°~7°)}; by A6 is {(7°~20°), (-7°~7°)}; by A7 is {(-20°~-7°), (-20°~-7°)}; by A8 is {(-7°~7°), (-20°~-7°)}; and by A9 is {(7°~20°), (-20°~-7°)}.
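  • the bookkeeping for this 3*3 example can be written as a simple lookup table mapping each non-interference region to its area field of view (in this sketch, boundary angles are assigned to the first matching region):

```python
# Region -> area field of view (degrees) for the 3*3 example above:
# each entry is (horizontal range, vertical range).
REGION_FOV = {
    "A1": ((-20, -7), (7, 20)),
    "A2": ((-7, 7),   (7, 20)),
    "A3": ((7, 20),   (7, 20)),
    "A4": ((-20, -7), (-7, 7)),
    "A5": ((-7, 7),   (-7, 7)),
    "A6": ((7, 20),   (-7, 7)),
    "A7": ((-20, -7), (-20, -7)),
    "A8": ((-7, 7),   (-20, -7)),
    "A9": ((7, 20),   (-20, -7)),
}

def region_for_angle(h_deg: float, v_deg: float) -> str:
    """Return the non-interference region whose area field of view contains
    the given horizontal/vertical field angles."""
    for name, ((h0, h1), (v0, v1)) in REGION_FOV.items():
        if h0 <= h_deg <= h1 and v0 <= v_deg <= v1:
            return name
    raise ValueError("field angle outside the display field of view")

print(region_for_angle(0, 0))    # A5 (the central region)
```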
  • in this way, the near-eye display system can display nine field-of-view beams at each moment, that is, nine pixels.
  • take the display of an 800*600 image as an example, where the refresh rate required by the human eye is at least 30 Hz; if an RGB color image is to be displayed, the color image also has to be time-sequenced, which further raises the required switching frequency of the channel switches and the minimum scanning frequency of the scanning fibers in the prior art, in which only one pixel is displayed at a time.
  • by controlling the S non-interference regions to display S field-of-view beams simultaneously, the above embodiment of the present application can solve the technical problem of the high switching frequency of the channel switches in the prior art and effectively reduce the switching frequency of the channel switches; and since the number of switching operations per unit time is reduced, the energy utilization rate also increases.
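  • as a rough, illustrative calculation only (assuming one channel-switch event per displayed pixel and a factor of 3 for time-sequential RGB, neither of which is stated numerically in the text), the reduction can be estimated as follows:

```python
# Rough estimate of the required channel-switch frequency for an 800*600
# RGB image at 30 Hz, with and without S non-interference regions driven in
# parallel. Both assumptions above are illustrative, not from the patent.
WIDTH, HEIGHT = 800, 600
REFRESH_HZ = 30
RGB_FACTOR = 3          # time-sequential colour

def switch_frequency_hz(regions_in_parallel: int) -> float:
    pixels_per_second = WIDTH * HEIGHT * REFRESH_HZ * RGB_FACTOR
    return pixels_per_second / regions_in_parallel

print(switch_frequency_hz(1))   # one pixel at a time: 43.2 MHz
print(switch_frequency_hz(9))   # 9 non-interference regions: 4.8 MHz
```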
  • the maximum scanning angle of the scanning fiber in one non-interference region is the scan angle corresponding to the field of view of the region.
  • in the prior art, each fiber in the scanning fiber array needs to cover the total display field of view of the near-eye display system, so that the maximum scanning angle of each fiber is the scanning angle corresponding to the total display field of view. Since an area field of view is only a part of the total display field of view, the scanning angle corresponding to an area field of view is necessarily smaller than the scanning angle corresponding to the total display field of view. Therefore, the maximum scanning angle required of a scanning fiber in the embodiment of the present application is reduced, so that the scanning frequency of the scanning fiber can be increased.
  • each non-interference region can display all gray levels of the image corresponding to the image information.
  • A1 is taken as an example, and A1 uses 100 scanning fibers to display 256 gray levels.
  • when the channel switches corresponding to the 100 scanning fibers are all turned on, the displayed gray level is 256.
  • the near-eye display system further includes a concentrating lens array group.
  • the converging lens array group includes a first converging lens array and a second converging lens array.
  • the first converging lens array is disposed on a side of the scanning fiber array 30 near the human eye.
  • the second converging lens array is disposed on a side of the scanning fiber array 30 that is away from the human eye.
  • the first concentrating lens array and the second concentrating lens array may both be collimating lens arrays.
  • a first collimating lens array 50 is disposed on the side of the scanning fiber array 30 close to the human eye, and a second collimating lens array 51 is disposed on the side of the scanning fiber array 30 away from the human eye; the first collimating lens array 50 and the second collimating lens array 51 constitute a 1:1 telescope system. Since the scanning fiber array 30 is encapsulated in the transparent substrate 305, the ambient light enters the human eye through the 1:1 telescope system, and because it passes through a 1:1 telescope system it is neither magnified nor reduced, so the user can perceive the external environment realistically.
  • the optical path principle of the 1:1 telescope system is shown in FIG. 7.
  • the first converging lens array and the second converging lens array may also be electrically controlled liquid microlens arrays.
  • the electrically controlled liquid microlens array can be, for example, an electrically controlled liquid crystal microlens array.
  • a first electronically controlled liquid crystal microlens array 52 is disposed on a side of the scanning fiber array 30 near the human eye
  • a second electronically controlled liquid crystal microlens array 53 is disposed on a side of the scanning fiber array 30 away from the human eye.
  • the first electrically controlled liquid crystal microlens array 52 and the second electrically controlled liquid crystal microlens array 53 constitute a 1:1 afocal system.
  • since the scanning fiber array 30 is encapsulated in the transparent substrate 305, the ambient light enters the human eye through the 1:1 afocal system; because it passes through a 1:1 afocal system, the outside world is neither magnified nor reduced, so the user can perceive the external environment realistically.
  • the near-eye display system may further include a single converging lens array disposed on the side of the scanning fiber array 30 close to the human eye; the converging lens array is disposed on the outgoing light path of the scanning fiber array 30 and is used to collimate the light beam exiting the scanning fiber array 30.
  • the concentrating lens array may be a collimating lens array or an electrically controlled liquid microlens array. The following is specifically described by taking the condensing lens array as an electrically controlled liquid crystal microlens array as an example.
  • a first electrically controlled liquid crystal microlens array 52 is disposed on the side of the scanning fiber array 30 near the human eye for collimating the light beam emitted from the scanning fiber array 30 to display a virtual image.
  • when the near-eye display system including the first electronically controlled liquid crystal microlens array 52 is used for augmented reality display, a dimming structure 54 needs to be disposed on the side of the scanning fiber array 30 away from the human eye.
  • the dimming structure 54 may specifically be a PDLC film layer with an optical switch.
  • the virtual image and the real environment are displayed in a time-division manner. Assume that the refresh rate required by the human eye is 30 Hz; the time period corresponding to the refresh rate is divided into two segments. One segment is used to display the virtual image, and during this segment the optical switch of the PDLC film layer is turned off so that the PDLC film layer is opaque. The other segment is used to observe the actual external environment.
  • the optical switch of the PDLC film layer is turned on, thereby applying a voltage to the PDLC film layer to make it transparent, so that the ambient light can pass through the PDLC film layer and the transparent substrate 305.
  • at the same time, no voltage is applied to the first electrically controlled liquid crystal microlens array 52. Since the array does not operate without an applied voltage, it has no light-converging or light-diverging function, that is, it neither deflects nor otherwise modulates the external ambient light. In this way, external ambient light can pass through the PDLC film layer and the transparent substrate 305 and then enter the human eye through the first electronically controlled liquid crystal microlens array 52, so that the actual external environment can be observed.
  • the time period corresponding to the refresh rate may also be divided into at least three segments, one or more of which are used to display the virtual image, while the remaining segment or segments are used to observe the actual external environment.
  • the controller is electrically connected to the beam splitting component and to the pupil position detector, so that the controller obtains the relative position of the user's pupil and the scanning fiber array from the pupil position acquired by the pupil position detector, and then turns each output channel in the beam splitting component on or off according to that relative position and the display field-of-view gray level of the image information. It follows that the corresponding output channels can be turned on automatically according to the relative position of the user's pupil and the scanning fiber array, so that the selected subset of output channels is evenly distributed around the pupil position, thereby ensuring that all of the output beams enter the human eye and that the human eye receives all of the output beams at any viewing position.
  • the position of the human eye is therefore not restricted, the user can be given a good augmented reality experience, and the user does not need to perform interpupillary-distance adjustment of the augmented reality device, which avoids the situation where an inaccurate adjustment prevents the user from obtaining a good augmented reality experience.
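A rough sketch of this selection logic is given below. It is not the patent's algorithm: the 30*30 grid, the 1 mm fiber pitch, the rounding conventions and the uniform-spread heuristic used to place the opened channels around the pupil are all assumptions made for illustration; the description only specifies that the number of opened channels follows the display gray level and that the opened subset is chosen by nearest-neighbour interpolation around the detected pupil position.

```python
import numpy as np

def select_output_channels(gray_level, pupil_offset_mm, m=30, n=30,
                           fiber_pitch_mm=1.0, max_gray=255):
    """Return an m*n boolean mask of channel switches to turn on for the
    current display-field gray level, centred on the detected pupil."""
    total = m * n
    # Channels needed for this gray level (e.g. 160/256 * 900 = 562.5 -> 563).
    needed = int(np.ceil(gray_level / (max_gray + 1) * total))
    side = int(round(np.sqrt(needed)))           # open roughly a K*F = side*side subset

    dx, dy = pupil_offset_mm                     # pupil offset from the array centre
    col_shift = dx / fiber_pitch_mm              # expressed in fiber indices
    row_shift = dy / fiber_pitch_mm

    def pick(count, limit, shift):
        # Spread `count` indices evenly over the array, shift them toward the
        # pupil, then snap each to the nearest real row/column index.
        span = np.linspace(0, limit - 1, count) + shift
        return np.unique(np.clip(np.rint(span), 0, limit - 1).astype(int))

    rows, cols = pick(side, m, row_shift), pick(side, n, col_shift)
    mask = np.zeros((m, n), dtype=bool)          # True = channel switch turned on
    mask[np.ix_(rows, cols)] = True
    return mask

# Example: zero-field gray level 160, pupil shifted 3 mm to the left.
mask = select_output_channels(160, (-3.0, 0.0))
print(int(mask.sum()), "channels opened")
```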
  • when the output beams from the beam splitting component are transmitted through the scanning fibers in the scanning fiber array, the controller controls the scanning fibers according to the correspondence between the scanning fibers and the S non-interference regions, so that they emit the output beams to form S field-of-view light beams, and the S field-of-view light beams are projected to the human eye.
  • the near-eye display system can display S field of view light at each moment, that is, S pixels, where S is an integer not less than 2.
  • in the prior art, only one pixel can be displayed at each moment. In this way, the switching frequency of the channel switches can be effectively reduced, and with the switching frequency per unit time reduced, the energy utilization rate is also increased.
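The following sketch reproduces the arithmetic behind these two claims, using the worked numbers quoted later in the description (an 8 mm * 8 mm exit pupil, a 40° * 40° display field of view, a 20 mm eye relief, and an 800*600, 30 Hz, RGB image). Rounding each axis of formula (1) up to an integer is an assumption made so the result matches the S = 9 example; everything else follows directly from the quoted figures.

```python
import math

def non_interference_regions(exit_pupil_mm, fov_deg, eye_relief_mm):
    """Formula (1): number of simultaneously drivable, non-overlapping fiber
    regions, computed per axis and multiplied (each axis rounded up)."""
    def per_axis(b, c):
        return math.ceil((2 * eye_relief_mm * math.tan(math.radians(c) / 2) + b) / b)
    (b1, b2), (c1, c2) = exit_pupil_mm, fov_deg
    return per_axis(b1, c1) * per_axis(b2, c2)

def min_switch_frequency_hz(width, height, refresh_hz, colors, pixels_at_once):
    """Lowest channel-switch rate needed when `pixels_at_once` pixels are shown per instant."""
    return width * height * refresh_hz * colors / pixels_at_once

S = non_interference_regions(exit_pupil_mm=(8, 8), fov_deg=(40, 40), eye_relief_mm=20)
print(S)                                                         # 9
print(min_switch_frequency_hz(800, 600, 30, 3, S) / 1e6, "MHz")  # 4.8 MHz with 9 regions
print(min_switch_frequency_hz(800, 600, 30, 3, 1) / 1e6, "MHz")  # 43.2 MHz one pixel at a time
```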
  • Embodiment 3:
  • a second aspect of the embodiments of the present invention further provides a virtual reality device, comprising two sets of near-eye display systems as described in the first aspect, wherein the first near-eye display system corresponds to a person's left eye, and the second near-eye display system corresponds to the person's right eye.
  • the virtual reality device may further include a housing, and the first near-eye display system and the second near-eye display system are both disposed in the housing.
  • Embodiment 4:
  • a third aspect of the embodiments of the present invention further provides an augmented reality device, comprising two sets of near-eye display systems as described in the first aspect, wherein the first near-eye display system corresponds to a person's left eye, and the second near-eye display system corresponds to the person's right eye.
  • external ambient light enters the person's left eye through the converging lens array group of the first near-eye display system and enters the person's right eye through the converging lens array group of the second near-eye display system.
  • the augmented reality device may further include a housing, and the first near-eye display system and the second near-eye display system are both disposed in the housing.
  • the converging lens array group includes a first converging lens array and a second converging lens array; the first converging lens array is disposed on the side of the scanning fiber array 30 near the human eye, and the second converging lens array is disposed on the side of the scanning fiber array 30 away from the human eye.
  • the first converging lens array and the second converging lens array may both be collimating lens arrays.
  • a first collimating lens array 50 is disposed on a side of the scanning fiber array 30 close to the human eye
  • a second collimating lens array 51 is disposed on a side of the scanning fiber array 30 away from the human eye
  • the first collimating lens array 50 and the second collimating lens array 51 constitute a 1:1 telescope system. Since the scanning fiber array 30 is encapsulated in the transparent substrate 305, ambient light enters the human eye through the 1:1 telescope system, and because it enters through a 1:1 telescope system it is neither magnified nor reduced, so that the user perceives the external environment more realistically.
  • the optical path principle of the 1:1 telescope system is shown in Figure 7.
  • the first converging lens array and the second converging lens array may also be electrically controlled liquid microlens arrays.
  • the electrically controlled liquid microlens array can be, for example, an electrically controlled liquid crystal microlens array.
  • a first electronically controlled liquid crystal microlens array 52 is disposed on a side of the scanning fiber array 30 near the human eye
  • a second electronically controlled liquid crystal microlens array 53 is disposed on a side of the scanning fiber array 30 away from the human eye.
  • the first electrically controlled liquid crystal microlens array 52 and the second electrically controlled liquid crystal microlens array 53 constitute a 1:1 afocal system.
  • since the scanning fiber array 30 is encapsulated in the transparent substrate 305, ambient light enters the human eye through the 1:1 afocal system, and because it enters through a 1:1 afocal system it is neither magnified nor reduced, so that the user perceives the external environment more realistically.
  • the converging lens array group may alternatively consist of a first electronically controlled liquid crystal microlens array 52 and a dimming structure 54.
  • when a near-eye display system including the first electronically controlled liquid crystal microlens array 52 is used for augmented reality display, it is necessary to provide a dimming structure 54 on the side of the scanning fiber array 30 that is away from the human eye.
  • the dimming structure 54 may specifically be a PDLC film layer with an optical switch.
  • the virtual image and the real environment are displayed in a time-division manner. It is assumed that the refresh rate of the human eye is 30 Hz, and the time period corresponding to the refresh rate is divided into two segments.
  • one segment is used to display the virtual image; the optical switch of the PDLC film layer is turned off during this segment, so that the PDLC film layer is opaque.
  • Another period of time is used to observe the actual external environment.
  • during this segment, the optical switch of the PDLC film layer is turned on, applying a voltage to the PDLC film layer to make it transparent, so that ambient light can pass through the PDLC film layer and the transparent substrate 305.
  • no voltage is applied to the first electrically controlled liquid crystal microlens array 52.
  • since the first electronically controlled liquid crystal microlens array 52 does not operate without an applied voltage, it has no light-converging or light-diverging function, that is, it does not deflect light and does not deflect the external ambient light. In this way, external ambient light can pass through the PDLC film layer and the transparent substrate 305 and then enter the human eye through the first electronically controlled liquid crystal microlens array 52, thereby realizing observation of the actual external environment.
  • the time period corresponding to the refresh rate may also be divided into at least three segments, one or more of which are used to display the virtual image, and at least one of the remaining segments is used to observe the actual external environment.
  • the controller is electrically connected to the beam splitting component and to the pupil position detector, so that the controller obtains the relative position of the user's pupil and the scanning fiber array from the pupil position acquired by the pupil position detector, and then determines the opening and closing of each output channel according to that relative position and the display field-of-view gray level of the image information. It follows that the corresponding output channels can be turned on automatically according to the relative position of the user's pupil and the scanning fiber array, so that the selected subset of output channels is evenly distributed around the pupil position, thereby ensuring that all of the output beams enter the human eye and that the human eye receives all of the output beams at any viewing position.
  • the position of the human eye is therefore not restricted, the user can be given a good augmented reality experience, and the user does not need to perform interpupillary-distance adjustment of the augmented reality device, which avoids the situation where an inaccurate adjustment prevents the user from obtaining a good augmented reality experience.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

A near-eye display system, a virtual reality device and an augmented reality device. The near-eye display system comprises a laser light source (10), a beam splitting component (20), a scanning fiber array (30), a pupil position detector and a controller (40). The beam splitting component (20) comprises M*N output channels, where M and N are both integers not less than 2. The laser light source (10) is used to output laser light modulated according to image information, and the beam splitting component (20) is used to split the output laser light into M*N beams. The controller (40) is used to obtain the relative position of the user's pupil and the scanning fiber array (30) according to the pupil position acquired by the pupil position detector, and then to control the turning on and off of each output channel in the beam splitting component (20) according to the relative position and the display field-of-view gray level of the image information. The scanning fibers in the scanning fiber array (30) are coupled to the M*N output channels, and are used to transmit the output beams from the beam splitting component (20) and project the output beams to the human eye.

Description

近眼显示系统、虚拟现实设备及增强现实设备
本申请要求享有2016年7月1日提交的名称为“近眼显示系统、虚拟现实设备及增强现实设备”的中国专利申请CN201610519542.9的优先权,其全部内容通过引用并入本文中。
技术领域
本发明涉及视觉技术领域,尤其涉及一种近眼显示系统、虚拟现实设备及增强现实设备。
背景技术
增强现实(英文:Augmented Reality;简称:AR)是利用虚拟物体或信息对真实场景进行现实增强的技术。增强现实技术通常基于摄像头等图像采集设备获得的真实物理环境影像,通过计算机系统识别分析及查询检索,将与之存在关联的文本内容、图像内容或图像模型等虚拟生成的虚拟图像显示在真实物理环境影像中,从而使用户能够获得身处的现实物理环境中的真实物体的标注、说明等相关扩展信息,或者体验到现实物理环境中真实物体的立体的、突出强调的增强视觉效果。
现有的增强现实设备一般通过光学透镜将虚拟图像的光线会聚到用户的瞳孔中,对人眼观察的位置有较严格的限制。在用户的瞳孔位置发生变化时,例如用户的眼球转动,或者两个瞳距不同的用户先后使用同一个增强现实设备的时候,需要用户对增强现实设备的瞳距调节,或者由增强现实设备自动进行瞳距调节。但目前这两者的精度都不高,会导致虚拟图像的光线无法全部进入人眼,从而使得增强现实设备无法向用户发送虚拟图像,或者发送的虚拟图像的效果不佳,继而无法给用户以良好的增强现实体验。
因此,现有技术中存在因增强现实设备对人眼观察的位置有较严格的限制而导致无法给用户以良好的增强现实体验的技术问题。
发明内容
本发明提供一种近眼显示系统、虚拟现实设备及增强现实设备,其能够根据用户瞳 孔位置自动开通对应的输出通道来输出光束,并且控制所述输出光束全部进入人眼,由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果。
本申请实施例第一方面提供了一种近眼显示系统,其特征在于,包括激光光源、分光组件、扫描光纤阵列、瞳孔位置检测器和控制器,所述分光组件包括M*N个输出通道,M和N均为不小于2的整数;所述激光光源用于输出根据影像信息调制的激光;所述分光组件用于将所述激光光源输出的激光分成M*N个光束;所述瞳孔位置检测器用于获取用户瞳孔的瞳孔位置;所述控制器分别电性连接所述分光组件和所述瞳孔位置检测器,用于根据所获取的所述瞳孔位置,获取所述用户瞳孔与所述扫描光纤阵列的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制所述分光组件中的每个输出通道的开通和断开;所述扫描光纤阵列中的扫描光纤与所述M*N个输出通道耦合,用于传输从所述分光组件输出的输出光束,并将所述输出光束投射至人眼。
可选地,所述激光光源包括三色激光光源、准直镜组、合束器、耦合器和耦合光纤,其中,所述三色激光光源输出三色激光;所述准直镜组设置于所述三色激光光源的出射光路上,用于对所述三色激光进行准直处理;所述合束器设置于所述准直镜组的出射光路上,用于将所述准直镜组出射的激光进行合束处理;所述耦合器设置于所述合束器的出射光路上,用于将所述合束器出射的激光耦合到所述耦合光纤中;所述耦合光纤与所述耦合器相连,所述耦合光纤用于传输经所述耦合器耦合的激光。
可选地,所述扫描光纤阵列包括水平方向扫描光纤束和垂直方向扫描光纤束,所述水平方向扫描光纤束用于对水平出射光束进行扩束,所述垂直方向扫描光纤束用于对垂直出射光束进行扩束。
可选地,所述水平方向扫描光纤束为紧密排布或间隔排布的扫描光纤束,所述垂直方向扫描光纤束为紧密排布或间隔排布的扫描光纤束,其中,所述紧密排布的含义为每相邻两束光纤之间的间隔不大于预设距离,所述间隔排布的含义为每相邻两束光纤之间的间隔大于所述预设距离。
可选地,每束扫描光纤均包括扫描器,所述扫描器设置在所述扫描光纤上,用于将所述扫描光纤进行偏转,使得所述扫描光纤出射的光束也随所述扫描光纤偏转。
可选地,所述分光组件包括一个1*M型的第一光分路器、M个1*N型的第二光分路器和M*N个通道开关,所述第一光分路器的入射端与所述激光光源的出射端相连,所述M个第二光分路器与所述第一光分路器的M个出射端一一相连;所述M*N个通道开关和所述M*N个输出通道一一对应,用于控制所述M*N个输出通道的开通和断开。
可选地,所述分光组件包括一个M*N型的光分路器,所述M*N型的光分路器集成了M*N个通道开关,所述M*N个通道开关和所述M*N个输出通道一一对应,用于控制所述M*N个输出通道的开通和断开。
可选地,所述控制器电性连接所述扫描光纤阵列,用于根据预设条件,将所述扫描光纤阵列中的扫描光纤划分成S个非干涉区域,并且在同一时刻控制所述S个非干涉区域以显示S个视场光,其中S为不小于2的整数。
可选地,所述控制器用于根据出瞳直径的大小,将所述扫描光纤阵列中的扫描光纤划分成所述S个非干涉区域。
可选地,所述近眼显示系统还包括会聚透镜阵列组,所述会聚透镜阵列组包括第一会聚透镜阵列和第二会聚透镜阵列,所述第一会聚透镜阵列设置于所述扫描光纤阵列的靠近人眼一侧,所述第二会聚透镜阵列设置于所述扫描光纤阵列的远离人眼一侧。
可选地,所述第一会聚透镜阵列和所述第二会聚透镜阵列均为准直透镜阵列,且所述第一会聚透镜阵列和所述第二会聚透镜阵列组成1:1的望远系统。
可选地,所述第一会聚透镜阵列和所述第二会聚透镜阵列均为电控液体微透镜阵列,且所述第一会聚透镜阵列和所述第二会聚透镜阵列组成1:1的无焦系统。
可选地,所述近眼显示系统还包括会聚透镜阵列,所述会聚透镜阵列设置于所述扫描光纤阵列的靠近人眼一侧。
可选地,所述近眼显示系统还包括调光结构,所述调光结构设置于所述扫描光纤阵列的远离人眼一侧。
本申请实施例第二方面还提供了一种虚拟现实设备,其特征在于,包括两套如第一方面所述的近眼显示系统,其中第一近眼显示系统与人的左眼对应,第二近眼显示系统与人的右眼对应。
本申请实施例第三方面还提供了一种增强现实设备,其特征在于,包括两套如第一方面所述的近眼显示系统,其中第一近眼显示系统与人的左眼对应,第二近眼显示系统与人的右眼对应,并且,外界环境光通过所述第一近眼显示系统的会聚透镜阵列组进入人的左眼,并通过所述第二近眼显示系统的会聚透镜阵列组进入人的右眼。
本发明的有益效果如下:
基于上述技术方案,本发明实施例中控制器分别电性连接分光组件和瞳孔位置检测器,使控制器根据所述瞳孔位置检测器获取的瞳孔位置,获取所述用户瞳孔与扫描光纤 阵列的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制所述分光组件中的每个输出通道的开通和断开。由此可知,能够根据所述用户瞳孔与扫描光纤阵列的相对位置自动开通对应的输出通道来输出光束,使得选取的输出通道子集均匀分布在所述瞳孔位置的四周,从而能够确保所述输出光束全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束。由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果,并且无需用户对增强现实设备进行瞳距调节,也因此避免了用户因调节结果不精确导致无法获得良好的增强现实体验的缺陷。
附图说明
图1为本发明实施例中近眼显示系统的结构示意图;
图2为本发明实施例中激光光源的结构示意图;
图3为本发明实施例中第二光分路器的结构示意图;
图4为本发明实施例中扫描光纤的第一种结构示意图;
图5为本发明实施例中扫描光纤的第二种结构示意图;
图6为本发明实施例中扫描光纤阵列和准直透镜阵列组的结构示意图;
图7为本发明实施例中1:1的望远系统的光路原理图;
图8为本发明实施例中M行N列扫描光纤的排列图;
图9为本发明实施例中用户瞳孔与透明基板的相对位置的结构示意图;
图10为本发明实施例中扫描光纤阵列和电控液体透镜阵列组的结构示意图;
图11为本发明实施例中扫描光纤阵列、电控液体透镜阵列和调光结构的结构示意图;
图12为本发明实施例中扫描光纤阵列划分成S个非干涉区域的分布图。
附图中有关标记如下:
10——激光光源,101——红色激光光源,102——绿色激光光源,103——蓝色激光光源,104——准直镜组,1041——准直镜组,1042——准直镜组,1043——准直镜组,105——合束器,1051——二向色镜,1052——二向色镜,1053——二向色镜,106——耦合器,107——耦合光纤,20——分光组件,201——1*M型的第一光分路器,202——M个1*N型的第二光分路器,2021——第二光分路器,203——1*N个通道开关,204——N根耦出光纤,30——扫描光纤阵列,301——扫描光纤,302——PZT压电陶瓷,303—— 套管,304——固定座,305——透明基板,40——控制器,50——第一准直透镜阵列,51——第二准直透镜阵列,52——第一电控液晶微透镜阵列,53——第二电控液晶微透镜阵列,54——调光结构。
具体实施方式
本发明提供一种近眼显示系统、虚拟现实设备及增强现实设备,其能够根据用户瞳孔位置自动开通对应的输出通道来输出光束,并且控制所述输出光束全部进入人眼,由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果。
下面结合附图对本发明优选的实施方式进行详细说明。
实施例一:
如图1所示,本发明实施例第一方面提供了一种近眼显示系统,其包括激光光源10、分光组件20、扫描光纤阵列30、瞳孔位置检测器和控制器40。分光组件20包括M*N个输出通道,M和N均为不小于2的整数。激光光源10用于输出根据影像信息调制的激光。激光光源10输出的激光经过分光组件20后,被分成M*N个光束。所述瞳孔位置检测器用于获取用户瞳孔的瞳孔位置。控制器40分别电性连接分光组件20和所述瞳孔位置检测器,用于根据所获取的瞳孔位置,获取所述用户瞳孔与扫描光纤阵列30的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制分光组件20中的每个输出通道的开通和断开。扫描光纤阵列30中的扫描光纤与所述M*N个输出通道耦合,用于传输从分光组件20输出的输出光束,并将所述输出光束投射至人眼。
激光光源10可以为单色激光光源或多色激光光源。在激光光源10为单色激光管光源时,其用于显示单色图像。在激光光源10为多色激光光源时,其用于显示单色图像和多色图像。进一步地,激光光源10具体可以为三色激光光源,例如为RGB激光光源等。下面具体以三色激光光源为例进行说明。
本申请实施例中,所述影像信息的显示视场灰度为当前显示视场对应的像素点灰度。根据所述影像信息可以获取到所述影像信息对应的图像中每一个像素的灰度,从而可以获取所述显示视场灰度。例如当前显示视场为0°视场,则获取0°视场对应的像素点灰度,例如为0~255中的一个值。
具体地,参见图2,激光光源10包括红色激光光源101、绿色激光光源102和蓝色激光光源103。其中,红色激光光源101用于发射红色激光,绿色激光光源102用于发射绿色激光,蓝色激光光源103用于发射蓝色激光。
继续参见图2,激光光源10还包括准直镜组104、合束器105、耦合器106和耦合光纤107。准直镜组104设置于激光光源10的出射光路上,用于对激光光源10发射的激光进行准直处理。准直镜组104包括准直镜1041、准直镜1042和准直镜1043,其中,准直镜1041设置于红色激光光源101的出射光路上,用于对红色激光进行准直处理,准直镜1042设置于绿色激光光源102的出射光路上,用于对绿色激光进行准直处理,并且准直镜1043设置于蓝色激光光源103的出射光路上,用于对蓝色激光进行准直处理。当然,激光光源10还可以是由红色激光光源101、绿色激光光源102、蓝色激光光源103、合束器105、耦合器106和耦合光纤107组成,而未包含准直镜组104。耦合光纤107可以是晶状体纤维光纤,例如二氧化硅光导纤维。
继续参见图2,合束器105设置于准直镜组104的出射光路上,用于将准直镜组104出射的激光进行合束处理。合束器105包括二向色镜1051、二向色镜1052和二向色镜1053,其中,二向色镜1051反射红光并且透射绿光,二向色镜1052透射绿光,二向色镜1053透射红光、绿光并且反射蓝光,从而将准直镜组104发出的激光合成为一个光束,在此就不再赘述了。
继续参见图2,耦合器106设置于合束器105的出射光路上,用于将合束器105出射的激光耦合到耦合光纤107中。耦合光纤107与耦合器106相连,耦合光纤107用于传输经耦合器106耦合的激光。
具体地,分光组件20可以为一个M*N型的光分路器,其包括M*N个输出通道。所述M*N型的光分路器集成了M*N个通道开关,所述M*N个通道开关和所述M*N个输出通道一一对应,用于控制所述M*N个输出通道的开通和断开。当然,分光组件20还可以是一个M*N型的光分路器和M*N个通道开关。所述通道开关可以是光开关或光衰减器等。在所述通道开关为光开关时,其能够控制输出通道的开通和断开。在所述通道开关为光衰减器时,其不仅能够控制输出通道的开通和断开,还能够控制输出通道输出光束的能量。在输出通道开通时,光束通过输出通道传输至扫描光纤阵列30。在输出通道断开时,光束不能通过输出通道传输至扫描光纤阵列30。下面具体以光开关为例进行说明。
在所述通道开关为光衰减器时,若通过光衰减器控制对应的输出通道的输出能量为0,即可以确定该输出通道已断开,若通过光衰减器控制对应的输出通道的输出能量大于0,即可以确定该输出通道已开通。
具体来讲,参见图1,分光组件20还可以是一个1*M型的第一光分路器201、M个 1*N型的第二光分路器202和M*N个通道开关。第一光分路器201的入射端与激光光源10的出射端相连,即第一光分路器201的入射端与耦合光纤107相连。M个第二光分路器202与第一光分路器201的M个出射端一一相连,从而形成M*N个输出通道。所述M*N个通道开关和所述M*N个输出通道一一对应,用于控制所述M*N个输出通道的开通和断开,进而使得每个输出通道可以通过相应的通道开关来独立控制该输出通道的开通和断开。每个输出通道均为一根光纤,使得所述M*N个输出通道即为M*N根光纤。
分光组件20在将激光光源10输出的激光分成M*N个光束时,将激光光源10输出的激光分成能量相等的M*N个光束。例如红色激光光源101设置的最大输出能量为ER,经分光组件20分束后最终光纤分束器的每一输出通道的出射端出射的最大能量为:
ER/(M*N)。
图像的灰度等级受扫描光纤的约束,因此M*N束扫描光纤可以实现M*N个灰度等级。若所述影像信息对应的图像的灰度等级为8位,即有256个灰度等级,则单位灰度对应的红色激光需求能量为ER/256。
继续参见图3,以M个第二光分路器202中的一个第二光分路器2021为例进行说明。第二光分路器2021的入射端与第一光分路器201的一个出射端相连。第二光分路器2021的N个出射端与1*N个通道开关203一一相连。1*N个通道开关203的输出端还可以连接N根耦出光纤204,用于与扫描光纤阵列30中的扫描光纤相连。1*N个通道开关203用于控制第二光分路器2021的N个出射端的开通和断开,进而控制第二光分路器2021的N个输出通道的开通和断开。
具体地,扫描光纤阵列30包括M*N束扫描光纤。所述M*N个输出通道输出的光束耦合进所述M*N束扫描光纤,再通过所述M*N束扫描光纤将所述M*N个输出通道输出的光束进行偏转,并且将偏转后的光束投射至人眼。
具体地,参见图1,扫描光纤阵列30可以组成一个扫描光纤面板。进一步地,扫描光纤阵列30可以包括水平方向扫描光纤束和垂直方向扫描光纤束。所述水平方向扫描光纤束用于对水平出射光束进行扩束;所述垂直方向扫描光纤束用于对垂直出射光束进行扩束。如此,通过水平和垂直方向的扩束,以提高所述近眼显示系统的显示视场,例如可以显示120°、130°和140°的视场,使得显示视场与人眼的视场更匹配。
所述水平方向扫描光纤束为紧密排布或间隔排布的扫描光纤束。所述垂直方向扫描光纤束为紧密排布或间隔排布的扫描光纤束。所述紧密排布的含义为每相邻两束光纤之间的间隔不大于预设距离,所述间隔排布的含义为每相邻两束光纤之间的间隔大于预设距离。 所述预设距离根据实际情况来设定。所述预设距离可以为不小于25微米(um)的值,例如为25um、30um和35um等,本申请不作具体限制。
具体地,每束扫描光纤均包括扫描器,所述扫描器设置在所述扫描光纤上,用于将所述扫描光纤进行偏转,使得所述扫描光纤出射的光束也随之偏转,从而实现了水平和垂直方向上的扩束。所述扫描器具体可以为二维扫描器,如PZT压电陶瓷等。扫描光纤在PZT压电陶瓷驱动下在水平方向和垂直方向进行偏转(二维扫描),将激光光源10输出的激光处理为图像光束,从而实现将虚拟图像发送到用户眼睛中的目的。
具体来讲,参见图4,扫描光纤阵列30中的一束扫描光纤301包括PZT压电陶瓷302。扫描光纤301设置在套管303中。PTZ压电陶瓷302通过固定座304固定在套管303中,且PZT压电陶瓷302设置在扫描光纤301上。固定座304的两端均与套管303的内壁连接,以使得固定座304固定在套管303中。当然,固定座304还可以仅一端与套管303的内壁连接,同样使得固定座304固定在套管303中,具体参见图5。
本申请实施例中,扫描光纤阵列30中的扫描光纤的出射端面可以是平面,也可以是曲面。扫描光纤可以是晶状体纤维光纤,例如二氧化硅光导纤维。该种光纤可以获得出射光斑束腰极小和大数值孔径的光束。在扫描光纤的出射端面是具有一定弧度的凹曲面时,凹曲面将会聚光束,使得每一根扫描光纤的最大扫描角度减小,进而提高扫描光纤阵列30中光纤扫描的频率。在扫描光纤的出射端面是具有一定弧度的凸曲面时,凸曲面将发散光束,使得每一根扫描光纤的最大扫描角度增大,进而降低扫描光纤阵列30中光纤扫描的频率。
本申请实施例中,参见图6,扫描光纤阵列30可以封装在透明基板305中。扫描光纤301为裸光纤外层涂覆一层极薄的透明保护胶,即涂覆层。每相邻两根扫描光纤之间的空隙用与所述涂覆层折射率相同或近似的材料填充。透明基板305为透明度大于预设透明度的基板。所述预设透明度的取值范围为75%-100%,即可以为75%-100%之间的任意一个值,例如为75%、85%和100%等。
本申请另一实施例中,所述近眼显示系统还包括会聚透镜阵列组。所述会聚透镜阵列组包括第一会聚透镜阵列和第二会聚透镜阵列。所述第一会聚透镜阵列设置于扫描光纤阵列30的靠近人眼一侧,所述第二会聚透镜阵列设置于扫描光纤阵列30的远离人眼一侧。
具体地,所述第一会聚透镜阵列和所述第二会聚透镜阵列可以均为准直透镜阵列。如图6所示,在扫描光纤阵列30的靠近人眼一侧设置第一准直透镜阵列50,在扫描光纤阵列30的远离人眼一侧设置第二准直透镜阵列51,且第一准直透镜阵列50和第二准直透 镜阵列51组成1:1的望远系统。由于扫描光纤阵列30封装在透明基板305中,使得外界环境光通过1:1的望远系统进入人眼,且由于外界环境光是通过1:1的望远系统进入人眼的,不会对外界进行放大或缩小,使得用户能够更真实的感受外界环境。1:1的望远系统的光路原理具体如图7所示。
具体地,第一准直透镜阵列50设置在扫描光纤阵列30的出射光路上,用于对扫描光纤阵列30出射的光束进行准直处理。下面具体以第一准直透镜阵列50中的一个准直透镜为例进行说明。
参见图4和图5,在本申请实施例中,扫描光纤301的出射光路上还设置有准直透镜501。准直透镜501用于将PZT压电陶瓷302扫描出射的锥形光束进行准直处理,使得其能够以近似平行的方式投射到人眼中。
本申请实施例中,所述瞳孔位置检测器可以设置在透明基板305的侧面上,当然也可以设置在透明基板305的下方。所述瞳孔位置检测器具体可以为位置检测传感器,用于实时获取用户瞳孔的瞳孔位置,并将所述瞳孔位置传输给控制器40。当然,也可以是由控制器40实时获取所述瞳孔位置检测器获取到的瞳孔位置。
本申请实施例中,控制器40获取到所述瞳孔位置之后,根据所述瞳孔位置和预先存储的扫描光纤阵列30的阵列位置,获取到用户瞳孔与扫描光纤阵列30的相对位置,具体可以是获取用户瞳孔与扫描光纤阵列30中心的相对位置。在扫描光纤阵列30封装在透明基板305中时,所述相对位置还可以是用户瞳孔相对于透明基板305中心的位置。所述阵列位置可以存储在外部存储硬件中,此时,控制器40从所述外部存储硬件中读取所述阵列位置。所述外部存储硬件例如可以是存储卡、硬盘、USB设备等存储设备。当然,所述阵列位置还可以存储在控制器40中的存储空间中,本申请不作具体限制。
本申请实施例中,控制器40可以是单片机、处理芯片和控制电路等。进一步地,所述近眼显示系统应用于单眼,而应用于双眼时需要使用两套所述近眼显示系统。
本申请实施例中,控制器40根据所述相对位置和所述影像信息的显示视场灰度,控制分光组件20中的每个输出通道的开通和断开。具体为:控制器40根据所述相对位置和所述影像信息的显示视场灰度,从所述M*N个输出通道中选取K*F个输出通道并将其开通,而将未被选取的输出通道断开。由于扫描光纤阵列30中的M*N根扫描光纤与所述M*N个输出通道耦合,因此,在K*F个输出通道开通时,K*F个输出通道输出的输出光束会传输至与其对应的K*F根扫描光纤,然后K*F根扫描光纤将所述输出光束进行扫描并投射至人眼。其中,K和F均为正整数。
由于K*F个输出通道是根据所述相对位置和所述影像信息的显示视场灰度来选取的,而所述相对位置是用户瞳孔相对于扫描光纤阵列30中心的位置,由此可以确定K*F个输出通道是根据所述瞳孔位置来选取的,使得选取的K*F个输出通道均匀分布在所述瞳孔位置的四周。如此,能够确保选取的K*F个输出通道输出的输出光束通过扫描光纤能够全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束,由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果。
本申请实施例中,控制器40可以根据所述相对位置和所述影像信息的显示视场灰度,采用插值法从所述M*N个输出通道中选取K*F个输出通道,使得选取的K*F个输出通道均匀分布在所述瞳孔位置的四周,以确保选取的K*F个输出通道输出的输出光束通过扫描光纤能够全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束。所述插值法例如可以是最近邻点插值法等。
本申请实施例中,控制器40可以首先根据所述影像信息的显示视场灰度,从所述M*N个输出通道中确定需要开通的输出通道的数量,再根据所述相对位置,采用最近邻点插值法选取所述K*F个输出通道。
例如,参见图8和图9,每个视场对应的通道均为M*N个输出通道。相应地,由于分光组件20具有M*N个输出通道,使得扫描光纤阵列30必然会存在与M*N个输出通道对应的M行N列的扫描光纤,即M*N根扫描光纤。假设M=N=30,则有30*30个输出通道。所述近眼显示系统在初始状态下,默认为用户瞳孔60中心位于透明基板305的中心轴线306上。若所述影像信息对应的图像为8bit,即有256个灰度级别,而当前显示的零视场的灰度值为160,则对应的需开通的输出通道的数量为:
160/256*30*30=562.5。
由于输出通道的数量需取整数,即,需要在30*30个输出通道中选取563个输出通道并将其开通。通过采用最近邻点插值法在30*30通道中均匀地选取563个通道。采取最近邻点插值法得到输出通道开通的行和列具体如下表1所示:
[图片：开通输出通道的行序号与列序号，图片内容未收录]
表1
由表1可知,开通的输出通道的数量为24×24=576,其中,K=F=24,开通的输出通道的行列号具体如表1所示。
而用户瞳孔60中心可能不位于透明基板305的中心轴线306,例如,如图9所示,用户瞳孔60向左偏移了3毫米(mm)。对于上述的零视场显示,同样需要从30*30个输出通道中选取563个输出通道并将其开通。通过采用最近邻点插值法在30*30通道中均匀地选取563通道。采取最近邻点插值法得到输出通道开通的行和列具体如下表2所示:
[图片：瞳孔左移3mm后开通输出通道的行序号与列序号，图片内容未收录]
表2
由表2可知,开通的输出通道的数量同样为24×24=576,但是由于用户瞳孔60向左偏移了3mm,开通的输出通道也随之发生变化,使得开通的输出通道均匀分布在用户瞳孔60的四周。这能够确保选取的K*F个输出通道输出的输出光束通过扫描光纤能够全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束。
由于所述M*N个输出通道与扫描光纤阵列30中的M行N列扫描光纤对应,由此可知,表1和表2中的第一行同样也表示光纤扫描阵列30中扫描光纤的行序号,表1和表2中的第二行表示光纤扫描阵列30中扫描光纤的列序号。由此根据开通的输出通道行列号即可确定对应的扫描光纤的行列号。如表1和表2所示,在用户瞳孔60向左偏移了3mm时,开通的输出通道对应的扫描光纤也相应往左偏移了一些,从而能够确保选取的K*F个输出通道输出的输出光束通过扫描光纤能够全部进入人眼。
本申请实施例中,若所述通道开关为光开关,则将所述K*F个输出通道对应的光开光开通,将未被选取的输出通道对应的光开光断开,进而控制所述K*F个输出通道输出所述输出光束。未被选取的输出通道处于断开状态,没有光束传输至相应的扫描光纤中进行输出。
本申请实施例中,若所述通道开关为光衰减器,则通过光衰减器调节所述K*F个输出通道中每个输出通道的输出能量,使得所述K*F个输出通道的总输出能量与显示视场灰度所需能量的差值不大于预设阈值。所述预设阈值根据实际情况来设定。所述预设阈值 例如可以取不大于20*ER/(M*N)的值,以使得所述K*F个输出通道的总输出能量与显示视场灰度所需能量相同或差值很小。通过光衰减器控制未被选取的输出通道的输出能量为0。如此,能够有效提高图像显示的效果。
在本申请另一实施例中,所述第一会聚透镜阵列和所述第二会聚透镜阵列还可以均为电控液体微透镜阵列。所述电控液体微透镜阵列例如可以为电控液晶微透镜阵列。如图10所示,在扫描光纤阵列30的靠近人眼一侧设置第一电控液晶微透镜阵列52,在扫描光纤阵列30的远离人眼一侧设置第二电控液晶微透镜阵列53,且第一电控液晶微透镜阵列52和第二电控液晶微透镜阵列53组成1:1的无焦系统。由于扫描光纤阵列30封装在透明基板305中,使得外界环境光通过1:1的无焦系统进入人眼,且由于外界环境光是通过1:1的无焦系统进入人眼的,不会对外界进行放大或缩小,使得用户能够更真实的感受外界环境。
具体地,第一电控液晶微透镜阵列52设置在扫描光纤阵列30的出射光路上,用于对扫描光纤阵列30出射的光束进行准直处理。
由于第一电控液晶微透镜阵列52和第二电控液晶微透镜阵列53不加电压不工作,第一电控液晶微透镜阵列52和第二电控液晶微透镜阵列53无光会聚或发散的功能,即不呈现光偏折的作用,不对外界环境光有光转折。如此,使得外界环境光能够通过第一电控液晶微透镜阵列52和透明基板305之后通过第二电控液晶微透镜阵列53进入人眼,实现观察到现实外界环境。
在本申请的另一实施例中,所述近眼显示系统还可以包括会聚透镜阵列,所述会聚透镜阵列设置于扫描光纤阵列30的靠近人眼一侧,且所述会聚透镜阵列设置在扫描光纤阵列30的出射光路上,用于对扫描光纤阵列30出射的光束进行准直处理。所述会聚透镜阵列可以为准直透镜阵列或电控液体微透镜阵列。下面具体以所述会聚透镜阵列为电控液晶微透镜阵列为例进行说明。
如图11所示,在扫描光纤阵列30的靠近人眼一侧设置第一电控液晶微透镜阵列52,用于对扫描光纤阵列30出射的光束进行准直处理,从而显示虚拟图像。
本申请实施例中,在使用包含第一电控液晶微透镜阵列52的近眼显示系统用于进行增强现实显示时,需要在扫描光纤阵列30的远离人眼一侧设置调光结构54。调光结构54具体可以是带有光开关的聚合物分散液晶(Polymer Dispersed Liquid Crystal,简称:PDLC)膜层。采用分时段方式显示虚拟图像和现实外界环境。假设人眼的刷新率为30Hz,将该刷新率对应的时间段分成2段。一段时间用于显示虚拟图像,此段时间内使PDLC 膜层的光开关断开,使得PDLC膜层呈不透明状态。另一段时间用于观察现实外界环境,此段时间内使PDLC膜层的光开关开通,从而对PDLC膜层施加电压,使其呈透明状态,使得外界环境光能够通过PDLC膜层和透明基板305。同时,不施加电压给第一电控液晶微透镜阵列52。由于第一电控液晶微透镜阵列52不加电压不工作,第一电控液晶微透镜阵列52无光会聚或发散的功能,即不呈现光偏折的作用,不对外界环境光有光转折。如此,使得外界环境光能够通过PDLC膜层和透明基板305之后通过第一电控液晶微透镜阵列52进入人眼,实现观察到现实外界环境。
本发明的有益效果如下:
基于上述技术方案,本发明实施例中控制器分别电性连接分光组件和瞳孔位置检测器,使控制器根据所述瞳孔位置检测器获取的瞳孔位置,获取所述用户瞳孔与扫描光纤阵列的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制所述分光组件中的每个输出通道的开通和断开。由此可知,能够根据所述用户瞳孔与扫描光纤阵列的相对位置自动开通对应的输出通道来输出光束,使得选取的输出通道子集均匀分布在所述瞳孔位置的四周,从而能够确保所述输出光束全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束。由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果,并且无需用户对增强现实设备进行瞳距调节,也因此避免了用户因调节结果不精确导致无法获得良好的增强现实体验的缺陷。
实施例二:
如图1所示,本发明实施例第一方面还提供了一种近眼显示系统,其包括激光光源10、分光组件20、扫描光纤阵列30、瞳孔位置检测器和控制器40。分光组件20包括M*N个输出通道,M和N均为不小于2的整数。激光光源10用于输出根据影像信息调制的激光。激光光源10输出的激光经过分光组件20后,被分成M*N个光束。所述瞳孔位置检测器用于获取用户瞳孔的瞳孔位置。控制器40分别电性连接分光组件20和所述瞳孔位置检测器,用于根据获取的瞳孔位置,获取所述用户瞳孔与扫描光纤阵列30的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制分光组件20中的每个输出通道的开通和断开。扫描光纤阵列30中的扫描光纤与所述M*N个输出通道耦合,用于传输从分光组件20输出的输出光束,并将所述输出光束投射至人眼。
本申请实施例中,控制器40中可以预存扫描光纤阵列30中的扫描光纤和S个非干涉区域的对应关系。所述对应关系包括每个非干涉区域对应的区域视场信息,其中,一个非干涉区域对应一个区域视场,S为不小于2的整数。
具体来讲,控制器40中预存有所述对应关系,并且,在扫描光纤阵列30中的扫描光纤输出所述输出光束时,控制器40根据所述对应关系,控制扫描光纤阵列30中的扫描光纤,使其出射所述输出光束以形成S个视场光,并将所述S个视场光投射至人眼。如此,使得每一时刻所述近眼显示系统能够显示S个视场光,即S个像素点。现有技术中每一时刻仅能显示一个像素点。如此,能够有效降低通道开关的开关频率,并且,在单位时间内开关频率降低的情况下,其能量利用率也会随之提高。
具体地,在控制器40控制扫描光纤阵列30中的扫描光纤之前,由于控制器40已开通了从所述M*N个输出通道中选取的K*F个输出通道,则控制器40可以根据所述对应关系,控制K*F个输出通道对应的K*F根扫描光纤出射所述输出光束以形成S个视场光,并将所述S个视场光投射至人眼。
具体来讲,在获取所述对应关系时,控制器40可以根据预设条件,将扫描光纤阵列30中的扫描光纤划分成S个非干涉区域,从而获取到所述对应关系,并将获取到的所述对应关系存储到控制器40的存储空间中,或存储在外部存储硬件中。所述对应关系被存储在外部存储硬件中时,控制器40需要从所述外部存储硬件中读取所述对应关系。所述外部存储硬件例如可以是存储卡、硬盘、USB设备等存储设备。
本申请实施例中,所述非干涉区域表征该区域与任何一个其它区域均不重叠。
由于扫描光纤阵列30中的扫描光纤被划分成了S个非干涉区域,使得扫描光纤阵列30能够在同一时刻控制所述S个非干涉区域以显示S个视场光,即每一时刻可以显示S个视场光,其中一个视场光对应一个像素点。
具体地,所述预设条件可以是预设划分方式,所述预设划分方式可以是根据扫描光纤阵列30中的扫描光纤的数量来将其划分成S个非干涉区域。在所述扫描光纤的数量大于预设数量时,将扫描光纤阵列30中的扫描光纤划分成H个非干涉区域,这时S=H。在所述扫描光纤的数量不大于预设数量时,将扫描光纤阵列30中的扫描光纤划分成J个非干涉区域,这时S=J。其中,H和J均为不小于2的整数,且H和J可以相同或不同。
当然,所述预设划分方式可以根据所述近眼显示系统的显示视场来划分。显示视场越大,其S的取值也越大;显示视场越小,其S的取值也越小。本申请不作具体限制。当然,所述预设划分方式还可以是直接设置S的取值,然后将扫描光纤阵列30中的扫描光纤划分成S个非干涉区域。
具体地,控制器40根据预设条件,将扫描光纤阵列30中的扫描光纤划分成S个非干涉区域。具体为:控制器40还可以根据出瞳直径的大小,将扫描光纤阵列30中的扫描 光纤划分成所述S个非干涉区域。其中,所述出瞳直径越大时,S的取值越大;所述出瞳直径越小时,S的取值越小。例如,若所述出瞳在水平和垂直方向的直径为10*8mm,则S=8;若所述出瞳在水平和垂直方向的直径为10*10mm,则S取大于8的整数,例如为10。
本申请实施例中,控制器40还可以根据出瞳直径的大小和所述近眼显示系统的显示视场,将扫描光纤阵列30中的扫描光纤划分成所述S个非干涉区域。
具体来讲,出瞳在水平方向的直径用B1表示,出瞳在垂直方向的直径用B2表示,所述近眼显示系统在水平方向的显示视场用C1表示,所述近眼显示系统在垂直方向的显示视场用C2表示,则可以将扫描光纤阵列30中M*N根扫描光纤划分成S个非干涉区域,其中,
S=[(2L*tan(C1/2)+B1)/B1]*[(2L*tan(C2/2)+B2)/B2]。    公式1
其中,L表示人眼到扫描光纤阵列30的距离。
例如设定出瞳在水平和垂直方向的直径为8*8mm,所述近眼显示系统的在水平和垂直方向的显示视场为40*40度,则,M*N束光通道可以分成
S=[(2L*tan(40°/2)+8)/8]*[(2L*tan(40°/2)+8)/8]个非干涉区域。
具体地,所述近眼显示系统每一时刻可同时显示S个视场光,并且每一非干涉区域分别对应一个区域视场。取L=20mm,则计算出S=9。如此,将扫描光纤阵列30的水平方向分成3个区域,每个区域均不重叠,并且将扫描光纤阵列30的垂直方向也分成3个区域,每个区域均不重叠,从而获取到9个非干涉区域。如图12所示,所述9个非干涉区域为A1、A2、A3、A4、A5、A6、A7、A8和A9非干涉区域,且每个非干涉区域均不重叠。
参见图12,A1在水平方向显示的视场为-20°~-7°,A1在垂直方向显示的视场为7°~20°。如此可知,A1显示的区域视场为{(-20°~-7°),(7°~20°)}。同理,A2显示的区域视场为{((-7°~7°),(7°~20°)};A3显示的区域视场为{(7°~20°),(7°~20°)};A4显示的区域视场为{(-20°~-7°),(-7°~7°)};A5显示的区域视场为{(-7°~7°),(-7°~7°)};A6显示的区域视场为{(7°~20°),(-7°~7°)};A7显示的区域视场为{(-20°~-7°),(-20°~-7°)};A8显示的区域视场为{(-7°~7°),(-20°~-7°)};并且A9显示的区域视场为{(7°~20°),(-20°~-7°)}。
在实际应用中,在每一时刻所述近眼显示系统能够显示9个视场光,即显示9个像素点。以显示一幅800*600的图像为例进行说明,其中人眼的刷新率取最低要求30Hz。若 显示的图像为RGB彩色图像,则需要的通道开关的开关频率最低为
800*600*30*3/9=4.8MHz。
扫描光纤的扫描频率最低为
[公式为图片，内容未收录]
现有技术中每一时刻仅能显示一个视场光,即一个像素点。同样以显示一幅800*600的单色图像为例进行说明,其中人眼的刷新率取最低要求30Hz。则现有技术中通道开关的开关频率最低为
800*600*30=14.4MHz。
扫描光纤的扫描频率最低为
[公式为图片，内容未收录]
若需显示RGB彩色图像,显示彩色图像需采用时序的方法,故现有技术中需要的通道开关的开关频率最低为
800*600*30*3=43.2MHz。
扫描光纤的扫描频率最低为
[公式为图片，内容未收录]
4.8MHz远远小于43MHz,因此,与现有技术相比,采用本申请上述实施例能够解决现有技术中通道开关的开关频率高的技术问题,实现有效降低通道开关的开关频率的效果,并且,在单位时间内开关频率降低的情况下,其能量利用率也会随之提高。
进一步地,由于本申请实施例中的一个非干涉区域仅对应一个区域视场,使得一个非干涉区域中的扫描光纤的最大扫描角为该区域视场对应的扫描角。而现有技术中扫描光纤阵列中的每根光纤需要对应所述近眼显示系统的总显示视场,使得每根光纤的最大扫描角为与所述总显示视场对应的扫描角。由于一个区域视场仅是所述总显示视场中的一部分,必然使得区域视场对应的扫描角小于所述总显示视场对应的扫描角。因此,本申请实施例中扫描光纤所需的最大扫描角减小了,从而能够提高扫描光纤的扫描频率。
本申请实施例中,扫描光纤阵列30中的扫描光纤被划分为S个非干涉区域之后,每个非干涉区域均能够显示所述影像信息对应的图像的所有灰度等级。
例如,如图12所示,以30*30根扫描光纤为例,将30*30根扫描光纤划分成9个非干涉区域,则每一个非干涉区域具有的扫描光纤的数量为30*30/9=100。若所述影像信息对应的图像的灰度等级为8位,即有256个灰度等级,则以A1为例,A1使用100根扫描光纤来显示256个灰度等级。在100根扫描光纤对应的通道开关均开通时,显示的灰度等级为256。在A1显示的视场灰度值为180时,需要的光纤数量为100*180/256=70.3。由于扫描光纤为整数,则需要使用71根扫描光纤显示值为180的视场灰度。
在本申请另一实施例中,所述近眼显示系统还包括会聚透镜阵列组。所述会聚透镜阵 列组包括第一会聚透镜阵列和第二会聚透镜阵列。所述第一会聚透镜阵列设置于扫描光纤阵列30的靠近人眼一侧。所述第二会聚透镜阵列设置于扫描光纤阵列30的远离人眼一侧。
具体地,所述第一会聚透镜阵列和所述第二会聚透镜阵列可以均为准直透镜阵列。如图6所示,在扫描光纤阵列30的靠近人眼一侧设置第一准直透镜阵列50,在扫描光纤阵列30的远离人眼一侧设置第二准直透镜阵列51,且第一准直透镜阵列50和第二准直透镜阵列51组成1:1的望远系统。由于扫描光纤阵列30封装在透明基板305中,使得外界环境光通过1:1的望远系统进入人眼,且由于外界环境光是通过1:1的望远系统进入人眼的,不会对外界进行放大或缩小,使得用户能够更真实的感受外界环境。1:1的望远系统的光路原理具体如图7所示。
在本申请另一实施例中,所述第一会聚透镜阵列和所述第二会聚透镜阵列还可以均为电控液体微透镜阵列。所述电控液体微透镜阵列例如可以为电控液晶微透镜阵列。如图10所示,在扫描光纤阵列30的靠近人眼一侧设置第一电控液晶微透镜阵列52,在扫描光纤阵列30的远离人眼一侧设置第二电控液晶微透镜阵列53,且第一电控液晶微透镜阵列52和第二电控液晶微透镜阵列53组成1:1的无焦系统。由于扫描光纤阵列30封装在透明基板305中,使得外界环境光通过1:1的无焦系统进入人眼,且由于外界环境光是通过1:1的无焦系统进入人眼的,不会对外界进行放大或缩小,使得用户能够更真实的感受外界环境。
在本申请的另一实施例中,所述近眼显示系统还可以包括会聚透镜阵列,所述会聚透镜阵列设置于扫描光纤阵列30的靠近人眼一侧,且所述会聚透镜阵列设置在扫描光纤阵列30的出射光路上,用于对扫描光纤阵列30出射的光束进行准直处理。所述会聚透镜阵列可以为准直透镜阵列或电控液体微透镜阵列,下面具体以所述会聚透镜阵列为电控液晶微透镜阵列为例进行说明。
如图11所示,在扫描光纤阵列30的靠近人眼一侧设置第一电控液晶微透镜阵列52,用于对扫描光纤阵列30出射的光束进行准直处理,从而显示虚拟图像。
本申请实施例中,在使用包含第一电控液晶微透镜阵列52的近眼显示系统用于进行增强现实显示时,需要在扫描光纤阵列30的远离人眼一侧设置调光结构54。调光结构54具体可以是带有光开关的PDLC膜层。采用分时段方式显示虚拟图像和现实外界环境。假设人眼的刷新率为30Hz,将该刷新率对应的时间段分成2段。一段时间用于显示虚拟图像,此段时间内使PDLC膜层的光开关断开,使得PDLC膜层呈不透明状态。另一段 时间用于观察现实外界环境,此段时间内使PDLC膜层的光开关开通,从而对PDLC膜层施加电压,使其呈透明状态,使得外界环境光能够通过PDLC膜层和透明基板305。同时,不施加电压给第一电控液晶微透镜阵列52。由于第一电控液晶微透镜阵列52不加电压不工作,第一电控液晶微透镜阵列52无光会聚或发散的功能,即不呈现光偏折的作用,不对外界环境光有光转折。如此,使得外界环境光能够通过PDLC膜层和透明基板305之后通过第一电控液晶微透镜阵列52进入人眼,实现观察到现实外界环境。
当然,还可以将所述刷新率对应的时间段分成至少3段,其中的一段或多段时间用于显示虚拟图像,剩下的至少一段时间用于观察现实外界环境。
本发明的有益效果如下:
其一、本发明实施例中,控制器分别电性连接分光组件和瞳孔位置检测器,使控制器根据所述瞳孔位置检测器获取的瞳孔位置,获取所述用户瞳孔与扫描光纤阵列的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制所述分光组件中的每个输出通道的开通和断开。由此可知,能够根据所述用户瞳孔与扫描光纤阵列的相对位置自动开通对应的输出通道来输出光束,使得选取的输出通道子集均匀分布在所述瞳孔位置的四周,从而能够确保所述输出光束全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束。由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果,并且无需用户对增强现实设备进行瞳距调节,也因此避免了用户因调节结果不精确导致无法获得良好的增强现实体验的缺陷。
其二、本发明实施例中,在通过扫描光纤阵列中的扫描光纤传输从分光组件输出的输出光束时,控制器根据所述扫描光纤阵列中的扫描光纤和S个非干涉区域的对应关系,控制所述扫描光纤阵列中的扫描光纤,使其出射所述输出光束以形成S个视场光,并将所述S个视场光投射至人眼。如此,使得每一时刻所述近眼显示系统能够显示S个视场光,即S个像素点,其中S为不小于2的整数。而现有技术中每一时刻仅能显示一个像素点。如此,能够有效降低通道开关的开关频率,并且,在单位时间内开关频率降低的情况下,其能量利用率也会随之提高。
实施例三:
本发明实施例第二方面还提供了一种虚拟现实设备,其包括两套如第一方面介绍的近眼显示系统,其中第一近眼显示系统与人的左眼对应,第二近眼显示系统与人的右眼对应。
在第一方面中已经详细介绍了近眼显示系统的具体结构以及运行过程,在此就不再 赘述了。
具体地,所述虚拟现实设备还可以包括外壳,所述第一近眼显示系统和所述第二近眼显示系统均设置在所述外壳中。
实施例四:
本发明实施例第三方面还提供了一种增强现实设备,其包括两套如第一方面介绍的近眼显示系统,其中第一近眼显示系统与人的左眼对应,第二近眼显示系统与人的右眼对应。外界环境光通过所述第一近眼显示系统的会聚透镜阵列组进入人的左眼,并通过所述第二近眼显示系统的会聚透镜阵列组进入人的右眼。
在第一方面中已经详细介绍了近眼显示系统的具体结构以及运行过程,在此就不再赘述了。
具体地,所述增强现实设备还可以包括外壳,所述第一近眼显示系统和所述第二近眼显示系统均设置在所述外壳中。
具体地,所述会聚透镜阵列组包括第一会聚透镜阵列和第二会聚透镜阵列,所述第一会聚透镜阵列设置于扫描光纤阵列30的靠近人眼一侧,所述第二会聚透镜阵列设置于扫描光纤阵列30的远离人眼一侧。
具体地,所述第一会聚透镜阵列和所述第二会聚透镜阵列可以均为准直透镜阵列。如图6所示,在扫描光纤阵列30的靠近人眼一侧设置第一准直透镜阵列50,在扫描光纤阵列30的远离人眼一侧设置第二准直透镜阵列51,且第一准直透镜阵列50和第二准直透镜阵列51组成1:1的望远系统。由于扫描光纤阵列30封装在透明基板305中,使得外界环境光通过1:1的望远系统进入人眼,且由于外界环境光是通过1:1的望远系统进入人眼的,不会对外界进行放大或缩小,使得用户能够更真实的感受外界环境。1:1的望远系统的光路原理具体如图7所示。
在本申请另一实施例中,所述第一会聚透镜阵列和所述第二会聚透镜阵列还可以均为电控液体微透镜阵列。所述电控液体微透镜阵列例如可以为电控液晶微透镜阵列。如图10所示,在扫描光纤阵列30的靠近人眼一侧设置第一电控液晶微透镜阵列52,在扫描光纤阵列30的远离人眼一侧设置第二电控液晶微透镜阵列53,且第一电控液晶微透镜阵列52和第二电控液晶微透镜阵列53组成1:1的无焦系统。由于扫描光纤阵列30封装在透明基板305中,使得外界环境光通过1:1的无焦系统进入人眼,且由于外界环境光是通过1:1的无焦系统进入人眼的,不会对外界进行放大或缩小,使得用户能够更真实的感受外界环境。
在本申请另一实施例中,所述会聚透镜阵列组还可以为第一电控液晶微透镜阵列52和调光结构54。在使用包含第一电控液晶微透镜阵列52的近眼显示系统用于进行增强现实显示时,需要在扫描光纤阵列30的远离人眼一侧设置调光结构54。调光结构54具体可以是带有光开关的PDLC膜层。采用分时段方式显示虚拟图像和现实外界环境。假设人眼的刷新率为30Hz,将该刷新率对应的时间段分成2段。一段时间用于显示虚拟图像,此段时间内使PDLC膜层的光开关断开,使得PDLC膜层呈不透明状态。另一段时间用于观察现实外界环境,此段时间内使PDLC膜层的光开关开通,从而对PDLC膜层施加电压,使其呈透明状态,使得外界环境光能够通过PDLC膜层和透明基板305。同时,不施加电压给第一电控液晶微透镜阵列52。由于第一电控液晶微透镜阵列52不加电压不工作,第一电控液晶微透镜阵列52无光会聚或发散的功能,即不呈现光偏折的作用,不对外界环境光有光转折。如此,使得外界环境光能够通过PDLC膜层和透明基板305之后通过第一电控液晶微透镜阵列52进入人眼,实现观察到现实外界环境。
当然,还可以将所述刷新率对应的时间段分成至少3段,其中的一段或多段时间用于显示虚拟图像,剩下的至少一段时间用于观察现实外界环境。
本发明的有益效果如下:
基于上述技术方案,本发明实施例中控制器分别电性连接分光组件和瞳孔位置检测器,使控制器根据所述瞳孔位置检测器获取的瞳孔位置,获取所述用户瞳孔与扫描光纤阵列的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制所述分光组件中的每个输出通道的开通和断开。由此可知,能够根据所述用户瞳孔与扫描光纤阵列的相对位置自动开通对应的输出通道来输出光束,使得选取的输出通道子集均匀分布在所述瞳孔位置的四周,从而能够确保所述输出光束全部进入人眼,使得人眼在任意观察位置均可以接收到所有输出光束。由此实现了对人眼观察的位置没有限制,能够给用户以良好的增强现实体验的效果,并且无需用户对增强现实设备进行瞳距调节,也因此避免了用户因调节结果不精确导致无法获得良好的增强现实体验的缺陷。
显然,本领域的技术人员可以对本发明进行各种改动和变型而不脱离本发明的精神和范围。这样,倘若本发明的这些修改和变型属于本发明权利要求及其等同技术的范围之内,则本发明也意图包含这些改动和变型在内。

Claims (16)

  1. 一种近眼显示系统,其特征在于,包括激光光源、分光组件、扫描光纤阵列、瞳孔位置检测器和控制器,所述分光组件包括M*N个输出通道,M和N均为不小于2的整数;
    所述激光光源用于输出根据影像信息调制的激光;所述分光组件用于将所述激光光源输出的激光分成M*N个光束;所述瞳孔位置检测器用于获取用户瞳孔的瞳孔位置;所述控制器分别电性连接所述分光组件和所述瞳孔位置检测器,用于根据所获取的瞳孔位置,获取所述用户瞳孔与所述扫描光纤阵列的相对位置,再根据所述相对位置和所述影像信息的显示视场灰度,控制所述分光组件中的每个输出通道的开通和断开;所述扫描光纤阵列中的扫描光纤与所述M*N个输出通道耦合,用于传输从所述分光组件输出的输出光束,并将所述输出光束投射至人眼。
  2. 如权利要求1所述的系统,其特征在于,所述激光光源包括三色激光光源、准直镜组、合束器、耦合器和耦合光纤,其中,所述三色激光光源输出三色激光;所述准直镜组设置于所述三色激光光源的出射光路上,用于对所述三色激光进行准直处理;所述合束器设置于所述准直镜组的出射光路上,用于将所述准直镜组出射的激光进行合束处理;所述耦合器设置于所述合束器的出射光路上,用于将所述合束器出射的激光耦合到所述耦合光纤中;所述耦合光纤与所述耦合器相连,用于传输经所述耦合器耦合的激光。
  3. 如权利要求1所述的系统,其特征在于,所述扫描光纤阵列包括水平方向扫描光纤束和垂直方向扫描光纤束,所述水平方向扫描光纤束用于对水平出射光束进行扩束,并且所述垂直方向扫描光纤束用于对垂直出射光束进行扩束。
  4. 如权利要求3所述的系统,其特征在于,所述水平方向扫描光纤束为紧密排布或间隔排布的扫描光纤束,所述垂直方向扫描光纤束为紧密排布或间隔排布的扫描光纤束,其中,所述紧密排布的含义为每相邻两束光纤之间的间隔不大于预设距离,所述间隔排布的含义为每相邻两束光纤之间的间隔大于所述预设距离。
  5. 如权利要求1所述的系统,其特征在于,每束扫描光纤均包括扫描器,所述扫描器设置在所述扫描光纤上,用于将所述扫描光纤进行偏转,使得所述扫描光纤出射的光束随所述扫描光纤偏转。
  6. 如权利要求1所述的系统,其特征在于,所述分光组件包括一个1*M型的第一光分路器、M个1*N型的第二光分路器和M*N个通道开关,所述第一光分路器的入射端与 所述激光光源的出射端相连,所述M个第二光分路器与所述第一光分路器的M个出射端一一相连,所述M*N个通道开关和所述M*N个输出通道一一对应,用于控制所述M*N个输出通道的开通和断开。
  7. 如权利要求1所述的系统,其特征在于,所述分光组件包括一个M*N型的光分路器,所述M*N型的光分路器集成了M*N个通道开关,所述M*N个通道开关和所述M*N个输出通道一一对应,用于控制所述M*N个输出通道的开通和断开。
  8. 如权利要求1所述的系统,其特征在于,所述控制器电性连接所述扫描光纤阵列,用于根据预设条件,将所述扫描光纤阵列中的扫描光纤划分成S个非干涉区域,并且在同一时刻控制所述S个非干涉区域以显示S个视场光,其中S为不小于2的整数。
  9. 如权利要求8所述的系统,其特征在于,所述控制器根据出瞳直径的大小,将所述扫描光纤阵列中的扫描光纤划分成所述S个非干涉区域。
  10. 如权利要求1-9中任一项所述的系统,其特征在于,所述近眼显示系统还包括会聚透镜阵列组,所述会聚透镜阵列组包括第一会聚透镜阵列和第二会聚透镜阵列,所述第一会聚透镜阵列设置于所述扫描光纤阵列的靠近人眼一侧,所述第二会聚透镜阵列设置于所述扫描光纤阵列的远离人眼一侧。
  11. 如权利要求10所述的系统,其特征在于,所述第一会聚透镜阵列和所述第二会聚透镜阵列均为准直透镜阵列,且所述第一会聚透镜阵列和所述第二会聚透镜阵列组成1:1的望远系统。
  12. 如权利要求10所述的系统,其特征在于,所述第一会聚透镜阵列和所述第二会聚透镜阵列均为电控液体微透镜阵列,且所述第一会聚透镜阵列和所述第二会聚透镜阵列组成1:1的无焦系统。
  13. 如权利要求1-9中任一项所述的系统,其特征在于,所述近眼显示系统还包括会聚透镜阵列,所述会聚透镜阵列设置于所述扫描光纤阵列的靠近人眼一侧。
  14. 如权利要求1所述的系统,其特征在于,所述近眼显示系统还包括调光结构,所述调光结构设置于所述扫描光纤阵列的远离人眼一侧。
  15. 一种虚拟现实设备,其特征在于,包括两套如权利要求1-14中任一项所述的近眼显示系统,其中第一近眼显示系统与人的左眼对应,第二近眼显示系统与人的右眼对应。
  16. 一种增强现实设备,其特征在于,包括两套如权利要求10-12中任一项所述的 近眼显示系统,其中第一近眼显示系统与人的左眼对应,第二近眼显示系统与人的右眼对应,并且,外界环境光通过所述第一近眼显示系统的会聚透镜阵列组进入人的左眼,并通过所述第二近眼显示系统的会聚透镜阵列组进入人的右眼。
PCT/CN2017/090839 2016-07-01 2017-06-29 近眼显示系统、虚拟现实设备及增强现实设备 WO2018001323A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610519542.9A CN107561701B (zh) 2016-07-01 2016-07-01 近眼显示系统、虚拟现实设备及增强现实设备
CN201610519542.9 2016-07-01

Publications (1)

Publication Number Publication Date
WO2018001323A1 true WO2018001323A1 (zh) 2018-01-04

Family

ID=60785861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/090839 WO2018001323A1 (zh) 2016-07-01 2017-06-29 近眼显示系统、虚拟现实设备及增强现实设备

Country Status (2)

Country Link
CN (1) CN107561701B (zh)
WO (1) WO2018001323A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109212871A (zh) * 2018-11-13 2019-01-15 深圳创维新世界科技有限公司 投影显示装置
CN109307935A (zh) * 2018-11-13 2019-02-05 深圳创维新世界科技有限公司 空间投影显示设备
CN111487603A (zh) * 2020-04-20 2020-08-04 深圳奥锐达科技有限公司 一种激光发射单元及其制造方法

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109254410B (zh) * 2018-11-13 2023-10-10 深圳创维新世界科技有限公司 空间成像装置
CN109348210A (zh) * 2018-11-16 2019-02-15 成都理想境界科技有限公司 图像源模组、近眼显示系统、控制方法及近眼显示设备
CN109613790A (zh) * 2018-11-19 2019-04-12 成都理想境界科技有限公司 一种激光投影光学模组及近眼显示设备
KR20210095635A (ko) * 2018-11-29 2021-08-02 소니그룹주식회사 영상 투영 장치
CN110161596B (zh) * 2019-05-20 2021-07-23 河北工业大学 一种制作变焦液体微透镜的装置及方法
TWI828150B (zh) * 2022-05-20 2024-01-01 宇力電通數位整合有限公司 立體視覺眼鏡

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1721910A (zh) * 2004-07-14 2006-01-18 吴镝 一种ld(激光二极管)线阵列激光投影系统
CN2929774Y (zh) * 2006-06-05 2007-08-01 中国科学院物理研究所 一种投影显示装置
US20100296077A1 (en) * 2007-11-01 2010-11-25 Nasa Headquarters Three-dimensional range imaging apparatus and method
CN103487939A (zh) * 2013-08-28 2014-01-01 成都理想境界科技有限公司 可调头戴显示光学系统及其调节方法
CN104570347A (zh) * 2013-10-25 2015-04-29 广达电脑股份有限公司 头戴式显示装置及其成像方法
CN105717640A (zh) * 2014-12-05 2016-06-29 北京蚁视科技有限公司 基于微透镜阵列的近眼显示器

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8666212B1 (en) * 2011-04-28 2014-03-04 Google Inc. Head mounted display using a fused fiber bundle
CN103439794B (zh) * 2013-09-11 2017-01-25 百度在线网络技术(北京)有限公司 头戴式设备的校准方法和头戴式设备
CN104777615B (zh) * 2015-04-17 2017-05-10 浙江大学 基于人眼跟踪的自适应高分辨近眼光场显示装置和方法
CN205246972U (zh) * 2015-12-18 2016-05-18 北京蚁视科技有限公司 具有眼球同步显示功能的头戴式近眼显示装置

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109212871A (zh) * 2018-11-13 2019-01-15 深圳创维新世界科技有限公司 投影显示装置
CN109307935A (zh) * 2018-11-13 2019-02-05 深圳创维新世界科技有限公司 空间投影显示设备
CN109212871B (zh) * 2018-11-13 2023-11-28 深圳创维新世界科技有限公司 投影显示装置
CN109307935B (zh) * 2018-11-13 2023-12-01 深圳创维新世界科技有限公司 空间投影显示设备
CN111487603A (zh) * 2020-04-20 2020-08-04 深圳奥锐达科技有限公司 一种激光发射单元及其制造方法

Also Published As

Publication number Publication date
CN107561701A (zh) 2018-01-09
CN107561701B (zh) 2020-01-31

Similar Documents

Publication Publication Date Title
WO2018001323A1 (zh) 近眼显示系统、虚拟现实设备及增强现实设备
US20200236331A1 (en) Image projection system
US9076368B2 (en) Image generation systems and image generation methods
US10417975B2 (en) Wide field of view scanning display
JP3435160B2 (ja) 仮想網膜表示装置
US8982014B2 (en) Image generation systems and image generation methods
JP2650878B2 (ja) ヘルメットに装着可能なカラーの表示装置
WO2018001324A1 (zh) 近眼显示系统、虚拟现实设备及增强现实设备
US20170255012A1 (en) Head mounted display using spatial light modulator to move the viewing zone
US20060033992A1 (en) Advanced integrated scanning focal immersive visual display
US6517206B2 (en) Display device
WO2018001322A1 (zh) 一种近眼显示系统、虚拟现实设备和增强现实设备
WO2018001325A1 (zh) 近眼显示系统、虚拟现实设备及增强现实设备
KR20150048789A (ko) 시각 보조 프로젝터
CN106164743A (zh) 眼睛投影系统
CN108628087B (zh) 一种投影装置及空间成像方法
US20190049898A1 (en) Holographic display system and holographic display method
WO2018001321A1 (zh) 一种近眼显示系统、虚拟现实设备和增强现实设备
US20220121027A1 (en) Display system having 1-dimensional pixel array with scanning mirror
WO2019164241A1 (ko) 접안 디스플레이 장치
US12007661B2 (en) Display apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17819320

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17819320

Country of ref document: EP

Kind code of ref document: A1