WO2021197082A1 - Augmented reality device and display method thereof - Google Patents

Augmented reality device and display method thereof

Info

Publication number
WO2021197082A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
coupler
active shutter
reality device
light
Prior art date
Application number
PCT/CN2021/081545
Other languages
English (en)
French (fr)
Inventor
朱帅帅 (Zhu Shuaishuai)
王久兴 (Wang Jiuxing)
罗诚 (Luo Cheng)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP21781311.2A (published as EP4109166A4)
Priority to US17/915,401 (published as US11914155B2)
Priority to IL296491A
Publication of WO2021197082A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • G02B27/286Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising for controlling or changing the state of polarisation, e.g. transforming one polarisation state into another
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00Optical parts
    • G02C7/02Lenses; Lens systems ; Methods of designing lenses
    • G02C7/08Auxiliary lenses; Arrangements for varying focal length
    • G02C7/081Ophthalmic lenses with variable focal length
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/133528Polarisers
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/133528Polarisers
    • G02F1/133531Polarisers characterised by the arrangement of polariser or analyser axes
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/13363Birefringent elements, e.g. for optical compensation
    • G02F1/133638Waveplates, i.e. plates with a retardation value of lambda/n
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/137Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
    • G02F1/139Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent
    • G02F1/1396Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent the liquid crystal being selectively controlled between a twisted state and a non-twisted state, e.g. TN-LC cell
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • G02B2027/012Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility comprising devices for attenuating parasitic image effects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/0001Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
    • G02B6/0011Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
    • G02B6/0013Means for improving the coupling-in of light from the light source into the light guide
    • G02B6/0023Means for improving the coupling-in of light from the light source into the light guide provided by one optical element, or plurality thereof, placed between the light guide and the light source, or around the light source
    • G02B6/0026Wavelength selective element, sheet or layer, e.g. filter or grating
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/0001Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
    • G02B6/0011Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
    • G02B6/0033Means for improving the coupling-out of light from the light guide
    • G02B6/005Means for improving the coupling-out of light from the light guide provided by one optical element, or plurality thereof, placed on the light output side of the light guide
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F2413/00Indexing scheme related to G02F1/13363, i.e. to birefringent elements, e.g. for optical compensation, characterised by the number, position, orientation or value of the compensation plates
    • G02F2413/01Number of plates being 1
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F2413/00Indexing scheme related to G02F1/13363, i.e. to birefringent elements, e.g. for optical compensation, characterised by the number, position, orientation or value of the compensation plates
    • G02F2413/05Single plate on one side of the LC cell
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F2413/00Indexing scheme related to G02F1/13363, i.e. to birefringent elements, e.g. for optical compensation, characterised by the number, position, orientation or value of the compensation plates
    • G02F2413/08Indexing scheme related to G02F1/13363, i.e. to birefringent elements, e.g. for optical compensation, characterised by the number, position, orientation or value of the compensation plates with a particular optical axis orientation

Definitions

  • This application relates to the field of displays that combine virtual and real content, and in particular to an augmented reality device and a display method thereof.
  • The principle of augmented reality (AR) technology is to use a computer-controlled image projector to project display light carrying digital content into the human eye to form a virtual scene. The virtual scene is superimposed on the real scene that the human eye sees directly, so that the human eye sees the image information of the virtual scene combined with the real external scene.
  • However, part of the display light projected by the image projector always exits the augmented reality device, so that display light carrying digital information leaks out, disclosing the user's private information and reducing the user's privacy.
  • In view of this, the present application provides an augmented reality device and a display method thereof, which reduce the possibility of display light exiting the augmented reality device, prevent display light carrying digital information from leaking out, and improve the user's privacy.
  • the augmented reality device shown in this application includes a frame, a coupler, an active shutter lens, an image projector, and a processor.
  • the coupler is mounted on the frame, and the coupler includes an inner surface and an outer surface arranged opposite to each other.
  • The active shutter lens is mounted on the outer surface of the coupler, the image projector is installed on the frame, and the processor is coupled to the image projector and the active shutter lens.
  • The processor is used to turn on the image projector and close the active shutter lens, and the image projector projects display light to the coupler.
  • the display light is light carrying digital content. Part of the display light exits from the inner surface of the coupler, and part of the display light exits from the outer surface of the coupler.
  • The active shutter lens blocks the display light exiting from the outer surface of the coupler, preventing it from passing through the active shutter lens into the external environment. This prevents the display light carrying digital content from leaking out, which not only improves the user's privacy and the social acceptability of the augmented reality device, but also prevents the leaked display light from forming a small bright display window on the surface of the device, improving the wearer's appearance when using the augmented reality device.
  • The processor is also used to turn off the image projector and turn on the active shutter lens. Ambient light then passes through the active shutter lens, enters the coupler from its outer surface, and exits from its inner surface, so that the user can view the real external scene through the coupler and the active shutter lens, ensuring that the augmented reality device retains a certain transmittance.
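The alternation described above (project while the shutter is closed, then stop projecting and open the shutter) can be sketched as a simple control loop. This is an illustrative sketch only, not the patent's implementation: `ARController`, `projector`, and `shutter` are hypothetical names, and real hardware would synchronize the two phases in firmware rather than with `time.sleep()`.

```python
import time

class ARController:
    """Sketch of the processor's alternation between display and see-through phases."""

    def __init__(self, projector, shutter, frame_period_s=1 / 120):
        self.projector = projector  # hypothetical image-projector driver
        self.shutter = shutter      # hypothetical active-shutter-lens driver
        self.frame_period_s = frame_period_s

    def step(self):
        # Display phase: project virtual content with the shutter closed, so the
        # display light exiting the coupler's outer surface is absorbed, not leaked.
        self.shutter.close()
        self.projector.on()
        time.sleep(self.frame_period_s)
        # See-through phase: stop projecting and open the shutter, so ambient
        # light reaches the eye and the real scene remains visible.
        self.projector.off()
        self.shutter.open()
        time.sleep(self.frame_period_s)
```

If the two phases alternate fast enough, the switching is imperceptible and the device appears both transmissive and leak-free at the same time.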
  • the inner surface of the coupler refers to the surface of the coupler facing the user when the augmented reality device is worn on the user's head. That is, the inner surface of the coupler is the surface of the coupler facing the human eye.
  • The outer surface of the coupler refers to the surface of the coupler facing away from the user when the augmented reality device is worn on the user's head; that is, it is the surface of the coupler facing away from the human eye, toward the outside world.
  • the active shutter lens is a lens that can be opened and closed quickly under the control of the processor.
  • When the processor turns on the active shutter lens, that is, when the active shutter lens is in the open state, its transmittance is relatively high and light can propagate through it normally.
  • When the processor turns off the active shutter lens, that is, when the active shutter lens is in the closed state, its transmittance is close to 0 and the lens blocks (absorbs) incident light, so light can hardly propagate through it.
  • In a possible implementation, the outer surface of the coupler includes a light exit area; the display light emitted from the outer surface of the coupler exits from this light exit area, and the active shutter lens covers the light exit area.
  • In another implementation, the active shutter lens covers the entire outer surface of the coupler, ensuring the completeness and consistency of the appearance of the augmented reality device and improving its aesthetics.
  • Covering the entire outer surface of the coupler with the active shutter lens also reduces the difficulty of the lens assembly process and avoids additional machining of the active shutter lens, reducing its processing difficulty and production cost.
  • the active shutter lens is a liquid crystal light valve.
  • the active shutter lens includes a liquid crystal cell, a first polarizer and a second polarizer.
  • The second polarizer is located between the liquid crystal cell and the coupler; that is, the second polarizer is on the side of the liquid crystal cell facing away from the first polarizer, toward the coupler.
  • The angle between the transmission axis directions of the second polarizer and the first polarizer is 90 degrees.
  • When the processor turns on the active shutter lens, ambient light is filtered by the first polarizer, passes through the liquid crystal cell and the second polarizer to the outer surface of the coupler, and then enters the human eye from the inner surface of the coupler, so that the human eye can see the real external environment through the active shutter lens and the coupler.
  • the liquid crystal light valve is an optical device that uses voltage to control the refractive index of liquid crystal molecules to achieve phase retardation of light.
  • the active shutter lens is an in-plane switching (IPS) type liquid crystal light valve.
  • When the processor turns on the active shutter lens, the liquid crystal light valve is in the powered-on state.
  • the ambient light enters the liquid crystal cell after being filtered by the first polarizer.
  • The liquid crystal cell delays the phase of the light transmitted by the first polarizer by π. Because the transmission axes of the second polarizer and the first polarizer are perpendicular to each other, the light exiting the liquid crystal cell can pass through the second polarizer toward the outer surface of the coupler.
  • When the processor turns off the active shutter lens, the liquid crystal light valve is in the powered-off state, and ambient light enters the liquid crystal cell after being filtered by the first polarizer.
  • The liquid crystal cell does not change the phase of the light transmitted through the first polarizer. Because the transmission axes of the second polarizer and the first polarizer are perpendicular to each other, the light exiting the liquid crystal cell cannot pass through the second polarizer toward the outer surface of the coupler and is completely blocked by the second polarizer.
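The open/closed behavior of this crossed-polarizer shutter can be checked with a short Jones-calculus sketch. This is an illustration added here, not part of the patent; it models the powered cell as a retarder with phase delay π whose axis is assumed to lie at 45° to the first polarizer's transmission axis, which is what lets the π delay rotate the polarization by 90°.

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer, transmission axis at theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def retarder(delta, theta):
    """Jones matrix of a phase retarder: delay delta, fast axis at theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    jones = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
    return rot @ jones @ rot.T

# Ambient light just after the first polarizer (transmission axis along x).
filtered = np.array([1.0, 0.0], dtype=complex)
second_polarizer = polarizer(np.pi / 2)  # crossed: 90 degrees to the first

# Shutter open: the powered cell delays the phase by pi (modelled as a
# half-wave retarder at 45 degrees), rotating the polarization by 90 degrees.
open_intensity = np.linalg.norm(second_polarizer @ retarder(np.pi, np.pi / 4) @ filtered) ** 2
# Shutter closed: the unpowered cell leaves the polarization unchanged.
closed_intensity = np.linalg.norm(second_polarizer @ filtered) ** 2

print(f"open: {open_intensity:.2f}, closed: {closed_intensity:.2f}")  # open: 1.00, closed: 0.00
```

The same two matrices model the shutter's effect on the leaked display light: with the retardation absent, the crossed polarizers extinguish whatever reaches them.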
  • the active shutter lens is a twisted nematic (TN) liquid crystal light valve.
  • When the processor turns on the active shutter lens, the liquid crystal light valve is in the powered-off state, and ambient light enters the liquid crystal cell after being filtered by the first polarizer.
  • The liquid crystal cell delays the phase of the light transmitted through the first polarizer by π. Because the transmission axis of the second polarizer is perpendicular to that of the first polarizer, the light exiting the liquid crystal cell can pass through the second polarizer toward the outer surface of the coupler.
  • When the processor turns off the active shutter lens, the liquid crystal light valve is in the powered-on state, and the liquid crystal molecules in the cell rotate to a state perpendicular to the first polarizer. Ambient light filtered by the first polarizer enters the liquid crystal cell, which no longer changes the phase of the light. Since the transmission axis of the second polarizer is perpendicular to that of the first polarizer, the light exiting the liquid crystal cell cannot pass through the second polarizer toward the outer surface of the coupler and is completely blocked by the second polarizer.
  • In other implementations, the liquid crystal light valve is a vertical alignment (VA) liquid crystal light valve, a super twisted nematic (STN) liquid crystal light valve, or a ferroelectric liquid crystal (FLC) light valve.
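Note that the IPS and TN embodiments above drive the same optical states with opposite electrical states. A compact illustrative summary (names are this sketch's own, not the patent's):

```python
# Optical state of the shutter for each light-valve type and drive state,
# as described in the embodiments above ("open" = transmits ambient light).
SHUTTER_STATE = {
    ("IPS", "powered"):   "open",    # powered IPS cell delays the phase by pi
    ("IPS", "unpowered"): "closed",  # no phase change -> blocked by crossed polarizers
    ("TN",  "unpowered"): "open",    # relaxed twisted cell delays the phase by pi
    ("TN",  "powered"):   "closed",  # field-aligned molecules leave the phase unchanged
}

def shutter_is_open(valve_type: str, powered: bool) -> bool:
    """Return whether the shutter transmits for the given type and drive state."""
    return SHUTTER_STATE[(valve_type, "powered" if powered else "unpowered")] == "open"
```

A practical consequence of the inverted TN logic is that a TN shutter is transparent when unpowered, so the device stays see-through if the drive electronics fail.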
  • In a possible implementation, the augmented reality device further includes a quarter-wave plate installed on the surface of the first polarizer facing away from the liquid crystal cell; that is, the quarter-wave plate is located on the outer side of the first polarizer, facing the external environment.
  • the angle between the fast axis direction of the quarter wave plate and the light transmission axis direction of the first polarizer is 45 degrees.
  • The light emitted by a liquid crystal display (LCD) is linearly polarized. A quarter-wave plate can attenuate linearly polarized light of any polarization direction to 50%.
  • When the processor turns on the active shutter lens, the quarter-wave plate therefore reduces the brightness difference the user perceives when viewing an electronic screen, helping to improve the experience of using the augmented reality device while viewing electronic screens.
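The 50% figure can be verified with Jones matrices: a quarter-wave plate whose fast axis sits 45° from the polarizer's transmission axis leaves exactly half the power in the component along that axis, whatever the input's linear polarization angle. This is an illustrative calculation added here, not text from the patent.

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer, transmission axis at theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def retarder(delta, theta):
    """Jones matrix of a phase retarder: delay delta, fast axis at theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    return rot @ np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)]) @ rot.T

# Quarter-wave plate (delay pi/2) with its fast axis 45 degrees from the first
# polarizer's transmission axis (taken along x), followed by that polarizer.
stack = polarizer(0.0) @ retarder(np.pi / 2, np.pi / 4)

# Linearly polarized screen light at various orientations always transmits 50%.
transmissions = [
    np.linalg.norm(stack @ np.array([np.cos(phi), np.sin(phi)])) ** 2
    for phi in np.linspace(0.0, np.pi, 13)
]
print([round(t, 3) for t in transmissions])  # every value is 0.5
```

Without the quarter-wave plate, the transmitted fraction would follow Malus's law (cos² of the angle), so a rotated LCD screen could appear anywhere from fully bright to fully dark through the shutter.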
  • the augmented reality device includes two augmented reality components, the two augmented reality components are installed on the frame at intervals, and each augmented reality component includes the above-mentioned coupler, an image projector, and an active shutter lens.
  • The couplers of the two augmented reality components are arranged side by side.
  • One augmented reality component corresponds to the user's left eye, and the other corresponds to the user's right eye.
  • The two augmented reality components have the same structure; both prevent the display light carrying digital content from leaking out while ensuring the transmittance of the augmented reality device.
  • The active shutter lens of each augmented reality component is a liquid crystal light valve.
  • The active shutter lens of each augmented reality component includes a liquid crystal cell, a first polarizer, and a second polarizer.
  • The liquid crystal cell of each augmented reality component is coupled to the processor; the first polarizer of each component is located on the side of that component's liquid crystal cell facing away from the coupler, and the second polarizer is located between the liquid crystal cell and the coupler.
  • That is, the second polarizer of each augmented reality component is located on the side of that component's liquid crystal cell facing away from the first polarizer, toward the coupler.
  • the angle between the transmission axis direction of the first polarizer and the second polarizer of each augmented reality component is 90 degrees.
  • Ambient light is filtered by the first polarizer, passes through the liquid crystal cell and the second polarizer to the outer surface of the coupler, and then enters the human eye from the inner surface of the coupler, so that both the user's left and right eyes can observe the real external environment.
  • In one implementation, the augmented reality device includes two quarter-wave plates. One quarter-wave plate is mounted on the outer surface of one first polarizer, with a 45-degree angle between its fast axis and that polarizer's transmission axis; the other quarter-wave plate is mounted on the outer surface of the other first polarizer, also with a 45-degree angle between its fast axis and that polarizer's transmission axis. This reduces the brightness difference between the left and right eyes when the user wears the augmented reality device to view an electronic screen, helping to improve the viewing experience.
  • either the transmission axis directions of the two first polarizers are the same and the fast axis directions of the two quarter-wave plates are at 90 degrees to each other, or the transmission axis directions of the two first polarizers are at 90 degrees to each other and the fast axis directions of the two quarter-wave plates are the same. In this way, when the user wears the augmented reality device to watch an electronic screen, the two augmented reality components respectively convert left-handed and right-handed circularly polarized light into two beams of linearly polarized light whose polarization directions are perpendicular to each other, which enter the user's left eye and right eye for imaging.
  • when the processor turns on the active shutter lenses, the user can watch three-dimensional (3D) images. That is, the augmented reality device shown in this embodiment can also be used in a 3D movie theater and is compatible with both polarization and active-shutter projection methods.
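The passive-3D compatibility described above (a quarter-wave plate at 45 degrees to each first polarizer) can be checked numerically with Jones matrices. This is a toy model with an assumed handedness convention, not the patent's specification; it shows that one circular handedness passes the plate-plus-polarizer stack while the other is extinguished.

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer with axis at angle theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def quarter_wave_plate(phi):
    """Jones matrix of a quarter-wave plate with fast axis at angle phi (global phase dropped)."""
    c, s = np.cos(phi), np.sin(phi)
    rot = np.array([[c, -s], [s, c]], dtype=complex)
    return rot @ np.diag([1.0, 1.0j]) @ rot.T

def intensity(v):
    return float(np.vdot(v, v).real)

# Circular polarization states (one common sign convention).
lcp = np.array([1.0, 1.0j]) / np.sqrt(2)
rcp = np.array([1.0, -1.0j]) / np.sqrt(2)

# One eye's stack: QWP with fast axis at 45 deg, then a polarizer with axis at 0 deg.
eye = polarizer(0.0) @ quarter_wave_plate(np.pi / 4)

print(round(intensity(eye @ lcp), 6))  # one handedness passes (~1.0)
print(round(intensity(eye @ rcp), 6))  # the other is blocked (~0.0)
```

Swapping either the plate's fast axis or the polarizer axis by 90 degrees selects the opposite handedness, which is why the two configurations listed above both separate a cinema's left- and right-eye images.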
  • the augmented reality device further includes a zoom, and the zoom covers the inner surface of the coupler. That is, the zoom is located on the side of the coupler close to the user's eyes and is used to correct the user's eyesight.
  • the zoom can correct the user's refractive error when viewing a virtual scene or the real external scene, improving the clarity of both and the overall experience of using the augmented reality device.
  • the processor is coupled to the zoom, and the processor is used to adjust the optical power of the zoom.
  • the processor can adjust the optical power of the zoom to match the user's vision according to the user's diopter, which improves the adaptability of the augmented reality device and thereby increases the flexibility of its use.
  • the augmented reality device further includes an eye tracking component installed on the frame to track the line of sight of the eyeball; the processor is coupled to the zoom and the eye tracking component;
  • the processor is used to turn off the image projector and adjust the optical power of the zoom to the first optical power, so as to correct the user's refractive error when viewing the real external scene and improve the clarity with which the user observes it;
  • the processor is used to turn on the image projector, the eye tracking component is used to obtain the convergence depth of the virtual scene viewed by the eye, and the processor adjusts the optical power of the zoom to the second optical power according to the acquisition result of the eye tracking component.
  • the eye tracking component is used to track the line of sight of the eyeball, and obtain the convergence depth of the virtual scene that the user is looking at according to the line of sight of the eyeball.
  • the processor changes the virtual image distance of the virtual scene according to the convergence depth, adjusting the position of the virtual scene to that convergence depth. This not only corrects the user's refractive error when observing the virtual scene and improves its clarity, but also resolves the vergence-accommodation conflict, reducing the user's discomfort when using the augmented reality device and improving comfort.
  • the first refractive power is the refractive power of the user's eyeball
  • the second refractive power is the sum of the first refractive power and the reciprocal of the virtual image depth observed by the user.
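The relation between the two optical powers stated above is simple enough to encode directly. The helper below is only an illustration of that sum; the example diopter values are invented, not taken from the patent.

```python
def second_optical_power(first_power_diopters, virtual_image_depth_m):
    """Second optical power of the zoom, per the relation stated above:
    the first optical power (the user's corrective power) plus the
    reciprocal of the virtual image depth observed by the user."""
    return first_power_diopters + 1.0 / virtual_image_depth_m

# Illustrative values: a -2 D correction, virtual scene converged at 2 m.
print(second_optical_power(-2.0, 2.0))  # -1.5
```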
  • the eye tracking component includes one or more infrared light-emitting diodes and one or more infrared cameras.
  • the infrared light emitted by the infrared light-emitting diodes enters the user’s eyes and is reflected by the cornea of the human eyes into the infrared camera for imaging.
  • the processor obtains the optical axis direction of the user's eyeball from the position of the infrared light spot in the image, calibrates that optical axis direction to obtain the user's line-of-sight direction, determines the depth of the virtual scene viewed by the user according to the line-of-sight direction, and then adjusts the optical power of the zoom to the second optical power.
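One way to picture how a convergence depth can follow from two line-of-sight directions is simple triangulation. The sketch below is my own simplification, not the patent's calibration procedure: it assumes both gaze angles are measured from straight ahead (positive toward the nose) and that the two lines of sight meet on the median plane; the interpupillary distance and angles are illustrative.

```python
import math

def convergence_depth(ipd_m, left_gaze_deg, right_gaze_deg):
    """Estimate the fixation depth (m) from the two eyes' inward gaze angles.
    A toy triangulation: each eye's line of sight crosses the median plane at
    depth (ipd/2) / tan(angle); the two estimates are averaged."""
    half_ipd = ipd_m / 2.0
    dl = half_ipd / math.tan(math.radians(left_gaze_deg))
    dr = half_ipd / math.tan(math.radians(right_gaze_deg))
    return (dl + dr) / 2.0

# 64 mm interpupillary distance, both eyes rotated ~1.83 deg inward:
# the lines of sight meet at roughly 1 m.
print(round(convergence_depth(0.064, 1.833, 1.833), 2))
```

A real implementation would work from the calibrated visual axes obtained via the infrared glints, as the text describes, rather than from idealized symmetric angles.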
  • a display method for any of the above-described augmented reality devices includes the following:
  • the image projector is turned on and the active shutter lens is closed.
  • the image projector projects display light to the coupler. Part of the display light exits from the inner surface of the coupler. Part of the display light exits from the outer surface of the coupler.
  • the active shutter lens blocks the display light emitted from the outer surface of the coupler, preventing it from entering the external environment through the active shutter lens and avoiding the leakage of display light carrying digital content. This not only improves the user's privacy and the sociality of the augmented reality device, but also prevents the leaked display light from forming a small bright display window on the surface of the augmented reality device, improving the user's appearance when using it.
  • the image projector is turned off and the active shutter lens is turned on.
  • after the ambient light passes through the active shutter lens, it enters the coupler from the coupler's outer surface and exits from the coupler's inner surface, so that the user can watch the real external scene through the coupler and the active shutter lens, ensuring that the augmented reality device has a certain transmittance.
  • the first time period and the second time period are alternately performed to prevent the display light carrying digital content from leaking out while ensuring the transmittance of the augmented reality device.
  • the first time period and the second time period form a cycle, and a cycle is less than or equal to 1/60 second.
  • the flicker frequency perceivable by the human eye is 60 Hz. Since one cycle is less than or equal to 1/60 second, one second includes at least 60 cycles. According to the persistence of vision phenomenon (also known as the visual pause phenomenon or the afterglow effect), the human eye cannot perceive the switching between the virtual scene and the real external scene, which is equivalent to the human eye seeing both the virtual scene and the real external scene at the same time. That is, the display light leaked from the coupler can be blocked while the transmittance of the augmented reality device is ensured.
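The timing constraint above (first time period plus second time period no longer than 1/60 s) can be expressed as a small check. The function name and example periods below are illustrative, not from the patent; exact rational arithmetic avoids floating-point surprises when comparing against 1/60.

```python
from fractions import Fraction

def is_flicker_free(first_period_s, second_period_s, eye_limit_hz=60):
    """Check the constraint stated above: one cycle (first time period plus
    second time period) must not exceed 1/60 s, so at least 60 cycles fit
    into one second and the alternation is imperceptible."""
    cycle = Fraction(first_period_s) + Fraction(second_period_s)
    return cycle <= Fraction(1, eye_limit_hz)

# A 1/120 s + 1/120 s cycle gives exactly 120 cycles per second: imperceptible.
print(is_flicker_free(Fraction(1, 120), Fraction(1, 120)))  # True
# A 1/100 s + 1/100 s cycle lasts 1/50 s > 1/60 s: visible flicker.
print(is_flicker_free(Fraction(1, 100), Fraction(1, 100)))  # False
```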
  • FIG. 1 is a schematic structural diagram of an augmented reality device provided by an embodiment of the present application;
  • FIG. 2 is a schematic structural diagram of the augmented reality device shown in FIG. 1 worn on the head of a user;
  • FIG. 3 is a simplified schematic diagram of the structure shown in FIG. 2;
  • FIG. 4 is an enlarged schematic diagram of area A in the structure shown in FIG. 3 in a first embodiment;
  • FIG. 5 is an enlarged schematic diagram of area A in the structure shown in FIG. 3 in a second embodiment;
  • FIG. 6 is an enlarged schematic diagram of area B in the structure shown in FIG. 5;
  • FIG. 7 is a schematic diagram of the working states of the image projector and the active shutter lens when the augmented reality device shown in FIG. 5 is working;
  • FIG. 8 is an enlarged schematic diagram of area A in the structure shown in FIG. 3 in a third embodiment;
  • FIG. 9 is an enlarged schematic diagram of area C in the structure shown in FIG. 8;
  • FIG. 10 is an enlarged schematic diagram of area A in the structure shown in FIG. 3 in a fourth embodiment;
  • FIG. 11 is an enlarged schematic diagram of area A in the structure shown in FIG. 3 in a fifth embodiment;
  • FIG. 12 is a schematic diagram of the working states of the image projector, the active shutter lens and the zoom when the augmented reality device shown in FIG. 11 is in operation.
  • FIG. 1 is a schematic structural diagram of an augmented reality device 100 provided by an embodiment of the present application.
  • the augmented reality device 100 may be an electronic product that combines digital content and a real scene, such as AR glasses, AR helmet, mixed reality (MR) glasses, or MR helmet.
  • the augmented reality device 100 of the embodiment shown in FIG. 1 uses AR glasses as an example for illustration.
  • the augmented reality device 100 includes a frame 10 and an augmented reality component 30 installed on the frame 10. Among them, there are two augmented reality components 30, and the two augmented reality components 30 are installed on the frame 10 at intervals.
  • the frame 10 includes a lens frame 11 and temples 12 connected to the lens frame 11. There are two temples 12, connected to opposite ends of the lens frame 11. It should be noted that in other embodiments, the frame 10 may instead include the lens frame 11 and a fixing strap connected to the lens frame 11, which is not specifically limited in this application.
  • the mirror frame 11 includes two frames 13 and a beam 14 connected between the two frames 13.
  • Each frame 13 includes a first frame 131 away from the cross beam 14 and a second frame 132 opposite to the first frame 131.
  • An accommodating cavity is provided inside the first frame 131, and the accommodating cavity of the first frame 131 is used for accommodating electronic components of the augmented reality device 100.
  • the cross beam 14 and the two frames 13 are integrally formed to simplify the molding process of the mirror frame 11 and increase the overall strength of the mirror frame 11.
  • the material of the frame 11 includes, but is not limited to, metal, plastic, resin, or natural materials. It should be understood that the spectacle frame 11 is not limited to the full-frame type spectacle frame shown in FIG. 1, and may also be a half-frame type or a frameless type spectacle frame.
  • the two temples 12 are rotatably connected to opposite ends of the spectacle frame 11. Specifically, the two temples 12 are respectively rotatably connected to the two frames 13 of the spectacle frame 11. The two temples 12 are respectively connected to the first frame 131 of the two frames 13.
  • when the augmented reality device 100 is in the unfolded state, the two temples 12 rotate relative to the lens frame 11 to face each other. At this time, the two temples 12 of the augmented reality device 100 can rest on the user's two ears and the beam 14 rests on the bridge of the user's nose, so that the device is worn on the user's head.
  • when the augmented reality device 100 is in the folded state, the two temples 12 rotate relative to the lens frame 11 to at least partially overlap each other and are housed inside the lens frame 11; at this time the augmented reality device 100 can be stored. It is understandable that in other embodiments, the two temples 12 may be fixedly connected to the first frames 131 of the two frames 13, or the two temples 12 may be integrally formed with the lens frame 11, so that the augmented reality device 100 is always in the unfolded state; this application does not specifically limit this. It should be noted that the inside of each temple 12 may also be provided with a receiving cavity, which may likewise house electronic components of the augmented reality device 100.
  • the terms “inside” and “outside” used in this application when referring to the augmented reality device 100 are mainly based on the orientation of the augmented reality device 100 when the user wears the head.
  • when the augmented reality device 100 is worn by the user, the side close to the user's head is the inner side and the side away from the user's head is the outer side; this does not limit the orientation of the augmented reality device 100 in other scenarios.
  • FIG. 2 is a schematic structural diagram of the augmented reality device 100 shown in Fig. 1 worn on the head of a user.
  • Fig. 3 is a simplified schematic diagram of the structure shown in Fig. 2.
  • the length direction of the augmented reality device 100 is defined as the X-axis direction
  • the width direction of the augmented reality device 100 is the Y-axis direction
  • the thickness direction of the augmented reality device 100 is the Z-axis direction, and the X-axis, Y-axis and Z-axis directions are perpendicular to one another.
  • the X-axis direction is the direction in which one frame 13 of the spectacle frame 11 faces the other frame 13
  • the Z-axis direction is the direction of the spectacle frame 11 toward the temple 12.
  • the structures of the two augmented reality components 30 are the same. Specifically, the two augmented reality components 30 are respectively installed on the two frames 13 of the spectacle frame 11. When the augmented reality device 100 is worn on the user’s head, one augmented reality component 30 corresponds to the user’s left eye, and the other augmented reality component 30 corresponds to the user’s right eye. At this time, the user’s eyes can be viewed through the two augmented reality components 30 Virtual scene and real scene. It should be noted that in other embodiments, the structures of the two augmented reality components 30 may also be different, which is not specifically limited in this application.
  • the structure of the augmented reality component 30 is described in detail by taking the augmented reality component 30 corresponding to the user's right eye as an example.
  • FIG. 4 is a schematic diagram of an enlarged structure of the area A in the structure shown in FIG. 3 in an embodiment.
  • the augmented reality component 30 includes a coupler (combiner) 31, an image projector 32, an active shutter lens 33 and a processor 34.
  • the coupler 31 is installed on the frame 10.
  • the coupler 31 includes an inner surface 312 and an outer surface 313 disposed opposite to each other.
  • the active shutter lens 33 is installed on the outer surface 313 of the coupler 31.
  • the image projector 32 is mounted on the frame 10.
  • the processor 34 is coupled to the image projector 32 and the active shutter lens 33 to control the opening and closing of the image projector 32 and the active shutter lens 33.
  • the two augmented reality components 30 may share a single processor 34, which is coupled to the image projectors 32 of both augmented reality components 30 at the same time to control the opening and closing of the two image projectors 32; this application does not specifically limit this.
  • the coupler 31 is mounted on the frame 11 of the frame 10.
  • the couplers 31 of the two augmented reality components 30 are arranged side by side along the X-axis direction.
  • the couplers 31 of the two augmented reality components 30 are installed on the spectacle frame 11 at intervals.
  • the coupler 31 is installed on the frame 13 of the spectacle frame 11.
  • the inner surface 312 of the coupler 31 is the surface of the coupler 31 facing the inner side of the lens frame 11. That is, the outer surface 313 of the coupler 31 is the surface of the coupler 31 facing the outside of the lens frame 11.
  • the coupler 31 is a device that uses diffractive optical waveguide technology to combine digital content with real scenes. It should be noted that in other embodiments, the coupler 31 may also be a device using birdbath, free-form surface, or reflective-array optical waveguide technology.
  • the coupler 31 includes a diffractive optical waveguide 314, an in-coupling grating 315 and an out-coupling grating 316.
  • the diffractive optical waveguide 314 is installed on the frame 13. One end of the diffractive optical waveguide 314 is installed on the first frame 131 of the frame 13 and is received in the receiving cavity 133 of the first frame 131. The other end of the diffractive optical waveguide 314 is mounted on the second frame 132 of the frame 13.
  • the diffractive optical waveguide 314 includes an inner surface and an outer surface disposed opposite to each other.
  • the inner surface of the diffractive optical waveguide 314 is the surface of the diffractive optical waveguide 314 facing the inner side of the lens frame 11. That is, the outer surface of the diffractive optical waveguide 314 is the surface of the diffractive optical waveguide 314 facing the outside of the lens frame 11.
  • the coupling-in grating 315 and the coupling-out grating 316 are both blazed gratings.
  • the coupling grating 315 is installed on the outer surface of the diffractive optical waveguide 314 and is located in the receiving cavity 133 of the first frame 131.
  • the out-coupling grating 316 is installed on the outer surface of the diffractive optical waveguide 314, is spaced apart from the in-coupling grating 315, and is located between the first frame 131 and the second frame 132.
  • the in-coupling grating 315 and the out-coupling grating 316 may also be transmissive gratings.
  • the in-coupling grating 315 and the out-coupling grating 316 are mounted on the inner surface of the diffractive optical waveguide 314.
  • the in-coupling grating 315 and the out-coupling grating 316 may also be holographic gratings, tilted gratings, polarization gratings, liquid crystal gratings, holographic optical elements or diffractive optical elements, which are not specifically limited in this application.
  • the grating refers to an optical device composed of a large number of parallel slits of equal width and equal spacing.
  • the grating can periodically adjust the amplitude or phase of the light, so the light will exit the grating surface from a direction different from the incident angle.
  • the term "grating" is used in this sense throughout the following text.
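As a numerical illustration of how such a grating can bend normally incident light steeply enough for total internal reflection in the waveguide, the sketch below applies the standard grating equation n·sin(θm) = m·λ/d + sin(θi). All numbers here (532 nm wavelength, 400 nm pitch, waveguide index 1.7) are assumed for illustration and are not taken from the patent.

```python
import math

def diffraction_angle_in_waveguide(wavelength_nm, pitch_nm, n_waveguide,
                                   m=1, incident_deg=0.0):
    """Diffraction angle (deg) of order m inside a waveguide of index
    n_waveguide, from the grating equation; None if the order is evanescent."""
    s = (m * wavelength_nm / pitch_nm
         + math.sin(math.radians(incident_deg))) / n_waveguide
    if abs(s) > 1.0:
        return None  # no propagating diffracted order
    return math.degrees(math.asin(s))

def critical_angle_deg(n_waveguide, n_outside=1.0):
    """Critical angle for total internal reflection at the waveguide surface."""
    return math.degrees(math.asin(n_outside / n_waveguide))

# Illustrative case: green light, 400 nm grating pitch, glass of index 1.7.
theta = diffraction_angle_in_waveguide(532.0, 400.0, 1.7)
crit = critical_angle_deg(1.7)
# The first-order beam (~51.5 deg) exceeds the critical angle (~36 deg),
# so it is trapped by total internal reflection and guided toward the
# out-coupling grating, as the waveguide description below requires.
print(round(theta, 1), round(crit, 1))
```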
  • the inner surface of the diffractive optical waveguide 314 is the inner surface 312 of the coupler 31.
  • the inner surface 312 of the coupler 31 includes a light incident area 3121 and a light output area 3122.
  • the light incident area 3121 of the inner surface 312 is located in the receiving cavity 133 of the first frame 131.
  • the light incident area 3121 of the inner surface 312 is the area covered by the projection of the coupling grating 315 on the inner surface 312. That is, the area directly opposite to the coupling grating 315 in the inner surface 312 of the coupler 31 is the light incident area 3121 of the inner surface 312.
  • the light emitting area 3122 and the light incident area 3121 of the inner surface 312 are spaced apart and located between the first frame 131 and the second frame 132.
  • the light exit area 3122 of the inner surface 312 is the area covered by the projection of the out-coupling grating 316 on the inner surface 312. That is, the area of the inner surface 312 directly opposite the out-coupling grating 316 is the light exit area 3122 of the inner surface 312.
  • the outer surface 313 of the coupler 31 includes the surface of the in-coupling grating 315 facing away from the diffractive optical waveguide 314, the surface of the out-coupling grating 316 facing away from the diffractive optical waveguide 314, and the area of the outer surface of the diffractive optical waveguide 314 not covered by the in-coupling grating 315 and the out-coupling grating 316. That is, the outer surface 313 of the coupler 31 includes the outer surface of the in-coupling grating 315, the outer surface of the out-coupling grating 316, and the uncovered area of the outer surface of the diffractive optical waveguide 314.
  • the outer surface 313 of the coupler 31 includes a light-emitting area 3131.
  • the light exit area 3131 of the outer surface 313 is the surface of the outcoupling grating 316 facing away from the diffractive optical waveguide 314, that is, the outer surface of the outcoupling grating 316.
  • the image projector 32 is located in the accommodating cavity 133 of the first frame 131 and is arranged opposite to the coupler 31. Specifically, the image projector 32 is located on the side of the diffractive optical waveguide 314 away from the coupling grating 315. That is, the image projector 32 and the coupling grating 315 are located on opposite sides of the diffractive optical waveguide 314, respectively. Among them, the image projector 32 is directly facing the light incident area 3121 of the inner surface 312. It can be understood that when the coupling grating 315 is a transmissive grating, the image projector 32 and the coupling grating 315 are located on the same side of the diffractive optical waveguide 314.
  • the image projector 32 may also be located in the receiving cavity of the temple 12 (that is, inside the temple 12), or may be partially located in the receiving cavity 133 of the first frame 131 and partially in the receiving cavity of the temple 12, or may be located in neither cavity but directly exposed on the surface of the frame 13, as long as the user's line of sight is not blocked when the augmented reality device 100 is in use.
  • the image projector 32 includes, but is not limited to, a projector based on liquid crystal on silicon (LCOS), digital light processing (DLP), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum dot light-emitting diode (QLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro OLED, or laser micro-electro-mechanical systems (laser MEMS).
  • when the processor 34 turns on the image projector 32, that is, when the image projector 32 is in the on state, the image projector 32 projects the display light L0 to the coupler 31; part of the display light L0 exits from the inner surface 312 of the coupler 31, and part of the display light L0 exits from the outer surface 313 of the coupler 31.
  • specifically, the image projector 32 projects the display light L0 carrying digital content; the display light L0 enters the coupler 31 from the light incident area 3121 of the inner surface 312 and exits through the light exit area 3122 of the inner surface 312 and the light exit area 3131 of the outer surface 313.
  • the display light L0 is incident perpendicular to the inner surface of the diffractive optical waveguide 314 (that is, the inner surface 312 of the coupler 31), passes from the light incident area 3121 of the inner surface 312 perpendicularly onto the in-coupling grating 315, and is diffracted by it.
  • the in-coupling grating 315 adjusts the propagation direction of the display light L0 to a state that satisfies the condition of total reflection.
  • the light L 0 undergoes at least one total reflection in the diffractive optical waveguide 314 and propagates toward the out-coupling grating 316 until it reaches the out-coupling grating 316 and diffracts.
  • part of the display light L0 is diffracted and propagates from the light exit area 3122 of the inner surface 312 toward the inside of the coupler 31, that is, toward the human eye.
  • this part of the light is denoted the eye-entering light L1; it can enter the human eye for imaging, so that the user can see a virtual scene carrying digital content.
  • a part of the light L 0 is diffracted and propagates from the light exit area 3131 of the outer surface 313 toward the outside of the coupler 31.
  • this part of the light is denoted the leaked light L2. It can be understood that when the processor 34 turns off the image projector 32, that is, when the image projector 32 is in the off state, the image projector 32 does not project the display light L0; at this time, no eye-entering light L1 enters the human eye for imaging and no leaked light L2 propagates to the outside of the coupler 31.
  • the active shutter lens 33 is located on the side of the coupler 31 away from the image projector 32; that is, the active shutter lens 33 and the image projector 32 are located on opposite sides of the coupler 31.
  • in this embodiment, the active shutter lens 33 is a lens based on an electrochromic material (other than liquid crystal). It should be understood that the active shutter lens 33 is a lens that can be quickly opened and closed under the control of the processor 34.
  • the processor 34 turns on the active shutter glass 33, that is, when the active shutter glass 33 is in the open state, the transmittance of the active shutter glass 33 is relatively high, and light can pass through the active shutter glass 33 to propagate normally.
  • when the processor 34 closes the active shutter lens 33, that is, when the active shutter lens 33 is in the closed state, the transmittance of the active shutter lens 33 is close to 0 and it blocks light: light cannot pass through the active shutter lens 33, which is equivalent to the active shutter lens 33 absorbing the light.
  • both ends of the active shutter lens 33 can be respectively mounted on the outer surface 313 of the coupler 31 through a sealant, leaving an air gap between the active shutter lens 33 and the outer surface 313; the width d of the air gap is about 50 μm. It should be understood that since the thicknesses of the in-coupling grating 315 and the out-coupling grating 316 are in the nanometer range, the active shutter lens 33 does not contact the in-coupling grating 315 or the out-coupling grating 316.
  • the active shutter lens 33 covers the outer surface 313 of the coupler 31 to ensure the integrity and consistency of the appearance of the augmented reality device 100 and improve its aesthetics. That is, the active shutter lens 33 covers the outer surface of the in-coupling grating 315, the outer surface of the out-coupling grating 316, and the portions of the outer surface of the diffractive optical waveguide 314 not covered by the two gratings. In this arrangement, the active shutter lens 33 can also act as a protective cover to protect the in-coupling grating 315 and the out-coupling grating 316.
  • the active shutter lens 33 may cover only the light exit area 3131 of the outer surface 313, that is, only the outer surface of the out-coupling grating 316. It is understandable that, compared with covering only the light exit area 3131, having the active shutter lens 33 cover the entire outer surface 313 of the coupler 31 not only reduces the difficulty of the assembly process of the active shutter lens 33, but also avoids additional machining of the active shutter lens 33, reducing its processing difficulty and production cost.
  • the processor 34 is located in the receiving cavity 133 of the first frame 131 and is electrically connected to the image projector 32 and the active shutter lens 33.
  • the processor 34 may include one or more processing units.
  • the multiple processing units may be, for example, an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the processor 34 may be a central processing unit (CPU) of the augmented reality device 100, or may be another processor of the augmented reality device 100.
  • the processor 34 is used to control the opening and closing of the image projector 32, and to synchronously control the closing and opening of the active shutter lens 33. That is, the processor 34 turns on the image projector 32 while closing the active shutter glass 33, and the processor 34 turns off the image projector 32 and turns on the active shutter glass 33 at the same time. That is, the processor 34 synchronously switches the state of the image projector 32 and the active shutter glass 33.
  • when the processor 34 turns on the image projector 32 and closes the active shutter lens 33, the active shutter lens 33 shields the display light L0 emitted from the outer surface 313 of the coupler 31. Specifically, after the display light L0 projected by the image projector 32 enters the coupler 31 from the light incident area 3121 of the inner surface 312, the eye-entering light L1 exits the light exit area 3122 of the inner surface 312 and enters the human eye for imaging, while the leaked light L2 exits the light exit area 3131 of the outer surface 313 toward the active shutter lens 33.
  • at this time, the transmittance of the active shutter lens 33 is close to 0, and the active shutter lens 33 blocks, in effect absorbs, the leaked light L2, preventing the leaked light L2 emitted from the outer surface 313 of the coupler 31 from passing through the active shutter lens 33 into the external environment. Avoiding the leakage of light carrying digital content not only improves the user's privacy and the sociality of the augmented reality device 100, but also prevents the leaked light L2 from forming a small bright display window on the surface of the augmented reality device 100, improving the user's appearance when using it.
  • when the processor 34 turns off the image projector 32 and opens the active shutter lens 33, that is, when the image projector 32 is in the off state and the active shutter lens 33 is in the open state, the ambient light Lc can pass through the active shutter lens 33, enter the coupler 31 from its outer surface 313, and exit through its inner surface 312.
  • at this time, the transmittance of the active shutter lens 33 is relatively high, so the ambient light Lc passes through the active shutter lens 33 into the coupler 31, propagates from the inner surface 312 of the coupler 31 toward the human eye, and enters the human eye for imaging.
  • human eyes can see the real scene of the outside world through the active shutter lens 33 and the coupler 31.
  • the image projector 32 since the image projector 32 is turned off, the image projector 32 does not project the display light L 0 carrying digital content, neither the eye- catching light L 1 enters the human eye, nor the leaking light L 2 leaks out of the augmented reality device 100 . That is, the human eye can only see the real scene of the outside world.
  • the processor 34 includes a control unit and a storage unit.
  • the control unit is used to control the opening and closing of the image projector 32 and the active shutter lens 33.
  • the storage unit is used to store the preset frequency f 0 , and the preset frequency f 0 is equal to or greater than 60 Hz.
  • the image projector 32 and the active shutter lens 33 are each in different states in the first time period and the second time period. In the first time period, the image projector 32 is in the on state and the active shutter lens 33 is in the closed state. In the second time period, the image projector 32 is in the off state and the active shutter lens 33 is in the open state.
  • the first time period and the second time period together form one cycle T, and T is less than or equal to 1/60 second.
  • one second therefore includes at least 60 cycles, that is, the first time period and the second time period each occur at least 60 times in one second. In other words, the image projector 32 alternates between the on and off states at a frequency of at least 120 Hz.
  • likewise, the active shutter lens 33 alternates between the closed and open states at a frequency of at least 120 Hz.
  • the flicker frequency perceivable by the human eye is about 60 Hz. Since the preset switching frequency is greater than the refresh frequency perceivable by the human eye, according to the persistence of vision phenomenon (also known as the visual pause phenomenon or the afterglow effect), when the augmented reality device 100 is working, both the display light L 0 projected by the image projector 32 and the ambient light Lc enter the human eye; that is, the human eye can see both the virtual scene and the real outside scene. Moreover, the display light projected by the image projector 32 does not leak out of the augmented reality device 100.
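The synchronized switching described above can be sketched as a simple timing model. This is an illustration only, not from the patent; all names (`schedule`, `duty_on`) are invented for the sketch.

```python
# Minimal sketch of the synchronized projector/shutter switching.
# One cycle T = 1/f0; the projector is on (shutter closed) in the first
# part of each cycle, and off (shutter open) in the second part.

PRESET_FREQUENCY_HZ = 60                 # f0: at least 60 full cycles per second
PERIOD_S = 1.0 / PRESET_FREQUENCY_HZ     # T <= 1/60 s

def schedule(num_cycles, duty_on=0.5):
    """Return (time, projector_on, shutter_open) state changes, two per cycle."""
    events = []
    for k in range(num_cycles):
        t0 = k * PERIOD_S
        events.append((t0, True, False))                       # first time period
        events.append((t0 + duty_on * PERIOD_S, False, True))  # second time period
    return events

events = schedule(num_cycles=PRESET_FREQUENCY_HZ)
# The projector and the open shutter are never active simultaneously,
# so leaked display light is always blocked.
assert all(proj != shut for _, proj, shut in events)
# 60 cycles -> 120 state changes per second, above the ~60 Hz flicker limit.
print(len(events))  # 120
```

With a 50% duty cycle this reproduces the at-least-120 Hz alternation described above; changing `duty_on` changes the transmittance trade-off discussed later.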
  • in summary, the augmented reality device 100 shown in this embodiment can block the display light leaked from the coupler 31 while ensuring the transmittance of the augmented reality device 100, which not only improves the user's privacy and the sociability of the augmented reality device 100, but also improves the appearance when the user uses the augmented reality device 100.
  • FIG. 5 is a schematic diagram of an enlarged structure of the area A in the structure shown in FIG. 3 under another embodiment.
  • the difference between the augmented reality device shown in this embodiment and the augmented reality device 100 shown in the foregoing embodiment is that the active shutter lens 33 is a liquid crystal light valve.
  • the active shutter glass 33 includes a liquid crystal cell 331, a first polarizer 332 and a second polarizer 333.
  • the liquid crystal cell 331 is coupled to the processor 34.
  • the first polarizer 332 is located on the side of the liquid crystal cell 331 facing away from the coupler 31, and covers the surface of the liquid crystal cell 331 facing away from the coupler 31. That is, the first polarizer 332 covers the outer surface of the liquid crystal cell 331.
  • the second polarizer 333 is located between the liquid crystal cell 331 and the coupler 31.
  • the second polarizer 333 is located on the side of the liquid crystal cell 331 away from the first polarizer 332, that is, the second polarizer 333 is located on the side of the liquid crystal cell 331 facing the coupler 31.
  • the second polarizer 333 covers the inner surface of the liquid crystal cell 331. That is, the second polarizer 333 covers the surface of the liquid crystal cell 331 facing the coupler 31.
  • the transmission axis directions of the first polarizer 332 and the second polarizer 333 are perpendicular to each other. That is, the polarization direction of light transmitted through the first polarizer 332 and the polarization direction of light transmitted through the second polarizer 333 are perpendicular to each other.
  • the liquid crystal light valve is an optical device that uses voltage to control the refractive index of liquid crystal molecules to achieve phase retardation of light. According to the working principle of liquid crystal molecules, only the polarized light in the same direction as the long axis of the liquid crystal can pass through the liquid crystal cell 331.
  • the first polarizer 332 is used to change the polarization state of the incident light striking the outer surface of the liquid crystal cell 331, converting the incident light into linearly polarized light, so that the incident light can pass through the liquid crystal cell 331 and the second polarizer 333 toward the outer surface 313 of the coupler 31.
  • FIG. 6 is a schematic diagram of an enlarged structure of area B in the structure shown in FIG. 5. It should be noted that in the drawings of the present application, the straight line with arrows at both ends shown in the circle on the right of the figure represents the polarization state of the light at the interface position, and the description of the drawings will be understood in the same way hereinafter.
  • in this embodiment, the active shutter lens 33 is an in-plane switching (IPS) type liquid crystal light valve.
  • when the processor 34 opens the active shutter lens 33, that is, when the active shutter lens 33 is in the open state, the liquid crystal light valve is in the powered-on state and there is a voltage difference across the liquid crystal layer in the liquid crystal cell 331.
  • the liquid crystal cell 331 retards the phase of the light exiting the first polarizer 332 by π, rotating its polarization direction by 90 degrees.
  • since the transmission axis of the second polarizer 333 is perpendicular to that of the first polarizer 332, the light exiting the liquid crystal cell 331 can pass through the second polarizer 333 toward the outer surface 313 of the coupler 31. That is, the ambient light L C can pass through the active shutter lens 33 and enter the human eye from the inner surface 312 of the coupler 31 for imaging, ensuring that the user can observe the real outside scene.
  • the natural light transmittance of the active shutter lens 33 is between 35% and 50%.
  • when the processor 34 closes the active shutter lens 33, that is, when the active shutter lens 33 is in the closed state, the liquid crystal light valve is in the powered-off state, and the voltage difference across the liquid crystal layer in the liquid crystal cell 331 is zero.
  • the ambient light L C enters the liquid crystal cell 331 after being filtered by the first polarizer 332, and the liquid crystal cell 331 does not change the phase of the light exiting the first polarizer 332.
  • since the transmission axis of the second polarizer 333 is perpendicular to that of the first polarizer 332, the light exiting the liquid crystal cell 331 cannot pass through the second polarizer 333 toward the outer surface 313 of the coupler 31, and is completely blocked by the second polarizer 333. That is, the ambient light L C cannot pass through the active shutter lens 33; the active shutter lens 33 completely absorbs the ambient light L C.
  • in another embodiment, the active shutter lens 33 is a twisted nematic (TN) type liquid crystal light valve.
  • when the processor 34 opens the active shutter lens 33, that is, when the active shutter lens 33 is in the open state, the liquid crystal light valve is in the powered-off state, that is, the voltage difference across the liquid crystal layer in the liquid crystal cell 331 is zero.
  • the ambient light enters the liquid crystal cell 331 after being filtered by the first polarizer 332.
  • the liquid crystal cell 331 retards the phase of the light exiting the first polarizer 332 by π. Since the transmission axis directions of the second polarizer 333 and the first polarizer 332 are perpendicular, the light exiting the liquid crystal cell 331 can pass through the second polarizer 333 toward the outer surface 313 of the coupler 31.
  • when the processor 34 closes the active shutter lens 33, that is, when the active shutter lens 33 is in the closed state, the liquid crystal light valve is in the powered-on state, that is, there is a voltage difference across the liquid crystal layer in the liquid crystal cell 331, and the liquid crystal in the liquid crystal layer rotates into a state perpendicular to the first polarizer 332.
  • the ambient light enters the liquid crystal cell 331 after being filtered by the first polarizer 332.
  • the liquid crystal cell 331 does not change the phase of the light exiting the first polarizer 332. Since the transmission axis of the second polarizer 333 is perpendicular to that of the first polarizer 332, the light exiting the liquid crystal cell 331 cannot pass through the second polarizer 333 toward the outer surface of the coupler, and is thus completely blocked by the second polarizer 333.
  • in other embodiments, the active shutter lens 33 may also be a vertical alignment (VA) type liquid crystal light valve, a super twisted nematic (STN) type liquid crystal light valve, or a ferroelectric liquid crystal (FLC) type liquid crystal light valve.
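The open/closed behavior of the crossed-polarizer light valve described above can be verified with standard Jones calculus. This is a minimal numerical sketch, not from the patent: the liquid crystal cell is idealized as a wave plate at 45 degrees whose retardance is π (polarization rotated 90 degrees, light transmitted) or 0 (light blocked).

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of a linear polarizer with transmission axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]])

def retarder(delta, theta):
    """Jones matrix of a wave plate with retardance delta, fast axis at theta."""
    rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    return rot @ np.array([[1, 0], [0, np.exp(1j * delta)]]) @ rot.T

incident = np.array([1.0, 0.0])   # light already filtered by the first polarizer (axis 0 deg)
analyzer = polarizer(np.pi / 2)   # second polarizer, crossed (axis 90 deg)

# "Open" state: the cell retards the phase by pi (half-wave behavior at 45 deg),
# rotating the polarization 90 deg so it passes the crossed second polarizer.
open_out = analyzer @ retarder(np.pi, np.pi / 4) @ incident
# "Closed" state: the cell leaves the phase unchanged, so the crossed
# second polarizer blocks the light completely.
closed_out = analyzer @ retarder(0.0, np.pi / 4) @ incident

print(round(float(np.linalg.norm(open_out) ** 2), 6))    # 1.0: transmitted
print(round(float(np.linalg.norm(closed_out) ** 2), 6))  # 0.0: blocked
```

The same two matrices model both the IPS case (powered on transmits) and the TN case (powered off transmits); only the mapping from drive voltage to retardance differs.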
  • the time period 0-t 12 is taken as an example for description.
  • FIG. 7 is a schematic diagram of the working state of the image projector 32 and the active shutter lens 33 when the augmented reality device 100 shown in FIG. 5 is in operation.
  • the processor 34 turns on the image projector 32 and closes the active shutter lens 33 to ensure that the display light projected by the image projector 32 will not leak from the augmented reality device 100.
  • the processor 34 turns off the image projector 32 and opens the active shutter lens 33, so that the human eye can see the real outside scene through the active shutter lens 33 and the coupler 31.
  • time periods 0-t 1 , t 2 -t 3 , t 4 -t 5 , t 6 -t 7 , t 8 -t 9 and t 10 -t 11 are all the first time periods mentioned above
  • the time periods t 1 -t 2 , t 3 -t 4 , t 5 -t 6 , t 7 -t 8 , t 9 -t 10 and t 11 -t 12 are all the second time periods mentioned above.
  • the transmittance of the augmented reality device 100 is between 17.5% and 25%.
  • the transmittance of the augmented reality device 100 shown in this embodiment can be adjusted by adjusting the proportion of time the image projector 32 spends in the on and off states (that is, the time the active shutter lens 33 spends in the closed and open states).
  • for example, when the duration of the image projector 32 in the on state accounts for 20% of the whole cycle, that is, when the duration of the image projector 32 in the off state accounts for 80% of the whole cycle, the transmittance of the augmented reality device 100 is reduced by only 20%; that is, the transmittance of the augmented reality device 100 is between 28% and 40%. In other words, the leaked light L 2 of the coupler 31 can be shielded while ensuring the transmittance of the augmented reality device 100.
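The duty-cycle arithmetic above can be written out explicitly. This is an illustrative sketch only; the function name and the assumption that transmittance scales linearly with the shutter's open fraction are mine, using the 35-50% open-state transmittance figure given earlier.

```python
# Overall transmittance = shutter open-state transmittance x fraction of
# time the shutter is open (the shutter is open only while the projector is off).

def overall_transmittance(shutter_transmittance, projector_on_fraction):
    open_fraction = 1.0 - projector_on_fraction
    return shutter_transmittance * open_fraction

# 50/50 duty cycle: the 35-50% shutter transmittance is halved to 17.5-25%.
print(round(overall_transmittance(0.35, 0.5), 4), round(overall_transmittance(0.50, 0.5), 4))
# 20% projector on-time: transmittance is reduced by only 20%, i.e. 28-40%.
print(round(overall_transmittance(0.35, 0.2), 4), round(overall_transmittance(0.50, 0.2), 4))
```

This reproduces both numeric ranges quoted in the text (17.5-25% and 28-40%).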
  • in addition, since the response time for closing the liquid crystal light valve (millisecond level) is much longer than the response time for turning on the image projector 32 (microsecond or nanosecond level), to ensure that the active shutter lens 33 can effectively block the leaked light L 2 in time, the time at which the active shutter lens 33 closes should be no later than the time at which the image projector 32 turns on. That is, the time point at which the active shutter lens 33 closes should be earlier than, or the same as, the time point at which the image projector 32 turns on. Assuming the response time of the liquid crystal light valve is t r , the active shutter lens 33 should ideally begin to close t r ahead of the time point at which the image projector 32 turns on.
  • generally, the response time for closing the liquid crystal light valve is about 1 ms to 2 ms; that is, the active shutter lens 33 can start to close 1 ms to 2 ms before the image projector 32 is turned on, to ensure that the active shutter lens 33 can completely block the leaked light L 2 in time.
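The lead-time rule above amounts to one line of scheduling arithmetic. The sketch below is illustrative (names and the 1.5 ms value are assumptions within the 1-2 ms range stated above), not the patent's control logic.

```python
# Schedule the shutter-close command t_r seconds before the projector-on
# command, so the slow liquid-crystal shutter (~ms) is fully closed by the
# time the fast projector (~us/ns) lights up.

LC_RESPONSE_TIME_S = 0.0015   # assumed t_r, within the 1-2 ms range above

def shutter_close_time(projector_on_time_s, response_time_s=LC_RESPONSE_TIME_S):
    """The shutter must start closing no later than t_on - t_r."""
    return projector_on_time_s - response_time_s

t_on = 0.010                        # projector scheduled to turn on at 10 ms
t_close = shutter_close_time(t_on)
assert t_close <= t_on              # never later than projector turn-on
print(round(t_close * 1000, 3))     # 8.5 (ms)
```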
  • FIG. 8 is a schematic diagram of an enlarged structure of the area A in the structure shown in FIG. 3 under the third embodiment.
  • the augmented reality device 100 also includes a quarter wave plate 40 (also called a quarter delay plate)
  • the quarter wave plate 40 covers the surface of the first polarizer 332 facing away from the liquid crystal cell 331; that is, the quarter wave plate 40 covers the outer surface of the first polarizer 332.
  • the quarter wave plate is a birefringent single-crystal wave plate of a certain thickness. When light passes through the quarter wave plate, it undergoes birefringence and is split into ordinary light and extraordinary light. Ordinary light is light that obeys the law of refraction, and extraordinary light is light that does not; the phase difference between the ordinary light and the extraordinary light is equal to π/2 or an odd multiple thereof.
  • the quarter wave plate is an achromatic quarter wave plate; that is, the phase delay of the wave plate for light in the visible waveband is π/2, to ensure that the visible light in the ambient light can enter the human eye for imaging.
  • FIG. 9 is a schematic diagram of an enlarged structure of area C in the structure shown in FIG. 8.
  • the angle between the fast axis direction of the quarter wave plate 40 and the light transmission axis direction of the first polarizer 332 is 45 degrees. That is, the fast axis of the quarter wave plate 40 is set at an angle of 45 degrees to the polarization direction of the linearly polarized light that can pass through the first polarizer 332. It should be understood that many electronic screens commonly used in daily life are liquid crystal displays (LCDs), whose emitted light is linearly polarized.
  • when the augmented reality device 100 shown in this embodiment is worn on the user's head and an electronic screen is viewed through it, even as the line of sight rotates around the screen, and regardless of whether the polarization direction of the light emitted by the screen is perpendicular or parallel to the transmission axis of the first polarizer 332, the quarter wave plate 40 converts the linearly polarized light emitted by the screen into circularly polarized light, so that the screen's emitted light is attenuated by 50%.
  • the first polarizer 332 then converts the circularly polarized light into linearly polarized light, which enters the liquid crystal cell 331 and passes through the liquid crystal cell 331 and the coupler 31 into the human eye, reducing the brightness differences when the user views the electronic screen and helping to improve the experience of watching an electronic screen while wearing the augmented reality device 100.
  • in other words, when the augmented reality device 100 shown in this embodiment is worn on the user's head, there is no need to remove the augmented reality device 100; the active shutter lens 33 only needs to be opened to watch an electronic screen in the surrounding environment, which improves the ease of use of the augmented reality device 100.
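The claim that a quarter wave plate at 45 degrees to the polarizer removes orientation-dependent dimming can be checked with Jones calculus. This is an illustrative sketch, not from the patent; it shows that linearly polarized screen light at any angle is transmitted at exactly 50%.

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of a linear polarizer with transmission axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]])

def quarter_wave(theta):
    """Quarter wave plate (retardance pi/2) with fast axis at angle theta."""
    rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    return rot @ np.array([[1, 0], [0, 1j]]) @ rot.T

# Quarter wave plate with fast axis at 45 deg to the first polarizer's
# transmission axis (taken as 0 deg), as described in the text.
system = polarizer(0.0) @ quarter_wave(np.pi / 4)

# Linearly polarized screen light at various orientations: the transmitted
# intensity is always 50%, so no dark orientations appear as the head rotates.
for angle_deg in (0, 30, 45, 90, 137):
    a = np.radians(angle_deg)
    e_in = np.array([np.cos(a), np.sin(a)])
    print(angle_deg, round(float(np.linalg.norm(system @ e_in) ** 2), 6))
```

Every printed intensity is 0.5, matching the 50% attenuation stated above.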
  • the active shutter glasses 33 of the two augmented reality components 30 both include a liquid crystal cell 331, a first polarizer 332, and a second polarizer 333.
  • the liquid crystal cell 331 is coupled with the processor 34, the first polarizer 332 covers the outer surface of the liquid crystal cell 331, and the second polarizer 333 covers the inner surface of the liquid crystal cell 331.
  • the ambient light Lc, after being filtered by the first polarizer 332, can pass through the liquid crystal cell 331 and the second polarizer 333 in sequence toward the outer surface 313 of the coupler 31, and exit from the inner surface 312 of the coupler 31 toward the human eye, so that both the left eye and the right eye of the user can view the real outside environment through the active shutter lens 33 and the coupler 31.
  • a quarter wave plate 40 covers the outer surface of a first polarizer 332, and the angle between its fast axis direction and the light transmission axis direction of the first polarizer 332 is 45 degrees.
  • the other quarter wave plate 40 covers the outer surface of the other first polarizer 332, and the angle between its fast axis direction and that first polarizer's light transmission axis direction is 45 degrees. That is to say, the angle between the fast axis direction of each quarter wave plate 40 and the light transmission axis direction of the first polarizer 332 it covers is 45 degrees, so as to ensure that when the user wears the augmented reality device 100 to watch an electronic screen, the brightness difference between the screen images viewed by the two eyes is small, improving the user's comfort when watching an electronic screen with the augmented reality device 100.
  • in some embodiments, the light transmission axis directions of the two first polarizers 332 are the same and the angle between the fast axis directions of the two quarter wave plates 40 is 90 degrees; or the angle between the light transmission axis directions of the two first polarizers 332 is 90 degrees and the fast axis directions of the two quarter wave plates 40 are the same. Either arrangement ensures that the two augmented reality components 30 pass polarized light of mutually perpendicular polarization states, such as left-handed circularly polarized light and right-handed circularly polarized light, allowing the augmented reality device 100 to be used in a three-dimensional (3D) movie theater.
  • that is, when the processor 34 opens the active shutter lens 33, the augmented reality device 100 shown in this embodiment can be used not only to view a display combining the virtual and the real, but also to watch 3D video. The augmented reality device 100 is thus compatible with both the polarized and the active shutter projection methods.
  • FIG. 10 is an enlarged schematic diagram of the structure of area A in the structure shown in FIG. 3 under the fourth embodiment.
  • the augmented reality device 100 further includes a zoom device 50.
  • the zoom device 50 is mounted on the inner surface 312 of the coupler 31 and covers the inner surface 312 of the coupler 31. That is, the zoom device 50 is located on the side of the coupler 31 close to the human eye and is used to correct the user's eyesight.
  • the zoom device 50 can correct the user's refractive error when the user is viewing a virtual scene carrying digital content or the real outside scene, improving the clarity with which the user views the virtual scene or the real outside scene.
  • the zoom device 50 may be a device capable of zooming, such as a liquid crystal lens, a liquid lens, an Alvarez lens, or a mechanical zoom lens. It should be understood that the zoomer 50 may be an optical device with a fixed power such as a lens with a power, or an optical device with an adjustable power coupled to the processor 34.
  • before the user uses the augmented reality device 100, the optical power of the zoomer 50 can be adjusted according to the user's diopter to match the user's eyesight, so as to improve the adaptability of the augmented reality device 100 and thereby the flexibility of using the augmented reality device 100.
  • FIG. 11 is an enlarged schematic diagram of the structure of area A in the structure shown in FIG. 3 under the fifth embodiment.
  • the augmented reality device 100 further includes an eye tracking component 60.
  • the eye tracking component 60 is installed on the frame 10 to track the line of sight of the eyeball.
  • the processor 34 is coupled to the zoom device 50 and the eye tracking component 60 to adjust the optical power of the zoom device 50.
  • the eye tracking component 60 is installed on the frame 11 of the frame 10 and faces the inner side of the frame 11.
  • the eye tracker 60 includes one or more infrared light-emitting diodes (IR LEDs) 61 and one or more infrared cameras (IR cameras) 62.
  • the infrared light-emitting diode 61 is installed on the first frame 131 and faces the inner side of the frame 11.
  • the infrared camera 62 is installed on the second frame 133 and faces the inner side of the frame 11.
  • the infrared light-emitting diode 61 emits infrared light, which enters the user's eyeball and, after being reflected by the user's cornea, enters the infrared camera 62 to form an image.
  • the processor 34 determines the direction of the user's optical axis by determining the position of the infrared light spot in the image, and then determines the direction of the user's line of sight after calibration. It should be noted that the eye tracker 60 shown in this embodiment is not limited to the eye tracking technology described above; other eye tracking technologies may be used, which is not specifically limited in this application.
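The glint-based gaze estimation described above can be gestured at with a highly simplified sketch. The patent only states that the processor locates the infrared spot and applies a calibration; the linear pupil-to-glint mapping and all constants below are invented illustrations of that idea, not the patent's method.

```python
# Simplified pupil-centre / corneal-reflection gaze sketch: map the vector
# from the glint (corneal reflection) to the pupil centre, in image pixels,
# to approximate gaze angles via a per-user linear calibration.

def gaze_direction(pupil_px, glint_px, gain, offset):
    """Map the glint-to-pupil vector (pixels) to gaze angles (degrees)."""
    dx = pupil_px[0] - glint_px[0]
    dy = pupil_px[1] - glint_px[1]
    return (gain[0] * dx + offset[0], gain[1] * dy + offset[1])

# Calibration constants would come from having the user fixate known targets.
gain, offset = (0.5, 0.5), (0.0, 0.0)
print(gaze_direction((102, 98), (100, 100), gain, offset))  # (1.0, -1.0)
```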
  • when the processor 34 turns off the image projector 32 and adjusts the optical power of the zoomer 50 to the first optical power, that is, when the image projector 32 is in the off state and the optical power of the zoomer 50 is the first optical power, the zoomer 50 can correct the user's refractive error while the user is viewing the real outside scene, improving the clarity with which the user views the real scene and the user's experience.
  • the first refractive power is the refractive power of the user's eyeball.
  • the eye tracking component 60 obtains the convergence depth of the virtual scene viewed by the eyeball, and the processor 34 adjusts the optical power of the zoom device 50 to the second optical power according to the result obtained by the eye tracking component 60.
  • the eye tracking component 60 tracks the line of sight of the eyeball, and determines the convergence depth of the virtual scene observed by the user according to the direction of the user's line of sight.
  • the processor 34 changes the virtual image distance of the virtual scene according to the convergence depth, adjusting the position of the virtual scene to this convergence depth.
  • the second optical power is the sum of the first optical power and the reciprocal of the virtual image depth observed by the user.
  • the zoomer 50 can not only correct the user's refractive error when the user is observing virtual digital content, improving the clarity of the digital content and the user experience, but can also change the virtual image distance of the digital content to alleviate the vergence-accommodation conflict (VAC), reducing the user's discomfort and improving the user's comfort when using the augmented reality device 100.
  • FIG. 12 is a schematic diagram of the working state of the image projector 32, the active shutter lens 33 and the zoomer 50 when the augmented reality device 100 shown in FIG. 11 is in operation.
  • when the augmented reality device 100 is working, within the time periods 0-t 1 , t 2 -t 3 , t 4 -t 5 , t 6 -t 7 , t 8 -t 9 and t 10 -t 11 , the image projector 32 is in the on state and the active shutter lens 33 is in the closed state.
  • the processor 34 determines that the depth of the virtual image observed by the user is L (for example, 0.5 m) according to the direction of the user's line of sight obtained by the eye tracker 60; the reciprocal of the virtual image depth, ΔD, is then 1/L (for example, -2.0 D, the virtual image distance being negative), and the optical power of the zoomer 50 is adjusted to D 0 + ΔD (for example, -6.0 D).
  • the second optical power of the zoomer 50 is D 0 + ⁇ D, which can not only ensure that the display light projected by the image projector 32 will not leak from the augmented reality device 100, but also ensure that the user can clearly watch the digital content .
  • the image projector 32 is in the off state and the active shutter lens 33 is in the open state, and the processor 34 adjusts the optical power of the zoomer 50 to D 0 .
  • the first optical power of the zoomer 50 is D 0 to ensure that the human eye can clearly see the real scene of the outside world through the active shutter lens 33 and the coupler 31.
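The two-state power schedule above reduces to simple dioptre arithmetic. The sketch below is illustrative only: the function name is invented, D 0 = -4.0 D is an assumed user prescription, and the virtual image depth is expressed as a positive distance with the sign of ΔD applied explicitly, matching the worked example (0.5 m, ΔD = -2.0 D).

```python
# Zoomer power schedule: D0 alone for the real scene (projector off),
# D0 + dD for a virtual image at depth L metres (projector on),
# where dD = -1/L in dioptres for a virtual image in front of the eye.

def zoom_power(projector_on, d0, virtual_image_depth_m):
    if projector_on:                      # first time period: view digital content
        return d0 + (-1.0 / virtual_image_depth_m)
    return d0                             # second time period: view the real scene

D0 = -4.0                                 # assumed user prescription, dioptres
print(zoom_power(True, D0, 0.5))          # -6.0: matches the example above
print(zoom_power(False, D0, 0.5))         # -4.0
```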
  • the present application also provides a display method of any of the above-mentioned augmented reality devices 100, including:
  • the image projector 32 is turned on and the active shutter lens 33 is closed.
  • the image projector 32 projects a display light L 0 to the coupler 31, part of the display light L 0 emerges from the inner surface 312 of the coupler 31, and part of the display light L 0 exits from the outer surface 313 of the coupler 31, and the active shutter lens 33 blocks the display light L 0 exiting from the outer surface 313 of the coupler 31.
  • the processor 34 turns on the image projector 32 and turns off the active shutter lens 33.
  • the active shutter lens 33 prevents the display light L 0 emitted from the outer surface 313 of the coupler 31 from entering the external environment, thereby avoiding the leakage of display light L 0 carrying digital content. This not only improves the user's privacy and the sociability of the augmented reality device 100, but also prevents the leaked display light L 0 from forming a small display window on the surface of the augmented reality device 100, improving the appearance when the user uses the augmented reality device 100.
  • the image projector 32 is turned off and the active shutter lens 33 is opened. After the ambient light Lc passes through the active shutter lens 33, it enters the coupler 31 from the outer surface 313 of the coupler 31 and exits from the inner surface 312 of the coupler 31. Specifically, the processor 34 turns off the image projector 32 and opens the active shutter lens 33, and the user can view the real outside scene through the coupler 31 and the active shutter lens 33, ensuring that the augmented reality device 100 has a certain transmittance.
  • the length of the second period is equal to the length of the first period. It should be noted that in other embodiments, the length of the second time period may also be greater or less than the length of the first time period, which is not specifically limited in this application.
  • the first time period and the second time period alternate.
  • the first time period and the second time period form a cycle, and a cycle is less than or equal to 1/60 second.
  • the flicker frequency perceivable by the human eye is 60 Hz.
  • one cycle is less than or equal to 1/60 second. That is, one second includes at least 60 cycles. That is, the first period and the second period occur at least 60 times within 1 second.
  • the alternating frequency of the first time period and the second time period is greater than 120Hz.
  • according to the persistence of vision phenomenon (also known as the visual pause phenomenon or the afterglow effect), the human eye cannot perceive the switching between the virtual scene and the real outside scene at this time.
  • therefore, the human eye can see both the virtual scene and the real outside scene. That is, the display light L 0 leaked from the coupler can be blocked while ensuring the transmittance of the augmented reality device 100.
  • the transmittance of the augmented reality device 100 can be adjusted by adjusting the proportions of the first time period and the second time period. For example, when the first time period accounts for 20% of the whole cycle, the transmittance of the augmented reality device 100 is reduced by only 20%. That is, the leaked light L 2 of the coupler 31 is shielded while the transmittance of the augmented reality device 100 is guaranteed.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Nonlinear Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)

Abstract

An augmented reality device (100) and a display method therefor, capable of improving the user's privacy. The augmented reality device (100) includes a frame (10), a coupler (31), an active shutter lens (33), an image projector (32), and a processor (34). The coupler (31) is mounted on the frame (10) and includes an inner surface (312) and an outer surface (313) facing away from each other. The active shutter lens (33) is mounted on the outer surface (313) of the coupler (31). The image projector (32) is mounted on the frame (10). The processor (34) is coupled to the image projector (32) and the active shutter lens (33). The processor (34) is configured to turn on the image projector (32) and close the active shutter lens (33); the image projector (32) projects display light (L 0) toward the coupler (31); part of the display light (L 1) exits from the inner surface (312) of the coupler (31), part of the display light (L 2) exits from the outer surface (313) of the coupler (31), and the active shutter lens (33) blocks the display light (L 2) exiting from the outer surface (313) of the coupler (31). The processor (34) is also configured to turn off the image projector (32) and open the active shutter lens (33); after passing through the active shutter lens (33), ambient light (L c) enters the coupler (31) from the outer surface (313) of the coupler (31) and exits from the inner surface (312) of the coupler (31).

Description

Augmented reality device and display method therefor
This application claims priority to Chinese Patent Application No. 202010233012.4, filed with the Chinese Patent Office on March 28, 2020 and entitled "Augmented Reality Device and Display Method Therefor", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of displays combining the virtual and the real, and in particular to an augmented reality device and a display method therefor.
Background
The principle of augmented reality (AR) technology is to use a computer-controlled image projector to project display light carrying digital content into the human eye to form a virtual scene, and to superimpose the virtual scene on the real outside scene that the human eye can see directly, so that the human eye views image information in which the virtual scene and the real outside scene are combined.
In a conventional augmented reality device, a part of the display light projected by the image projector always exits outward from the device, so that display light carrying digital information leaks out, compromising the user's private information and reducing the user's privacy.
Summary
This application provides an augmented reality device and a display method therefor, so as to reduce the possibility of display light exiting outward from the augmented reality device, prevent display light carrying digital information from leaking out, and improve the user's privacy.
The augmented reality device shown in this application includes a frame, a coupler, an active shutter lens, an image projector, and a processor. The coupler is mounted on the frame and includes an inner surface and an outer surface facing away from each other. The active shutter lens is mounted on the outer surface of the coupler. The image projector is mounted on the frame. The processor is coupled to the image projector and the active shutter lens.
The processor is configured to turn on the image projector and close the active shutter lens, and the image projector projects display light toward the coupler. The display light is light carrying digital content. Part of the display light exits from the inner surface of the coupler, and part of the display light exits from the outer surface of the coupler. The active shutter lens blocks the display light exiting from the outer surface of the coupler, preventing it from passing through the active shutter lens into the external environment and preventing display light carrying digital content from leaking out. This not only improves the user's privacy and the sociability of the augmented reality device, but also prevents the leaked display light from forming a small display window on the surface of the augmented reality device, improving the appearance when the user uses the augmented reality device.
The processor is also configured to turn off the image projector and open the active shutter lens. After passing through the active shutter lens, ambient light enters the coupler from its outer surface and exits from its inner surface, so that the user can view the real outside scene through the coupler and the active shutter lens, ensuring that the augmented reality device has a certain transmittance.
The inner surface of the coupler is the surface of the coupler facing the user when the augmented reality device is worn on the user's head, that is, the surface of the coupler facing the human eye. Similarly, the outer surface of the coupler is the surface facing away from the user when the augmented reality device is worn on the user's head, that is, the surface facing away from the human eye and toward the outside world.
The active shutter lens is a lens that can be switched on and off rapidly under the control of the processor. When the processor opens the active shutter lens, that is, when the active shutter lens is in the open state, its transmittance is relatively high, and light can propagate through the active shutter lens normally. When the processor closes the active shutter lens, that is, when the active shutter lens is in the closed state, its transmittance is close to 0 and the active shutter lens blocks light; that is, light can hardly propagate through the active shutter lens, and the active shutter lens absorbs the light.
In one implementation, the outer surface of the coupler includes a light exit area, the display light exiting from the outer surface of the coupler exits from this light exit area, and the active shutter lens covers the light exit area of the outer surface of the coupler. When the processor turns on the image projector and closes the active shutter lens, the display light exiting from the outer surface of the coupler does not enter the external environment, preventing display light carrying digital content from leaking.
In another implementation, the active shutter lens covers the entire outer surface of the coupler, ensuring the completeness and consistency of the appearance of the augmented reality device and improving its aesthetics. In addition, compared with covering only the light exit area of the outer surface of the coupler, covering the entire outer surface not only reduces the difficulty of assembling the active shutter lens, but also requires no additional machining of the active shutter lens, reducing its processing difficulty and production cost.
一种实施方式中,主动快门镜片为液晶光阀,主动快门镜片包括液晶盒、第一偏振片和第二偏振片,液晶盒与处理器耦合,第一偏振片位于液晶盒背离结合器的一侧,第二偏振片位于液晶盒与结合器之间。即第二偏振片位于液晶盒背离第一偏振片的一侧,也即第二偏振片位于液晶盒背离结合器的一侧。第二偏振片与第一偏振片的透光轴方向之间的夹角为90度。当处理器开启主动快门镜片时,环境光线经第一偏振片过滤后,依次通过液晶盒和第二偏振片射向结合器的外表面,自结合器的内表面射入人眼,使人眼可以穿过主动快门镜片和结合器看到外界的真实环境。
其中,液晶光阀是通过电压控制液晶分子的折射率来实现对光的相位延迟的光学器件。
一种实施方式中,主动快门镜片为平面转换(in-plane switching,IPS)型液晶光阀。
当处理器开启主动快门镜片时,此时液晶光阀处于上电状态,环境光线经第一偏振片过滤后进入液晶盒,液晶盒将经第一偏振片出射的光线的相位延迟π,由于第二偏振片与第一偏振片的透光轴方向相互垂直,经液晶盒出射的光线可穿过第二偏振片射向结合器的外表面。
当处理器关闭主动快门镜片时,液晶光阀处于断电状态,环境光线经第一偏振片过滤后进入液晶盒,液晶盒不会改变经第一偏振片出射的光线的相位,由于第二偏振片与第一偏振片的透光轴方向相互垂直,经液晶盒出射的光线无法穿过第二偏振片射向结合器的外表面,因而被第二偏振片完全阻挡。
一种实施方式中,主动快门镜片为扭曲向列(twisted nematic,TN)型液晶光阀。
当处理器开启主动快门镜片时,液晶光阀处于断电状态,环境光线经第一偏振片过滤后进入液晶盒,液晶盒会将经第一偏振片出射的光线的相位延迟π,由于第二偏振片与第一偏振片的透光轴方向垂直,经液晶盒出射的光线可穿过第二偏振片射向结合器的外表面。
当处理器关闭主动快门镜片时,液晶光阀处于上电状态,液晶盒中的液晶会旋转成为垂直于第一偏振片的状态,环境光线经第一偏振片过滤后进入液晶盒,液晶盒不会改变经第一偏振片出射的光线的相位,由于第二偏振片与第一偏振片的透光轴方向垂直,经液晶盒出射的光线无法穿过第二偏振片射向结合器的外表面,因而被第二偏振片完全阻挡。
一种实施方式中,液晶光阀为垂直排列(vertical alignment,VA)型液晶光阀、超扭曲向列(super twisted nematic,STN)型液晶光阀或铁电液晶(ferroelectric liquid crystal,FLC)型液晶光阀。
一种实施方式中,增强现实设备还包括四分之一波片,四分之一波片安装于第一偏振片背离液晶光阀的表面,即四分之一波片安装于第一偏振片的外表面,且四分之一波片的快轴方向与第一偏振片的透光轴方向之间的夹角为45度。
需要了解的是,现有的电子屏幕很多为液晶显示屏(liquid crystal display,LCD),液晶显示屏的出射光为线偏振光。当用户佩戴本实施方式所示增强现实设备观察电子屏幕,且视线环绕电子屏幕旋转时,无论电子屏幕的出射光的偏振方向与第一偏振片的透光轴方向是垂直或平行,四分之一波片可以将任意偏振方向的线偏振光衰减至50%,当处理器开启主动快门镜片时,四分之一波片可以削减用户观看电子屏幕时存在的亮度差异,有助于提高用户佩戴增强现实设备观看电子屏幕时的使用感受。
一种实施方式中,增强现实设备包括两个增强现实组件,两个增强现实组件间隔安装于镜架,每一增强现实组件包括上述结合器、图像投影机和主动快门镜片,两个增强现实组件的结合器并排设置。
本实施方式所示增强现实设备中,一个增强现实组件对应于用户的左眼,另一个增强现实组件对应于用户的右眼。两个增强现实组件的结构相同,即两个增强现实组件均在保证增强现实设备的透过率的前提下,避免携带有数字内容的显示光线泄露出去。
一种实施方式中,每一增强现实组件的主动快门镜片均为液晶光阀,每一增强现实组件的主动快门镜片均包括液晶盒、第一偏振片和第二偏振片,每一增强现实组件的液晶盒均与处理器耦合,每一增强现实组件的第一偏振片位于该增强现实组件的液晶盒背离结合器的一侧,每一增强现实组件的第二偏振片位于该增强现实组件的液晶盒与结合器之间。即,每一增强现实组件的第二偏振片位于该增强现实组件的液晶盒背离第一偏振片的一侧,也即每一增强现实组件的第二偏振片位于该增强现实组件的液晶盒朝向结合器的一侧。每一增强现实组件的第一偏振片与第二偏振片的透光轴方向之间的夹角为90度。
当处理器开启主动快门镜片时,环境光线经第一偏振片过滤后,依次通过液晶盒和第二偏振片射向结合器的外表面,自结合器的内表面射入人眼,使操作者的左眼和右眼均可以观察到外界的真实环境。
一种实施方式中,增强现实设备包括两个四分之一波片,一个四分之一波片安装于一个第一偏振片的外表面,且一个四分之一波片的快轴方向与一个第一偏振片的透光轴方向之间的夹角为45度,另一个四分之一波片安装于另一个第一偏振片的外表面,且另一个四分之一波片的快轴方向与另一个第一偏振片的透光轴方向之间的夹角为45度,以在用户佩戴增强现实设备观看电子屏幕过程中,削减左眼和右眼观看到电子屏幕时存在的亮度差异,有助于提高用户佩戴增强现实设备观看电子屏幕时的使用感受。
一种实施方式中,两个第一偏振片的透光轴方向相同,两个四分之一波片的快轴方向之间的夹角为90度,或,两个第一偏振片的透光轴方向之间的夹角为90度,两个四分之一波片的快轴方向相同,以在用户佩戴增强现实设备观看电子屏幕时,使两个增强现实组件分别通过偏振方向相互垂直的偏振光,比如分别通过左旋偏振光和右旋偏振光,此时偏振方向相互垂直的两个偏振光分别进入用户的左眼和右眼中成像。当处理器开启主动快门镜片时,用户可观看到三维(three dimensions,3D)图像。即,本实施方式所示增强现实设备还可以用于3D电影放映厅,可同时兼容偏振式和主动快门式两种放映方式。
一种实施方式中,增强现实设备还包括变焦器,变焦器覆盖结合器的内表面。即,变焦器位于结合器靠近人眼的一侧,用以对用户的视力进行矫正。当用户患有近视、远视或散光等视力问题时,变焦器可以在用户观看虚拟场景或外界的真实场景时纠正用户的屈光不正,提高用户观看虚拟场景或外界的真实场景时的清晰度,提高用户使用增强现实设备的使用感受。
一种实施方式中,处理器耦合变焦器,处理器用于调整变焦器的光焦度。当用户需要使用增强现实设备时,处理器可依据用户的屈光度将变焦器的光焦度调整至与用户的视力相匹配,以提高增强现实设备的适配度,进而提高增强现实设备的使用灵活性。
一种实施方式中,增强现实设备还包括眼球追踪组件,眼球追踪组件安装于镜架,用以追踪眼球的视线,处理器耦合变焦器和眼球追踪组件;
处理器用于关闭图像投影机,并将变焦器的光焦度调节为第一光焦度,以在用户观看外界的真实场景时矫正用户的屈光不正,提高用户观察外界的真实场景时的清晰度;
处理器用于开启图像投影机,眼球追踪组件用以获取眼球观看的虚拟场景的辐辏深度,处理器根据眼球追踪组件的获取结果将变焦器的光焦度调整为第二光焦度。
具体的,眼球追踪组件用以追踪眼球的视线,并根据眼球的视线得到用户正在注视的虚拟场景的辐辏深度,处理器根据该辐辏深度改变虚拟场景的虚像距,将虚拟场景的位置调整至该辐辏深度上,不仅可以在用户观察虚拟场景时矫正用户的屈光不正,提高用户观察虚拟场景时的清晰度,还可以解决视觉辐辏调节冲突,减小用户使用增强现实设备时的不适感,提高用户的使用舒适度。
其中,第一光焦度为用户眼球的屈光度,第二光焦度为第一光焦度与用户观察到的虚像深度的倒数之和。
一种实施方式中,眼球追踪组件包括一个或多个红外发光二极管和一个或多个红外相机,红外发光二极管发射的红外光线进入用户的人眼,并经人眼的角膜反射入红外相机成像,处理器通过图像中红外光线的光斑的位置得到用户的眼球的光轴方向,对眼球的光轴方向校准后得到用户的视线方向,并根据用户的视线方向确定用户观看的虚拟场景的深度,进而将变焦器的光焦度调整至第二光焦度。
本申请所示增强现实设备的显示方法为上述任一种增强现实设备的显示方法,包括:
在第一时段,开启图像投影机并关闭主动快门镜片,图像投影机向结合器投射显示光线,部分显示光线自结合器的内表面出射,部分显示光线自结合器的外表面出射,主动快门镜片遮挡自结合器的外表面出射的显示光线,防止自结合器的外表面出射的显示光线经主动快门镜片射入外界环境,避免携带有数字内容的显示光线泄露出去,不仅可以提高使用者的隐私性和增强现实设备的社交性,还可以避免泄露出去的显示光线在增强现实设备的表面形成小的显示窗,提高使用者使用增强现实设备时的外观精美度。
在第二时段,关闭图像投影机并开启主动快门镜片,环境光线穿过主动快门镜片后,自结合器的外表面进入结合器,并自结合器的内表面出射,使用户能穿过结合器和主动快门镜片观看到外界的真实场景,以保证增强现实设备具有一定的透过率。
一种实施方式中,第一时段与第二时段交替进行,以在保证增强现实设备的透过率的前提下,避免携带有数字内容的显示光线泄露出去。
一种实施方式中,第一时段和第二时段形成一个周期,一个周期小于或等于1/60秒。
需要了解的是,人眼可感知的闪烁频率为60Hz。由于一个周期小于或等于1/60秒,即一秒至少包括60个周期,根据视觉暂留现象(又称视觉暂停现象或余晖效应),此时人眼无法感知到虚拟场景与外界的真实场景的切换,相当于人眼既能够看到虚拟场景的存在,又能看到外界的真实场景的存在。即,可以在保证增强现实设备的透过率的前提下遮挡从结合器泄露的显示光线。
附图说明
为了更清楚地说明本申请实施例或背景技术中的技术方案,下面将对本申请实施例或背景技术中所需要使用的附图进行说明。
图1是本申请实施例提供的一种增强现实设备的结构示意图;
图2是图1所示增强现实设备佩戴于用户头部的结构示意图;
图3是图2所示结构的简化结构示意图;
图4是图3所示结构中A区域在一种实施例下的放大结构示意图;
图5是图3所示结构中A区域在另一种实施例下的放大结构示意图;
图6是图5所示结构中B区域的放大结构示意图;
图7是图5所示增强现实设备在工作时图像投影机和主动快门镜片的工作状态示意图;
图8是图3所示结构中A区域在第三种实施例下的放大结构示意图;
图9是图8所示结构中C区域的放大结构示意图;
图10是图3所示结构中A区域在第四种实施例下的放大结构示意图;
图11是图3所示结构中A区域在第五种实施例下的放大结构示意图;
图12是图11所示增强现实设备在工作时图像投影机、主动快门镜片和变焦器的工作状态示意图。
具体实施方式
下面结合本申请实施例中的附图对本申请实施例进行描述。
请参阅图1,图1是本申请实施例提供的一种增强现实设备100的结构示意图。
增强现实设备100可以为AR眼镜、AR头盔、混合现实(mixed reality,MR)眼镜或MR头盔等将数字内容和现实场景结合在一起的电子产品。图1所示实施例的增强现实设备100以AR眼镜为例进行阐述。
本实施例中,增强现实设备100包括镜架10以及安装于镜架10的增强现实组件30。其中,增强现实组件30有两个,两个增强现实组件30间隔安装于镜架10。
镜架10包括镜框11以及与镜框11连接的镜腿12。其中,镜腿12有两个,两个镜腿12连接于镜框11的相对两端。需要说明的是,在其他实施例中,镜架10也可以包括镜框11和与镜框11连接的固定带,本申请对此不作具体限定。
镜框11包括两个边框13及连接于两个边框13之间的横梁14。每一边框13均包括远离横梁14的第一边框131和与第一边框131相对设置的第二边框132。第一边框131的内部设有收容腔,第一边框131的收容腔用以收容增强现实设备100的电子元器件。横梁14 与两个边框13一体成型,以简化镜框11的成型工艺,增加镜框11的整体强度。其中,镜框11的材料包括且不限于金属、塑料、树脂或天然材料等。应当理解的是,镜框11不仅限于图1所示的全框型镜框,也可以为半框型或无框型镜框。
两个镜腿12转动连接于镜框11的相对两端。具体的,两个镜腿12分别转动连接于镜框11的两个边框13。其中,两个镜腿12分别连接于两个边框13的第一边框131。在增强现实设备100处于展开状态(如图1所示)时,两个镜腿12通过相对镜框11转动至彼此相对,此时增强现实设备100的两个镜腿12可分别架设于用户的两个耳朵上,横梁14架设于用户的鼻梁上,以穿戴于用户的头部。在增强现实设备100处于折叠状态时,两个镜腿12通过相对镜框11转动,至彼此至少部分地重叠且收容于镜框11的内侧,此时增强现实设备100可收纳起来。可以理解的是,在其他实施例中,两个镜腿12可分别固定连接于两个边框13的第一边框131,或者,两个镜腿12可与镜框11一体成型,即增强现实设备100始终处于展开状态,本申请对此不作具体限定。需要说明的是,镜腿12的内部也可以设有收容腔,镜腿12的收容腔也可以收容增强现实设备100的电子元器件。
需要说明的是,本申请提及增强现实设备100时所采用“内侧”“外侧”等方位用词主要依据增强现实设备100被用户佩戴于头部时的方位进行阐述。增强现实设备100被用户佩戴时,以靠近用户头部为内侧,以远离用户头部为外侧,其并不形成对增强现实设备100于其他场景中的方位的限定。
请一并参阅图2和图3。图2是图1所示增强现实设备100佩戴于用户头部的结构示意图。图3是图2所示结构的简化结构示意图。
接下来,为了便于描述,如图2和图3所示,定义增强现实设备100的长度方向为X轴方向,增强现实设备100的宽度方向为Y轴方向,增强现实设备100的厚度方向为Z轴方向,且X方向、Y方向和Z方向彼此两两垂直。其中,X轴方向即为镜框11中一个边框13朝向另一个边框13的方向,Z轴方向即为镜框11朝向镜腿12的方向。
本实施例中,两个增强现实组件30的结构相同。具体的,两个增强现实组件30分别安装于镜框11的两个边框13。增强现实设备100穿戴于用户头部时,一个增强现实组件30对应于用户的左眼,另一个增强现实组件30对应于用户的右眼,此时用户的双眼可以通过两个增强现实组件30观看虚拟场景和真实场景。需要说明的是,在其他实施例中,两个增强现实组件30的结构也可以不同,本申请对此不作具体限定。
接下来,为了便于理解,以与用户的右眼相对应的增强现实组件30为例对增强现实组件30的结构进行具体描述。
请参阅图3和图4,图4是图3所示结构中A区域在一种实施例下的放大结构示意图。
增强现实组件30包括结合器(combiner)31、图像投影机32、主动快门镜片33及处理器34。具体的,结合器31安装于镜架10。结合器31包括相背设置的内表面312和外表面313。主动快门镜片33安装于结合器31的外表面313。图像投影机32安装于镜架10。处理器34耦合图像投影机32和主动快门镜片33,用以控制图像投影机32和主动快门镜片33的开启与关闭。
需要说明的是,在其他实施例中,两个增强现实组件30可仅包括一个处理器34,该处理器34同时耦合两个增强现实组件30的图像投影机32,用以控制两个图像投影机32的开启与关闭,本申请对此不做具体限定。
结合器31安装于镜架10的镜框11。本实施例中,两个增强现实组件30的结合器31沿X轴方向并排设置。具体的,两个增强现实组件30的结合器31间隔安装于镜框11。其中,结合器31安装于镜框11的边框13。结合器31的内表面312为结合器31朝向镜框11内侧的表面。同理,结合器31的外表面313为结合器31朝向镜框11外侧的表面。本实施例中,结合器31为采用衍射光波导技术将数字内容和现实场景结合在一起的器件。需要说明的是,在其他实施例中,结合器31也可以为采用鸟浴(bird bath)、自由曲面或反射阵列光波导等技术的器件。
具体的,结合器31包括衍射光波导314、耦入光栅315和耦出光栅316。衍射光波导314安装于边框13。衍射光波导314的一端安装于边框13的第一边框131,且收容于第一边框131的收容腔133内。衍射光波导314的另一端安装于边框13的第二边框132。衍射光波导314包括相背设置的内表面和外表面。其中,衍射光波导314的内表面为衍射光波导314朝向镜框11内侧的表面。同理,衍射光波导314的外表面为衍射光波导314朝向镜框11外侧的表面。
本实施例中,耦入光栅315和耦出光栅316均为闪耀光栅。具体的,耦入光栅315安装于衍射光波导314的外表面,且位于第一边框131的收容腔133内。耦出光栅316安装于衍射光波导314的外表面,与耦入光栅315间隔设置,且位于第一边框131和第二边框132之间。应当理解的是,耦入光栅315和耦出光栅316也可以为透射式光栅,此时耦入光栅315和耦出光栅316安装于衍射光波导314的内表面。此外,耦入光栅315和耦出光栅316也可以为全息光栅、倾斜光栅、偏振光栅、液晶光栅、全息光元件或衍射光元件,本申请对此不作具体限定。
应当理解的是,光栅是指由大量等宽等间距的平行狭缝构成的光学器件。当光线以一定角度入射到光栅表面时,光栅能对光线的振幅或相位进行空间周期性调整,因此光线会从不同于入射角度的方向射出光栅表面。后文中对光栅的说明做相同理解。
本实施例中,衍射光波导314的内表面即为结合器31的内表面312。结合器31的内表面312包括入光区域3121和出光区域3122。其中,内表面312的入光区域3121位于第一边框131的收容腔133内。具体的,内表面312的入光区域3121为耦入光栅315在内表面312的投影所覆盖的区域。即,结合器31的内表面312中与耦入光栅315正对的区域即为内表面312的入光区域3121。
内表面312的出光区域3122与入光区域3121间隔设置,且位于第一边框131和第二边框132之间。具体的,内表面312的出光区域3122为耦出光栅316在内表面312的投影所覆盖的区域。即,内表面312中与耦出光栅316正对的区域即为内表面312的出光区域3122。
结合器31的外表面313包括耦入光栅315背离衍射光波导314的表面、耦出光栅316背离衍射光波导314的表面以及衍射光波导314的外表面中未被耦入光栅315和耦出光栅316覆盖的区域。即,结合器31的外表面313包括耦入光栅315的外表面、耦出光栅316的外表面以及衍射光波导314的外表面中未被耦入光栅315和耦出光栅316覆盖的区域。 其中,结合器31的外表面313包括出光区域3131。具体的,外表面313的出光区域3131为耦出光栅316背离衍射光波导314的表面,即耦出光栅316的外表面。
本实施例中,图像投影机32位于第一边框131的收容腔133内,且与结合器31相对设置。具体的,图像投影机32位于衍射光波导314背离耦入光栅315的一侧。即,图像投影机32和耦入光栅315分别位于衍射光波导314的相对两侧。其中,图像投影机32正对内表面312的入光区域3121。可以理解的是,当耦入光栅315为透射式光栅时,图像投影机32和耦入光栅315位于衍射光波导314的同侧。需要说明的是,在其他实施例中,图像投影机32也可以位于镜腿12的收容腔(即镜腿12的内部),或者,图像投影机32也可以部分位于第一边框131的收容腔133,部分位于镜腿12的收容腔,或者,图像投影机32也可以不位于第一边框131的收容腔133或镜腿12的收容腔内,直接外露于边框13的表面,只要在增强现实设备100使用时,不遮挡用户的视线即可。
其中,图像投影机32包括且不限于硅基液晶(liquid crystal on silicon,LCOS)、数字光处理(digital light processing,DLP)、发光二极管(light emitting diode,LED)、有机发光二极管(organic light-emitting diode,OLED)、量子点发光二极管(quantum dot light emitting diodes,QLED)、主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED)、柔性发光二极管(flex light-emitting diode,FLED)、Mini LED、Micro OLED、Micro LED或激光微电子机械系统(laser micro electro mechanical systems,Laser MEMS)等光机。
处理器34开启图像投影机32时,即图像投影机32处于开启状态时,图像投影机32向结合器31投射显示光线L0,部分显示光线L0自结合器31的内表面312出射,部分显示光线L0自结合器31的外表面313出射。其中,图像投影机32投射携带有数字内容的显示光线L0,显示光线L0自内表面312的入光区域3121进入结合器31,并经内表面312的出光区域3122和外表面313的出光区域3131出射。
具体的,显示光线L0垂直射向衍射光波导314的内表面(即结合器31的内表面312),自内表面312的入光区域3121垂直射向耦入光栅315,经由耦入光栅315耦入衍射光波导314。其中,耦入光栅315已将显示光线L0的传播方向调整至满足全反射条件的状态。显示光线L0在衍射光波导314内发生至少一次全反射并朝向耦出光栅316的方向传播,直至到达耦出光栅316发生衍射。部分显示光线L0发生衍射后自内表面312的出光区域3122朝向结合器31内侧传播,即朝向人眼的方向传播,图中将该部分光线标记为入眼光线L1,入眼光线L1可进入人眼成像,以使用户能看到携带数字内容的虚拟场景。与此同时,部分显示光线L0发生衍射后自外表面313的出光区域3131朝向结合器31外侧传播,图中将该部分光线标记为泄露光线L2。可以理解的是,处理器34关闭图像投影机32时,即图像投影机32处于关闭状态时,图像投影机32不投射显示光线L0,此时既没有入眼光线L1进入人眼成像,也没有泄露光线L2传播至结合器31外侧。
主动快门镜片33位于结合器31背离图像投影机32的一侧,即主动快门镜片33和图像投影机32位于结合器31的相对两侧。本实施例中,主动快门镜片33为基于电致变色材料(除液晶以外)的镜片。应当理解的是,主动快门镜片33是一种可以在处理器34控制下快速开关的镜片。处理器34开启主动快门镜片33时,即主动快门镜片33处于开启状态 时,主动快门镜片33的透过率较高,光线可穿过主动快门镜片33正常传播。处理器34关闭主动快门镜片33时,即主动快门镜片33处于关闭状态时,主动快门镜片33的透过率接近于0,主动快门镜片33会遮挡光线,即光线无法穿过主动快门镜片33传播,也即主动快门镜片33可以吸收光线。
本实施例中,主动快门镜片33的两端可以通过密封胶分别安装于结合器31的外表面313。主动快门镜片33的中间部分与结合器31的外表面313之间存在空气间隙,以保证显示光线L 0能在衍射光波导中发生全反射。其中,空气间隙的宽度d在50μm左右。应当理解的是,由于耦入光栅315和耦出光栅316的厚度在纳米级别,主动快门镜片33并不会与耦入光栅315和耦出光栅316接触。
具体的,主动快门镜片33覆盖结合器31的外表面313,以保证增强现实设备100的外观完整性和一致性,提高增强现实设备100的外观精美度。即,主动快门镜片33覆盖耦入光栅315的外表面、耦出光栅316的外表面以及衍射光波导314的外表面中未被耦入光栅315和耦出光栅316覆盖的部分。此时,主动快门镜片33可充当保护玻璃以保护耦入光栅315和耦出光栅316。
需要说明的是,在其他实施例中,主动快门镜片33也可以仅覆盖外表面313的出光区域3131,即主动快门镜片33可以仅覆盖耦出光栅316的外表面。可以理解的是,相比于主动快门镜片33只覆盖外表面313的出光区域3131,主动快门镜片33覆盖结合器31的外表面313,不仅降低了主动快门镜片33的装配工艺难度,还不需要对主动快门镜片33进行额外加工,降低了主动快门镜片33的加工难度,降低了主动快门镜片33的生产成本。
本实施例中,处理器34位于第一边框131的收容腔133,且与图像投影机32和主动快门镜片33电连接。其中,处理器34可以包括一个或多个处理单元。多个处理单元例如可以为应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。应当理解的是,处理器34可以为增强现实设备100的中央处理器(central processing unit,CPU),也可以为增强现实设备100的其他处理器。
具体的,处理器34用以控制图像投影机32的开启与关闭,并同步控制主动快门镜片33的关闭与开启。即,处理器34开启图像投影机32的同时关闭主动快门镜片33,处理器34关闭图像投影机32的同时开启主动快门镜片33。也即,处理器34同步切换图像投影机32和主动快门镜片33的状态。
当处理器34开启图像投影机32并关闭主动快门镜片33时,即图像投影机32处于开启状态且主动快门镜片33处于关闭状态时,主动快门镜片33遮挡自结合器31的外表面313出射的显示光线L0。具体的,图像投影机32投射的显示光线L0自内表面312的入光区域3121进入结合器31后,入眼光线L1自内表面312的出光区域3122射入人眼成像,泄露光线L2自外表面313的出光区域3131射向主动快门镜片33。由于此时主动快门镜片33处于关闭状态,主动快门镜片33的透过率接近于0,主动快门镜片33将泄露光线L2遮挡,相当于主动快门镜片33吸收泄露光线L2,防止自结合器31的外表面313出射的泄露光线L2穿过主动快门镜片33射入外界环境中,避免携带有数字内容的泄露光线L2泄露出去,不仅可以提高使用者的隐私性和增强现实设备100的社交性,还可以避免泄露出去的泄露光线L2在增强现实设备100的表面形成小的显示窗,提高使用者使用增强现实设备100时的外观精美度。
当处理器34关闭图像投影机32并开启主动快门镜片33时,即图像投影机32处于关闭状态且主动快门镜片33处于开启状态时,环境光线Lc可穿过主动快门镜片33自结合器31的外表面313进入结合器31,并经结合器31的内表面312出射。其中,由于此时主动快门镜片33处于开启状态,主动快门镜片33的透过率较高,环境光线Lc可穿过主动快门镜片33进入结合器31,并自结合器31的内表面312朝向人眼的方向传播,从而进入人眼成像。即,人眼可透过主动快门镜片33和结合器31观看到外界的真实场景。此外,由于图像投影机32关闭,图像投影机32不投射携带有数字内容的显示光线L0,既无入眼光线L1射入人眼,也没有泄露光线L2从增强现实设备100中泄露出去。即,人眼只能看到外界的真实场景。
一种实施方式中,处理器34包括控制单元和存储单元。控制单元用以控制图像投影机32和主动快门镜片33的开启和关闭。存储单元用以存储预设频率f0,预设频率f0等于或大于60Hz。具体的,增强现实设备100开启时,图像投影机32和主动快门镜片33在第一时段和第二时段下分别处于不同的状态。在第一时段内,图像投影机32处于开启状态且主动快门镜片33处于关闭状态。在第二时段内,图像投影机32处于关闭状态且主动快门镜片33处于开启状态。
其中,第一时段与第二时段形成一个周期T,1/T=f0。即,T小于或等于1/60秒。也就意味着,增强现实设备100开启时,1秒至少包括60个周期,即1秒内第一时段和第二时段至少出现60次。也即,图像投影机32在开启和关闭两种状态之间的交替频率不低于120Hz。主动快门镜片33在关闭和开启两种状态之间的交替频率不低于120Hz。
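上述周期T、预设频率f0与1秒内状态切换次数之间的换算关系,可用如下简化示意代码表示(仅为说明性示例,并非本申请实现的一部分,其中的函数名与变量名均为假设):

```python
# 示意:由预设频率 f0 推算周期 T,以及 1 秒内的周期数与状态切换频率
def shutter_timing(f0_hz: float):
    """f0_hz 为预设频率(Hz,不低于 60)。
    返回:周期 T(秒)、1 秒内第一/第二时段各自出现的次数、
    投影机与快门的状态切换频率(Hz)。"""
    T = 1.0 / f0_hz              # 一个周期 = 第一时段 + 第二时段
    periods_per_second = f0_hz   # 1 秒内的周期数
    toggle_hz = 2 * f0_hz        # 每个周期内开/关各切换一次,共两次
    return T, periods_per_second, toggle_hz

T, n, toggle = shutter_timing(60.0)
# f0 = 60Hz 时,T = 1/60 秒,1 秒内出现 60 个周期,切换频率为 120Hz
```

当f0取大于60Hz的值时,周期相应缩短,切换频率相应提高。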
需要了解的是,人眼可感知的闪烁频率(又称人眼刷新频率)为60Hz。由于预设切换频率不低于人眼刷新频率,根据视觉暂留现象(又称视觉暂停现象或余晖效应),在增强现实设备100工作时,既有图像投影机32投射的显示光线L0进入人眼,又有环境光线Lc进入人眼,即,人眼既可以看到虚拟场景,又可以看到外界的真实场景。而且从图像投影机32投射的显示光线也不会从增强现实设备100中泄露出去。也就是说,本实施例所示增强现实设备100可以在保证增强现实设备100的透过率的前提下,遮挡自结合器31泄露出来的显示光线,不仅提高了增强现实设备100的隐私性和社交性,还提高了用户在使用增强现实设备100时的外观精美度。
请参阅图5,图5是图3所示结构中A区域在另一种实施例下的放大结构示意图。
本实施例所示增强现实设备与上述实施例所示增强现实设备100的不同之处在于,主动快门镜片33为液晶光阀。主动快门镜片33包括液晶盒331、第一偏振片332和第二偏振片333。液晶盒331与处理器34耦合。第一偏振片332位于液晶盒331背离结合器31的一侧,且第一偏振片332覆盖液晶盒331背离结合器31的表面。即,第一偏振片332覆盖液晶盒331的外表面。第二偏振片333位于液晶盒331和结合器31之间。即第二偏振片333位于液晶盒331背离第一偏振片332的一侧,也即第二偏振片333位于液晶盒331朝向结合器31的一侧。此外,第二偏振片333覆盖液晶盒331的内表面。即第二偏振片333覆盖于液晶盒331朝向结合器31的表面。其中,第一偏振片332与第二偏振片333的透光轴方向垂直。即,经第一偏振片332出射的光线的偏振方向与经第二偏振片333出射的光线的偏振方向相互垂直。
需要说明的是,液晶光阀是通过电压控制液晶分子的折射率来实现对光的相位延迟的光学器件。依据液晶分子的工作原理,当且仅有与液晶长轴方向相同的偏振光能穿过液晶盒331。第一偏振片332用以改变入射至液晶盒331的外表面的入射光线的偏振态,将入射光线转换成线偏振光,以便于入射光线穿过液晶盒331和第二偏振片333射向结合器31的外表面313。
请一并参阅图6,图6是图5所示结构中B区域的放大结构示意图。需要说明的是,本申请附图中,图中右侧圆圈中所示两端带箭头的直线代表光线在该界面位置的偏振状态,后文中对附图的说明做相同理解。
一种实施方式中,主动快门镜片33为IPS型液晶光阀。
当本实施方式所示增强现实设备100佩戴于用户的头部,且处理器34开启主动快门镜片33时,即主动快门镜片33处于开启状态时,此时液晶光阀处于上电状态,液晶盒331中液晶层两侧存在电压差。环境光线Lc经第一偏振片332过滤后进入液晶盒331,液晶盒331将经第一偏振片332出射的光线的相位延迟π,相当于液晶盒331将经第一偏振片332出射的光线的偏振方向旋转90度。由于第二偏振片333与第一偏振片332的透光轴方向垂直,经液晶盒331出射的光线可穿过第二偏振片333射向结合器31的外表面313。即,环境光线Lc能穿过主动快门镜片33,且自结合器31的内表面312射入人眼中成像,保证用户能观察到外界的真实场景。此时,主动快门镜片33的自然光透过率在35%~50%之间。
当处理器34关闭主动快门镜片33时,即主动快门镜片33处于关闭状态时,此时液晶光阀处于断电状态,液晶盒331中液晶层两侧的电压差为零。环境光线Lc经第一偏振片332过滤后进入液晶盒331,液晶盒331不会改变经第一偏振片332出射的光线的相位。由于第二偏振片333与第一偏振片332的透光轴方向垂直,经液晶盒331出射的光线无法穿过第二偏振片333射向结合器31的外表面313,因而被第二偏振片333完全阻挡。即,环境光线Lc无法穿过主动快门镜片33。也即,主动快门镜片33将环境光线Lc完全吸收。
另一种实施方式中,主动快门镜片33为TN型液晶光阀。
当处理器34开启主动快门镜片33时,即主动快门镜片33处于开启状态时,此时液晶光阀处于断电状态,即液晶盒331中液晶层两侧的电压差为零。环境光线经第一偏振片332过滤后进入液晶盒331,液晶盒331将经第一偏振片332出射的光线的相位延迟π,由于第二偏振片333与第一偏振片332的透光轴方向垂直,经液晶盒331出射的光线可穿过第二偏振片333射向结合器31的外表面313。
当处理器34关闭主动快门镜片33时,即主动快门镜片33处于关闭状态时,此时液晶光阀处于上电状态,即液晶盒331中液晶层两侧存在电压差,液晶层中的液晶会旋转成为垂直于第一偏振片332的状态。环境光线经第一偏振片332过滤后进入液晶盒331,液晶盒331不会改变第一偏振片332出射的光线的相位,由于第二偏振片333与第一偏振片332的透光轴方向垂直,经液晶盒331出射的光线无法穿过第二偏振片333射向结合器31的外表面313,因而被第二偏振片333完全阻挡。
应当理解的是,在其他实施方式中,主动快门镜片33也可以为VA型液晶光阀、超扭曲向列型液晶光阀或FLC型液晶光阀。
接下来,为了便于理解,对本实施例所示增强现实设备100在工作时,图像投影机32和主动快门镜片33在各个时间段的工作状态进行举例说明。其中,以0-t12时间段为例进行描述。0-t12时间段包括12个时长为Δt的时间段。即,tn-t(n-1)=Δt,n为大于等于1且小于等于12的整数。
请参阅图7,图7是图5所示增强现实设备100在工作时图像投影机32和主动快门镜片33的工作状态示意图。
本实施例中,增强现实设备100工作时,在0-t1、t2-t3、t4-t5、t6-t7、t8-t9和t10-t11时间段内,处理器34开启图像投影机32且关闭主动快门镜片33,以保证图像投影机32投射的显示光线不会从增强现实设备100中泄露出去。在t1-t2、t3-t4、t5-t6、t7-t8、t9-t10和t11-t12时间段内,处理器34关闭图像投影机32且开启主动快门镜片33,人眼可透过主动快门镜片33和结合器31看到外界的真实场景。换言之,0-t1、t2-t3、t4-t5、t6-t7、t8-t9和t10-t11时间段均为上文所提及的第一时段,t1-t2、t3-t4、t5-t6、t7-t8、t9-t10和t11-t12时间段均为上文所提及的第二时段。0-t2、t2-t4、t4-t6、t6-t8、t8-t10和t10-t12时间段均为上文所提及的一个周期T,且T=2Δt。
此时,在0-t12时间段内,图像投影机32开启状态下(即主动快门镜片33关闭状态下)的总时长为6Δt,时长占比为50%。图像投影机32关闭状态下(即主动快门镜片33开启状态下)的总时长为6Δt,时长占比为50%。也就是说,增强现实设备100的透过率在17.5%~25%之间。
可以理解的是,本实施例所示增强现实设备100中,可以通过调整图像投影机32在开启和关闭状态下的时间占比(即主动快门镜片33在关闭和开启状态下的时间占比)来调整增强现实设备100的透过率。比如,当图像投影机32在开启状态下的时长在整个周期的时长占比为20%时,即图像投影机32在关闭状态下的时长在整个周期的时长占比为80%时,增强现实设备100的透过率下降20%,即增强现实设备100的透过率在28%~40%之间,即可以在保证增强现实设备100的透过率的前提下,屏蔽结合器31的泄露光线L2。
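由占空比估算等效透过率的过程可用如下简化示意代码表示(仅为说明性示例,并非本申请实现的一部分;其中35%~50%取自上文液晶光阀开启状态下的自然光透过率范围,函数名与变量名均为假设):

```python
# 示意:由主动快门镜片开启状态的时长占比(占空比)估算增强现实设备的等效透过率
def effective_transmittance(open_ratio: float, shutter_open_t=(0.35, 0.50)):
    """open_ratio 为主动快门镜片处于开启状态的时长占比(0~1),
    shutter_open_t 为快门开启时透过率的(下限, 上限)。
    返回等效透过率的下限与上限(保留 4 位小数)。"""
    lo, hi = shutter_open_t
    return round(open_ratio * lo, 4), round(open_ratio * hi, 4)

# 开启/关闭各占 50% 时,等效透过率为 17.5%~25%
print(effective_transmittance(0.5))   # (0.175, 0.25)
# 图像投影机开启占比 20%(即快门开启占比 80%)时,等效透过率为 28%~40%
print(effective_transmittance(0.8))   # (0.28, 0.4)
```

该简化模型假设快门关闭时透过率为0,且忽略结合器自身的损耗。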
需要说明的是,在实际应用中,由于液晶光阀关闭的响应时间(毫秒级)远远长于图像投影机32开启的响应时间(微秒级或纳秒级),为了保障主动快门镜片33能有效及时地遮挡泄露光线L2,主动快门镜片33关闭的时间点应该不晚于图像投影机32开启的时间点。即,主动快门镜片33关闭的时间点应该早于图像投影机32开启的时间点,或者,主动快门镜片33关闭的时间点应该与图像投影机32开启的时间点相同。假设液晶光阀的响应时间为tr,则主动快门镜片33关闭的时间点需要相比于图像投影机32开启的时间点提前tr。
本实施例中,液晶光阀关闭的响应时间约为1ms-2ms,即主动快门镜片33可以在图像投影机32开启前1ms-2ms开始关闭,以保证主动快门镜片33能及时完全地遮挡泄露光线L2。
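考虑响应时间tr后,一个周期内的控制时序可用如下简化示意代码表示(仅为说明性示例,并非本申请实现的一部分;时间单位为毫秒,数值与函数名均为假设):

```python
# 示意:考虑液晶光阀响应时间 tr,生成一个周期内按时间排序的控制事件
def schedule_events(t_on: float, t_off: float, tr: float):
    """t_on/t_off 为图像投影机计划开启/关闭的时刻(ms),
    tr 为液晶光阀(主动快门镜片)的响应时间(ms)。
    快门关闭指令需比投影机开启提前 tr 下发。"""
    events = [
        (t_on - tr, "shutter_close"),   # 提前 tr 下发关闭指令
        (t_on, "projector_on"),
        (t_off, "projector_off"),
        (t_off, "shutter_open"),        # 投影机关闭的同时重新开启快门
    ]
    return sorted(events)

for t, e in schedule_events(t_on=2.0, t_off=10.0, tr=1.5):
    print(t, e)
# 快门关闭指令(0.5ms)先于投影机开启(2.0ms)
```

按此时序,泄露光线出现之前快门已完成关闭。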
请参阅图8,图8是图3所示结构中A区域在第三种实施例下的放大结构示意图。
本实施例所示增强现实设备与上述第二种实施例所示增强现实设备100的不同之处在于,增强现实设备100还包括四分之一波片40(又称四分之一推迟板),四分之一波片40覆盖第一偏振片332背离液晶盒331的表面,即四分之一波片40覆盖第一偏振片332的外表面。其中,四分之一波片是具有一定厚度的双折射单晶波片。当光线射入并透过四分之一波片时,发生双折射而分成寻常光和非寻常光,寻常光为遵守折射定律的光线,非寻常光为不遵守折射定律的光线,寻常光与非寻常光之间的相位差等于π/2或其奇数倍。本实施例中,四分之一波片为消色差的四分之一波片,即该波片对可见光波段光的相位延迟均为π/2,以确保环境光中的可见光能进入人眼成像。
请一并参阅图9,图9是图8所示结构中C区域的放大结构示意图。
本实施例中,四分之一波片40的快轴方向与第一偏振片332的透光轴方向之间的夹角呈45度。即,四分之一波片40的快轴设置成与能够透过第一偏振片332的线偏振光的偏振方向之间的夹角为45度。需要理解的是,由于目前生活中常用的电子屏幕很多为液晶显示屏(liquid crystal display,LCD),液晶显示屏的出射光为线偏振光。当本实施例所示增强现实设备100佩戴于用户的头部,用户透过增强现实设备100观看电子屏幕,且视线环绕电子屏幕旋转的过程中,无论电子屏幕的出射光的偏振态与第一偏振片332的透光轴方向垂直还是平行,四分之一波片40都会将电子屏幕出射的线偏振光变成圆偏振光,将电子屏幕的出射光衰减50%。当处理器开启主动快门镜片33时,第一偏振片332将圆偏振光变为线偏振光进入液晶盒331中,并经液晶盒331和结合器31射入人眼,削减用户观看电子屏幕时存在的亮度差异,有助于提高用户佩戴增强现实设备100观看电子屏幕时的使用感受。
也就是说,本实施例所示增强现实设备100在佩戴在用户头部上时,可以不需要取下增强现实设备100,只需要开启主动快门镜片33即可观看周围环境的电子屏幕,提高了增强现实设备100的使用便捷性。
本实施例中,两个增强现实组件30的主动快门镜片33均包括液晶盒331、第一偏振片332和第二偏振片333。液晶盒331与处理器34耦合,第一偏振片332覆盖于液晶盒331的外表面,第二偏振片333覆盖于液晶盒331的内表面。当处理器34将主动快门镜片33打开时,环境光线Lc经第一偏振片332过滤后,可以依次穿过液晶盒331和第二偏振片333射向结合器31的外表面313,并自结合器31的内表面312射向人眼,使用户的左眼和右眼均可以穿过主动快门镜片33和结合器31观看外界的真实环境。
具体的,四分之一波片40有两个。一个四分之一波片40覆盖一个第一偏振片332的外表面,且其快轴方向与该第一偏振片332的透光轴方向之间的夹角为45度。另一个四分之一波片40覆盖另一个第一偏振片332的外表面,且其快轴方向与该第一偏振片332的透光轴方向之间的夹角为45度。也就是说,每一四分之一波片40的快轴方向与其覆盖的第一偏振片332的透光轴方向之间的夹角为45度,以保证用户在佩戴增强现实设备100观看电子屏幕,且两个眼睛的视线环绕电子屏幕旋转时,两个眼睛所观看到的电子屏幕的亮度差异较小,提高用户佩戴增强现实设备100观看电子屏幕的舒适度。
其中,两个第一偏振片332的透光轴方向相同,两个四分之一波片40的快轴方向之间的夹角为90度,或者,两个第一偏振片332的透光轴方向之间的夹角为90度,两个四分之一波片40的快轴方向相同,以保证两个增强现实组件30分别通过偏振方向相互垂直的偏振光,比如左旋偏振光和右旋偏振光,使得增强现实设备100还可以用于三维(three dimensional,3D)影视放映厅。也就是说,本实施方式所示增强现实设备100不仅可以用于观看虚实结合的显示画面,还可以在处理器34开启主动快门镜片33时观看3D视频。即,增强现实设备100可以同时兼容偏振式和主动快门式两种放映方式。
请参阅图10,图10是图3所示结构中A区域在第四种实施例下的放大结构示意图。
本实施例所示增强现实设备与上述第一种增强现实设备100的不同之处在于,增强现实设备100还包括变焦器50,变焦器50安装于结合器31的内表面312,且覆盖于结合器31的内表面312。即,变焦器50位于结合器31靠近人眼的一侧,用以对用户的视力进行矫正。当用户患有近视、远视或散光等视力问题时,变焦器50可以在用户观看携带数字内容的虚拟场景或外界的真实场景时纠正用户的屈光不正,提高用户观看虚拟场景或外界的真实场景时的清晰度,提高用户使用增强现实设备100的使用感受。其中,变焦器50可以为液晶透镜、液体透镜、Alvarez透镜或机械变焦透镜等可以实现变焦的器件。应当理解的是,变焦器50可以为具有度数的镜片等光焦度固定的光学器件,也可以为与处理器34耦合的光焦度可调的光学器件,用户在使用增强现实设备100时可依据用户的屈光度将变焦器50的光焦度调整至与用户的视力相匹配,以提高增强现实设备100的适配度,进而提高增强现实设备100的使用灵活性。
请参阅图11,图11是图3所示结构中A区域在第五种实施例下的放大结构示意图。
本实施例所示增强现实设备100与上述第三种增强现实设备100的不同之处在于,增强现实设备100还包括眼球追踪组件60。眼球追踪组件60安装于镜架10,用以追踪眼球的视线。处理器34耦合变焦器50和眼球追踪组件60,用以调整变焦器50的光焦度。
本实施例中,眼球追踪组件60安装于镜架10的镜框11,且朝向镜框11内侧。其中,眼球追踪组件60包括一个或多个红外发光二极管(infrared light-emitting diode,IR LED)61以及一个或多个红外相机(infrared camera,IR camera)62。具体的,红外发光二极管61安装于第一边框131,且朝向镜框11内侧。红外相机62安装于第二边框132,且朝向镜框11内侧。红外发光二极管61发射红外光线,红外光线射入用户的眼球,经过用户的角膜反射后进入红外相机62后成像,处理器34通过确定图像中红外光线的光斑位置确定用户的光轴方向,再经校准后确定用户视线的方向。需要说明的是,本实施例所示眼球追踪组件60并不仅限于上文所述的眼球追踪技术,其他的眼球追踪技术均可以,本申请对此不作具体限定。
处理器34关闭图像投影机32并将变焦器50的光焦度调节为第一光焦度时,即图像投影机32处于关闭状态且变焦器50的光焦度为第一光焦度时,变焦器50可以在用户观看外界的真实场景时矫正用户的屈光不正,提高用户在观看真实场景时的清晰度,提高用户的使用感受。其中,当用户患有近视、远视或散光等视力问题时,第一光焦度为用户眼球的屈光度。
处理器34开启图像投影机32时,即图像投影机32处于开启状态时,眼球追踪组件60获取眼球观看的虚拟场景的辐辏深度,处理器34根据眼球追踪组件60的获取结果将变焦器50的光焦度调节为第二光焦度。具体的,眼球追踪组件60追踪眼球的视线,并根据用户视线的方向确定用户观察的虚拟场景的辐辏深度,处理器34根据该辐辏深度改变虚拟场景的虚像距,将虚拟场景的位置调整至该辐辏深度上。其中,第二光焦度为第一光焦度与用户观察到的虚像深度的倒数之和。此时,变焦器50不仅可以在用户观察虚拟数字内容时矫正用户的屈光不正,提高用户观看数字内容时的清晰度,提高用户的使用感受,还可以改变数字内容的虚像距,解决视觉辐辏调节冲突(vergence-accommodation conflict,VAC),减小用户使用增强现实设备100时的不适感,提高用户使用增强现实设备100的舒适度。
接下来,为了便于理解,对增强现实设备100在工作时,图像投影机32、主动快门镜片33和变焦器50在各个时间段的工作状态进行举例说明。其中,以0-t12时间段内,用户双眼具有D0(如-4.0D)的屈光不正为例进行描述。
请参阅图12,图12是图11所示增强现实设备100在工作时图像投影机32、主动快门镜片33和变焦器50的工作状态示意图。
本实施例中,增强现实设备100工作时,在0-t1、t2-t3、t4-t5、t6-t7、t8-t9和t10-t11时间段内,图像投影机32处于开启状态,主动快门镜片33处于关闭状态,处理器34根据眼球追踪组件60获得的用户视线方向确定用户观察的虚像的深度为L(如0.5m),则虚像深度的倒数ΔD为-1/L(如-2.0D),将变焦器50的光焦度调至D0+ΔD(如-6.0D)。此时,变焦器50的第二光焦度为D0+ΔD,不仅可以保证图像投影机32投射的显示光线不会从增强现实设备100中泄露出去,还可以保证用户清晰地观看到数字内容。在t1-t2、t3-t4、t5-t6、t7-t8、t9-t10和t11-t12时间段内,图像投影机32处于关闭状态,且主动快门镜片33处于开启状态,处理器34将变焦器50的光焦度调至D0。此时,变焦器50的第一光焦度为D0,以保证人眼可透过主动快门镜片33和结合器31清晰地看到外界的真实场景。
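上述两种状态下变焦器光焦度的计算可用如下简化示意代码表示(仅为说明性示例,并非本申请实现的一部分;D0、L等取值沿用文中示例,函数名均为假设):

```python
# 示意:依据用户屈光度 D0 与虚像(辐辏)深度 L 计算变焦器的两种光焦度(单位:屈光度 D)
def first_power(d0: float) -> float:
    """图像投影机关闭时:第一光焦度即用户眼球的屈光度 D0。"""
    return d0

def second_power(d0: float, depth_m: float) -> float:
    """图像投影机开启时:第二光焦度 = D0 + ΔD,
    其中 ΔD = -1/L,L 为虚像深度(米)。"""
    return d0 + (-1.0 / depth_m)

print(first_power(-4.0))          # -4.0
print(second_power(-4.0, 0.5))    # -6.0
```

即D0为-4.0D、虚像深度为0.5m时,第二光焦度为-6.0D,与文中示例一致。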
本申请还提供一种上述任一种增强现实设备100的显示方法,包括:
在第一时段,开启图像投影机32并关闭主动快门镜片33,图像投影机32向结合器31投射显示光线L0,部分显示光线L0自结合器31的内表面312出射,部分显示光线L0自结合器31的外表面313出射,主动快门镜片33遮挡自结合器31的外表面313出射的显示光线L0。具体的,处理器34开启图像投影机32并关闭主动快门镜片33,主动快门镜片33防止了自结合器31的外表面313出射的显示光线L0射入外界环境,避免了携带有数字内容的显示光线L0泄露出去,不仅可以提高使用者的隐私性和增强现实设备100的社交性,还可以避免泄露出去的显示光线L0在增强现实设备100的表面形成小的显示窗,提高使用者使用增强现实设备100时的外观精美度。
在第二时段,关闭图像投影机32并开启主动快门镜片33,环境光线Lc穿过主动快门镜片33后,自结合器31的外表面313进入结合器31,并自结合器31的内表面312出射。具体的,处理器34关闭图像投影机32并开启主动快门镜片33,用户能穿过结合器31和主动快门镜片33观看到外界的真实场景,以保证增强现实设备100具有一定的透过率。其中,第二时段的长度等于第一时段的长度。需要说明的是,在其他实施例中,第二时段的长度也可以大于或小于第一时段的长度,本申请对此不作具体限定。
本实施例中,第一时段和第二时段交替进行。其中,第一时段和第二时段形成一个周期,一个周期小于或等于1/60秒。需要了解的是,人眼可感知的闪烁频率为60Hz。由于一个周期小于或等于1/60秒,即1秒至少包括60个周期,也即1秒内第一时段和第二时段至少出现60次。此时,第一时段和第二时段的交替频率不低于120Hz,根据视觉暂留现象(又称视觉暂停现象或余晖效应),此时人眼无法感知到虚拟场景与外界的真实场景的切换,相当于人眼既能够看到虚拟场景的存在,又能看到外界的真实场景的存在。即,可以在保证增强现实设备100的透过率的前提下遮挡从结合器泄露的显示光线L0。
需要说明的是,在本实施例所示增强现实设备的显示方法中,可以通过调整第一时段和第二时段的时间占比来调整增强现实设备100的透过率。比如,当第一时段的时间占比占整个周期的20%时,增强现实设备100的透过率仅下降20%,即在保证了增强现实设备100的透过率的前提下,实现了对结合器31的泄露光线L2的屏蔽。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (13)

  1. 一种增强现实设备,其特征在于,包括镜架、结合器、主动快门镜片、图像投影机以及处理器,所述结合器安装于所述镜架,所述结合器包括相背设置的内表面和外表面,所述主动快门镜片安装于所述结合器的外表面,所述图像投影机安装于所述镜架,所述处理器耦合所述图像投影机和所述主动快门镜片;
    所述处理器用于开启所述图像投影机并关闭所述主动快门镜片,所述图像投影机向所述结合器投射显示光线,部分所述显示光线自所述结合器的内表面出射,部分所述显示光线自所述结合器的外表面出射,所述主动快门镜片遮挡自所述结合器的外表面出射的所述显示光线;
    所述处理器还用于关闭所述图像投影机并开启所述主动快门镜片,环境光线穿过所述主动快门镜片后,自所述结合器的外表面进入所述结合器,并自所述结合器的内表面出射。
  2. 根据权利要求1所述的增强现实设备,其特征在于,所述主动快门镜片覆盖所述结合器的外表面。
  3. 根据权利要求1或2所述的增强现实设备,其特征在于,所述主动快门镜片包括与所述处理器耦合的液晶盒、位于所述液晶盒背离所述结合器一侧的第一偏振片以及位于所述液晶盒与所述结合器之间的第二偏振片,所述第一偏振片与所述第二偏振片的透光轴方向之间的夹角为90度。
  4. 根据权利要求3所述的增强现实设备,其特征在于,所述增强现实设备还包括四分之一波片,所述四分之一波片安装于所述第一偏振片的外表面,且所述四分之一波片的快轴方向与所述第一偏振片的透光轴方向之间的夹角为45度。
  5. 根据权利要求1或2所述的增强现实设备,其特征在于,所述增强现实设备包括两个增强现实组件,两个所述增强现实组件间隔安装于所述镜架,每一所述增强现实组件包括所述结合器、所述图像投影机和所述主动快门镜片,两个所述增强现实组件的结合器并排设置。
  6. 根据权利要求5所述的增强现实设备,其特征在于,每一所述增强现实组件的主动快门镜片均包括与所述处理器耦合的液晶盒、位于所述液晶盒背离所述结合器一侧的第一偏振片以及位于所述液晶盒与所述结合器之间的第二偏振片,每一所述增强现实组件的所述第一偏振片与所述第二偏振片的透光轴方向之间的夹角为90度。
  7. 根据权利要求6所述的增强现实设备,其特征在于,所述增强现实设备包括两个四分之一波片,一个所述四分之一波片安装于一个所述第一偏振片的外表面,且一个所述四分之一波片的快轴方向与一个所述第一偏振片的透光轴方向之间的夹角为45度,另一个所述四分之一波片安装于另一个所述第一偏振片的外表面,且另一个所述四分之一波片的快轴方向与另一个所述第一偏振片的透光轴方向之间的夹角为45度。
  8. 根据权利要求7所述的增强现实设备,其特征在于,两个所述第一偏振片的透光轴方向相同,两个所述四分之一波片的快轴方向之间的夹角为90度,或,两个所述第一偏振片的透光轴方向之间的夹角为90度,两个所述四分之一波片的快轴方向相同。
  9. 根据权利要求1-8中任一项所述的增强现实设备,其特征在于,所述增强现实设备还包括变焦器,所述变焦器安装于所述结合器的内表面。
  10. 根据权利要求9所述的增强现实设备,其特征在于,所述增强现实设备还包括眼球追踪组件,所述眼球追踪组件安装于所述镜架,所述处理器还耦合所述变焦器和所述眼球追踪组件;
    所述处理器用于关闭所述图像投影机,并将所述变焦器的光焦度调节为第一光焦度;
    所述处理器用于开启所述图像投影机,所述眼球追踪组件用以获取眼球观看的虚拟场景的辐辏深度,所述处理器根据所述眼球追踪组件的获取结果将所述变焦器的光焦度调整为第二光焦度。
  11. 一种增强现实设备的显示方法,其特征在于,所述增强现实设备为如权利要求1-10中任一项所述的增强现实设备,所述增强现实设备的显示方法包括:
    在第一时段,开启所述图像投影机并关闭所述主动快门镜片,所述图像投影机向所述结合器投射显示光线,部分所述显示光线自所述结合器的内表面出射,部分所述显示光线自所述结合器的外表面出射,所述主动快门镜片遮挡自所述结合器的外表面出射的所述显示光线;
    在第二时段,关闭所述图像投影机并开启所述主动快门镜片,环境光线穿过所述主动快门镜片后,自所述结合器的外表面进入所述结合器,并自所述结合器的内表面出射。
  12. 根据权利要求11所述的增强现实设备的显示方法,其特征在于,所述第一时段与所述第二时段交替进行。
  13. 根据权利要求12所述的增强现实设备的显示方法,其特征在于,所述第一时段与所述第二时段形成一个周期,所述周期小于或等于1/60秒。
PCT/CN2021/081545 2020-03-28 2021-03-18 增强现实设备及其显示方法 WO2021197082A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21781311.2A EP4109166A4 (en) 2020-03-28 2021-03-18 AUGMENTED REALITY APPARATUS AND DISPLAY METHOD THEREOF
US17/915,401 US11914155B2 (en) 2020-03-28 2021-03-18 Augmented reality device and display method thereof
IL296491A IL296491A (en) 2020-03-28 2021-03-18 Augmented reality display device and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010233012.4A CN113448089B (zh) 2020-03-28 2020-03-28 增强现实设备及其显示方法
CN202010233012.4 2020-03-28

Publications (1)

Publication Number Publication Date
WO2021197082A1 true WO2021197082A1 (zh) 2021-10-07

Family

ID=77808243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/081545 WO2021197082A1 (zh) 2020-03-28 2021-03-18 增强现实设备及其显示方法

Country Status (5)

Country Link
US (1) US11914155B2 (zh)
EP (1) EP4109166A4 (zh)
CN (1) CN113448089B (zh)
IL (1) IL296491A (zh)
WO (1) WO2021197082A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115032799A (zh) * 2022-08-10 2022-09-09 歌尔股份有限公司 一种增强现实眼镜
WO2023136861A1 (en) * 2022-01-11 2023-07-20 Google Llc Multiple incoupler waveguide and method

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113448088A (zh) * 2020-03-28 2021-09-28 华为技术有限公司 增强现实设备
EP4154050A4 (en) * 2020-05-22 2024-06-05 Magic Leap, Inc. AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH CORRELATED OPTICAL REGIONS
CN114967146B (zh) * 2022-05-17 2024-07-23 深圳惠牛科技有限公司 增强现实结构和ar眼镜
CN117631282A (zh) * 2022-08-18 2024-03-01 北京字跳网络技术有限公司 Ar头戴式显示设备、控制方法及控制装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9116337B1 (en) * 2012-03-21 2015-08-25 Google Inc. Increasing effective eyebox size of an HMD
CN106990532A (zh) * 2017-03-29 2017-07-28 张卓鹏 一种具有遮挡效应的增强现实显示系统及显示方法
US20190056591A1 (en) * 2017-08-18 2019-02-21 Microsoft Technology Licensing, Llc Optical waveguide with multiple antireflective coatings
CN110058412A (zh) * 2019-04-23 2019-07-26 深圳惠牛科技有限公司 一种传输解耦的大视场光波导镜片
CN110221439A (zh) * 2019-07-11 2019-09-10 Oppo广东移动通信有限公司 增强现实设备及增强现实调节方法
CN110618530A (zh) * 2019-05-22 2019-12-27 上海猫虎网络科技有限公司 可动态调节透明度的新型ar眼镜

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10301055A (ja) * 1997-04-25 1998-11-13 Sony Corp 画像表示装置
JP3683575B2 (ja) * 2003-10-28 2005-08-17 オリンパス株式会社 頭部装着型ディスプレイコントローラー
US8599247B2 (en) * 2008-01-30 2013-12-03 Samsung Electronics Co., Ltd. Stereoscopic image system employing an electronic controller which controls the polarization plane rotator in synchronization with an output image of the display device
US9134534B2 (en) * 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US8941559B2 (en) * 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
JP5833847B2 (ja) * 2011-07-04 2015-12-16 日東電工株式会社 立体画像表示システム
BR102014024863A2 (pt) * 2014-10-06 2016-05-24 João Sylvio Junior Zanetti sistema para exibição de imagem 3d completa e sem cintilação e imagens 2d simultâneas com a utilização de óculos ativo
CN104485427B (zh) * 2014-12-26 2018-02-13 北京维信诺科技有限公司 一种透明有机电致发光装置及其制备方法
JP6892213B2 (ja) * 2015-04-30 2021-06-23 ソニーグループ株式会社 表示装置及び表示装置の初期設定方法
US11828942B2 (en) 2018-03-12 2023-11-28 Magic Leap, Inc. Tilting array based display
CN109387942B (zh) * 2018-03-28 2024-05-10 深圳惠牛科技有限公司 一种光学系统及增强现实设备
US20200018962A1 (en) 2018-07-11 2020-01-16 Facebook Technologies, Llc Adaptive lenses for near-eye displays
US10319154B1 (en) * 2018-07-20 2019-06-11 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for dynamic vision correction for in-focus viewing of real and virtual objects

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9116337B1 (en) * 2012-03-21 2015-08-25 Google Inc. Increasing effective eyebox size of an HMD
CN106990532A (zh) * 2017-03-29 2017-07-28 张卓鹏 一种具有遮挡效应的增强现实显示系统及显示方法
US20190056591A1 (en) * 2017-08-18 2019-02-21 Microsoft Technology Licensing, Llc Optical waveguide with multiple antireflective coatings
CN110058412A (zh) * 2019-04-23 2019-07-26 深圳惠牛科技有限公司 一种传输解耦的大视场光波导镜片
CN110618530A (zh) * 2019-05-22 2019-12-27 上海猫虎网络科技有限公司 可动态调节透明度的新型ar眼镜
CN110221439A (zh) * 2019-07-11 2019-09-10 Oppo广东移动通信有限公司 增强现实设备及增强现实调节方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4109166A4


Also Published As

Publication number Publication date
EP4109166A1 (en) 2022-12-28
CN113448089A (zh) 2021-09-28
US20230129018A1 (en) 2023-04-27
EP4109166A4 (en) 2023-11-29
US11914155B2 (en) 2024-02-27
IL296491A (en) 2022-11-01
CN113448089B (zh) 2023-05-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21781311

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021781311

Country of ref document: EP

Effective date: 20220919

NENP Non-entry into the national phase

Ref country code: DE