WO2019144634A1 - Augmented reality device, augmented reality system and information prompting method thereof - Google Patents

Augmented reality device, augmented reality system and information prompting method thereof

Info

Publication number
WO2019144634A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
polarization state
infrared light
reflected
infrared
Prior art date
Application number
PCT/CN2018/107323
Other languages
English (en)
French (fr)
Inventor
Hong Tao (洪涛)
Original Assignee
BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co., Ltd.
Priority to EP18857419.8A (EP3748416B1)
Priority to US16/337,536 (US11460699B2)
Publication of WO2019144634A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B27/017: Head mounted
    • G02B27/0172: Head mounted characterised by optical features
    • G02B27/10: Beam splitting or combining systems
    • G02B27/28: Optical systems or apparatus for polarising
    • G02B27/283: Optical systems or apparatus for polarising used for beam splitting or combining
    • G02B27/30: Collimators
    • G02B5/00: Optical elements other than lenses
    • G02B5/30: Polarising elements
    • G02B5/3083: Birefringent or phase retarding elements
    • G02B2027/0132: Head-up displays comprising binocular systems
    • G02B2027/0134: Head-up displays comprising binocular systems of stereoscopic type
    • G02B2027/0141: Head-up displays characterised by the informative content of the display
    • G02B2027/0178: Head mounted, eyeglass type

Definitions

  • Embodiments of the present disclosure relate to an augmented reality device, an augmented reality system, and an information prompting method thereof.
  • Augmented reality superimposes virtual information on real scenes, integrating the two with the aid of sensing and display devices to finally present the viewer with a new environment with realistic sensory effects.
  • At least one embodiment of the present disclosure provides an augmented reality device, including a microdisplay and an augmented reality component.
  • the microdisplay is configured to emit light that carries display content, the light comprising a first light portion in a first polarization state and a second light portion in a second polarization state; and the augmented reality element is configured to convert the first light portion from the first polarization state to the second polarization state, convert the second light portion from the second polarization state to the first polarization state, and couple out the first light portion in the second polarization state and the second light portion in the first polarization state; wherein the first polarization state is orthogonal to the second polarization state.
  • an optical waveguide element is further included and is configured to receive and transmit the first light portion and the second light portion coupled out by the augmented reality element;
  • the augmented reality element comprises a polarization beam splitter, a first concave mirror, a second concave mirror, a first phase retarder, and a second phase retarder.
  • the polarization beam splitter is configured to receive the light from the microdisplay, transmit the first light portion in the first polarization state, and reflect the second light portion in the second polarization state; the first light portion transmitted by the polarization beam splitter passes through the first phase retarder, is reflected by the first concave mirror, and passes through the first phase retarder again, whereby
  • the first light portion is converted from the first polarization state to the second polarization state by the joint action of the first phase retarder and the first concave mirror;
  • the second light portion reflected by the polarization beam splitter passes through the second phase retarder, is reflected by the second concave mirror, and passes through the second phase retarder again, whereby
  • the second light portion is converted from the second polarization state to the first polarization state by the joint action of the second phase retarder and the second concave mirror; and the polarization beam splitter is further configured to reflect the first light portion in the second polarization state and the second light portion in the first polarization state so that both are coupled out.
  • an augmented reality device further includes an infrared emitter and an infrared detector.
  • the infrared emitter is configured to emit infrared light to the augmented reality element;
  • the augmented reality element is configured to couple the infrared light to an entrance face of the optical waveguide element;
  • the optical waveguide element is configured to transmit the infrared light to the eye through a transflective surface array;
  • the optical waveguide element and the augmented reality element are further configured to transmit the infrared light reflected by the eye, along a path opposite to that of the incident infrared light, to the infrared detector; and the infrared detector is configured to detect the reflected infrared light.
  • the infrared light emitted by the infrared emitter includes a first infrared light portion in the first polarization state and a second infrared light portion in the second polarization state.
  • the polarization beam splitter further configured to transmit the first infrared light portion in the first polarization state and to reflect the second infrared light portion in the second polarization state;
  • the first infrared light portion transmitted by the polarization beam splitter passes through the first phase retarder, is reflected by the first concave mirror, and passes through the first phase retarder again, whereby the first infrared light portion is converted from the first polarization state to the second polarization state by the joint action of the first phase retarder and the first concave mirror;
  • the second infrared light portion reflected by the polarization beam splitter passes through the second phase retarder, is reflected by the second concave mirror, and passes through the second phase retarder again, whereby the second infrared light portion is converted from the second polarization state to the first polarization state by the joint action of the second phase retarder and the second concave mirror;
  • the polarization beam splitter is further configured to reflect the first infrared light portion in the second polarization state and the second infrared light portion in the first polarization state.
  • the optical waveguide element is further configured to transmit the reflected infrared light reflected by the eye to the polarization beam splitter via the transflective surface array.
  • the reflected infrared light includes a first reflected infrared light portion in the second polarization state and a second reflected infrared light portion in the first polarization state;
  • the polarization beam splitter is further configured to reflect the first reflected infrared light portion in the second polarization state and to transmit the second reflected infrared light portion in the first polarization state; the first reflected infrared light portion reflected by the polarization beam splitter
  • passes through the first phase retarder, is reflected by the first concave mirror, and passes through the first phase retarder again, whereby the first reflected infrared light portion is converted from the second polarization state to the first polarization state by the joint action of the first phase retarder and the first concave mirror;
  • the first phase retarder and the first concave mirror are both disposed on a first side of the augmented reality element; the second phase retarder and the second concave mirror are disposed at a first end of a third side of the augmented reality element, the first end of the third side being adjacent to the first side; and an incident surface of the optical waveguide element is disposed at a first end of a fourth side of the augmented reality element, the first end of the fourth side being adjacent to the first side and opposite to the first end of the third side.
  • the microdisplay, the infrared emitter, and the infrared detector are disposed on a second side of the augmented reality element opposite to the first side.
  • the augmented reality component further includes a first infrared beam splitter.
  • the microdisplay is disposed on a second side of the augmented reality element opposite to the first side; the infrared emitter and the infrared detector are both disposed on a third side of the augmented reality element a second end, the second end of the third side is adjacent to the second side; or, the infrared emitter and the infrared detector are both disposed at a second end of the fourth side of the augmented reality element, The second end of the fourth side is adjacent to the second side; and the first infrared beam splitter is configured to reflect the infrared light from the infrared emitter to the polarization beam splitter, and The light emitted by the microdisplay is transmitted to the polarization beam splitter.
  • the augmented reality component further includes a second infrared beam splitter.
  • the infrared emitter and the infrared detector are both disposed on a second side of the augmented reality element opposite to the first side;
  • the microdisplay is disposed on a third side of the augmented reality element a second end, the second end of the third side is adjacent to the second side; or, the microdisplay is disposed at a second end of the fourth side of the augmented reality element, and the second end of the fourth side Adjacent to the second side and opposite the second end of the third side;
  • the second infrared beamsplitter is configured to transmit the infrared light from the infrared emitter to the polarization beam splitter, And reflecting the light emitted by the microdisplay to the polarization beam splitter.
  • the transflective surface array is disposed in the optical waveguide element and includes a plurality of half mirrors arranged in an array; and the transflective surface array is configured to: transmit the light and the infrared light incident on the array from the incident surface of the optical waveguide element to the eye; and transmit the reflected infrared light reflected by the eye back to the incident surface of the optical waveguide element so that the reflected infrared light enters the augmented reality element.
  • the first polarization state is a p-polarization state;
  • the second polarization state is an s-polarization state; and
  • the first phase retarder and the second phase retarder are both quarter-wave phase retarders.
  • At least one embodiment of the present disclosure also provides an augmented reality system including an augmented reality device and controller as provided by embodiments of the present disclosure.
  • the controller is configured to: determine, according to the intensity of the reflected infrared light detected by the infrared detector, whether the user associated with the augmented reality device is in a fatigue state; generate a control signal when determining that the user is in a fatigue state; and provide prompt information according to the control signal.
  • the controller is configured to determine whether the user is in a fatigue state according to an eyelid closing time or a closing frequency of the user.
  • an augmented reality system provided by an embodiment of the present disclosure further includes a sensor.
  • the sensor is configured to acquire physiological parameters of the user, the sensor comprising at least one of a blood pressure sensor and a pulse sensor.
  • the controller is further configured to: determine, according to the physiological parameters, whether the user is in an abnormal state; generate the control signal when determining that the user is in an abnormal state; and provide the prompt information according to the control signal.
  • an augmented reality system provided by an embodiment of the present disclosure further includes an electrical pulse generator and a surface electrode, the electrical pulse generator configured to generate an electrical pulse signal in response to the control signal, and to generate the electrical pulse signal Transfer to the surface electrode.
  • an augmented reality system provided by an embodiment of the present disclosure further includes a positioning device configured to acquire location information of the user.
  • an augmented reality system provided by an embodiment of the present disclosure further includes a voice generating device.
  • the voice generating device is configured to play the prompt information in response to the control signal.
  • an augmented reality system provided by an embodiment of the present disclosure further includes an image rendering device.
  • the image rendering device is configured to render an image corresponding to the prompt information in response to the control signal, the microdisplay being configured to emit the light ray comprising the prompt information.
  • an augmented reality system provided by an embodiment of the present disclosure further includes a communication device configured to communicate with a preset contact in response to the control signal.
  • At least one embodiment of the present disclosure further provides an information prompting method for an augmented reality system, comprising: emitting infrared light to an eye; and determining, according to the intensity of reflected infrared light returned by the eye, whether a user associated with the augmented reality system is In a fatigue state; generating a control signal when determining that the user is in a fatigue state; and providing prompt information according to the control signal.
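The prompting method above (emit infrared light, judge fatigue from the reflected intensity, then generate a control signal) can be sketched as a thresholding loop over the detector signal. The following is a minimal illustrative sketch, not the patent's implementation: the closure threshold, sample rate, and fatigue limits are assumed example values, and all function names are hypothetical.

```python
# Illustrative sketch: reflected-infrared intensity drops when the eyelid
# covers the cornea, so closure duration and blink frequency can be
# estimated by thresholding the detector samples.
# All numeric thresholds below are assumed example values.

def eyelid_closed(intensity, threshold=0.4):
    """A closed eyelid reflects less infrared light back to the detector."""
    return intensity < threshold

def is_fatigued(samples, sample_rate_hz=100,
                max_closure_s=0.5, max_blinks_per_min=30):
    """Flag fatigue if any closure lasts too long or blinks are too frequent."""
    closures = []          # length (in samples) of each closure episode
    run = 0
    for s in samples:
        if eyelid_closed(s):
            run += 1
        elif run:
            closures.append(run)
            run = 0
    if run:
        closures.append(run)
    longest_s = max(closures, default=0) / sample_rate_hz
    minutes = len(samples) / sample_rate_hz / 60
    blink_rate = len(closures) / minutes if minutes else 0
    return longest_s > max_closure_s or blink_rate > max_blinks_per_min

def on_fatigue(samples):
    """Generate a control signal and provide prompt information."""
    if is_fatigued(samples):
        return {"control_signal": True, "prompt": "Fatigue detected: please rest."}
    return {"control_signal": False, "prompt": None}
```

A 1-second closure at 100 Hz (100 consecutive low samples) exceeds the example 0.5 s limit and triggers the control signal, while a normal 0.1 s blink does not.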
  • FIG. 1 is a schematic diagram of an exemplary augmented reality device
  • FIG. 2 is a schematic diagram of an augmented reality device according to an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of an augmented reality device according to another embodiment of the present disclosure.
  • FIG. 4 is a second schematic diagram of an augmented reality device according to another embodiment of the present disclosure.
  • Figure 5 is a side elevational view of the second side of the augmented reality element of Figure 4.
  • FIG. 6 is a schematic diagram of an augmented reality device according to still another embodiment of the present disclosure.
  • Figure 7 is a schematic diagram showing a binocular form corresponding to the augmented reality device of Figure 6;
  • FIG. 8 is a schematic diagram of an augmented reality device according to still another embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of an augmented reality device according to still another embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of an augmented reality device according to still another embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram of an augmented reality system according to an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram of an augmented reality system according to another embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram of glasses including an augmented reality system according to an embodiment of the present disclosure.
  • FIG. 14 is a schematic diagram of an augmented reality system according to still another embodiment of the present disclosure.
  • FIG. 15 is a schematic diagram of an augmented reality system according to still another embodiment of the present disclosure.
  • FIG. 16 is a flowchart of an information prompting method of an augmented reality system according to an embodiment of the present disclosure.
  • because the geometric optical waveguide augmented reality device has the advantages of light weight, small size, and thinness (the thickness can be less than 2 mm), it has attracted extensive attention.
  • the geometric optical waveguide uses a reflective surface and a transflective surface array inside a planar waveguide to couple light in and out.
  • FIG. 1 illustrates an augmented reality device that is mainly composed of an optical waveguide component, a projection system, and a microdisplay.
  • the image displayed by the microdisplay is enlarged and projected into the optical waveguide component through the projection system, and is incident on the reflective surface of the optical waveguide component.
  • after being reflected by the reflective surface, the light satisfies the total-internal-reflection condition, thereby coupling the image signal into the waveguide.
  • the image signal coupled into the optical waveguide component propagates by total internal reflection and finally reaches the transflective surface array. Part of the light is reflected by a half mirror; it then no longer satisfies the total-internal-reflection condition and is coupled out. The other part of the light passes through the half mirror and continues to propagate by total internal reflection until it reaches the transflective surface array again, where it is split once more. After several passes through the array, the light reaches the eye (e.g., the human eye), completing the transmission of the image signal.
  • light from the real scene can directly enter the eye through the optical waveguide component, and the display light path and the transmitted light path are superimposed at the eye position to realize a transmissive near-eye display; that is, virtual information and the real scene can be integrated and presented to the observer simultaneously.
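The waveguide transmission described above rests on the total-internal-reflection condition from Snell's law: a ray striking the waveguide wall at an internal angle (from the surface normal) greater than the critical angle θc, where sin θc = n_outside / n_waveguide, stays confined. A quick numerical check, assuming an illustrative refractive index of n = 1.5 (the patent does not specify a material):

```python
import math

def critical_angle_deg(n_waveguide, n_outside=1.0):
    """Smallest internal incidence angle (from the surface normal) at which
    total internal reflection occurs: sin(theta_c) = n_outside / n_waveguide."""
    return math.degrees(math.asin(n_outside / n_waveguide))

def totally_reflected(angle_deg, n_waveguide, n_outside=1.0):
    """True if a ray hitting the waveguide wall at angle_deg stays confined."""
    return angle_deg > critical_angle_deg(n_waveguide, n_outside)

# For n = 1.5 glass in air, theta_c is about 41.8 degrees; rays incident
# more steeply than that propagate along the guide until a half mirror
# breaks the condition and couples them out.
```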
  • the light emitted by the microdisplay may include light in different polarization states; when this light passes through the projection system, the polarization beam splitter in the projection system transmits (or reflects) light of only one polarization state, so the light reaching the incident surface of the optical waveguide component loses the other polarization states. The augmented reality device therefore has low light-passing efficiency and cannot achieve a high-brightness augmented reality display. Moreover, the augmented reality device shown in FIG. 1 requires a projection system and has a complicated structure.
  • At least one embodiment of the present disclosure provides an augmented reality device, including a microdisplay and an augmented reality component.
  • the microdisplay is configured to emit light that carries display content, the light comprising a first portion of light in a first polarization state and a second portion of light in a second polarization state.
  • the augmented reality component is configured to convert the first light portion from the first polarization state to the second polarization state, convert the second light portion from the second polarization state to the first polarization state, and couple out the first light portion in the second polarization state together with the second light portion in the first polarization state.
  • At least one embodiment of the present disclosure also provides an augmented reality system corresponding to the above-described augmented reality device and an information presenting method thereof.
  • the augmented reality device, augmented reality system, and information prompting method provided by embodiments of the present disclosure allow the augmented reality component to use the light emitted by the microdisplay with maximum efficiency, improving the light-passing efficiency and thus the display brightness of the augmented reality device.
  • the augmented reality device can also be used to detect the degree of fatigue of the user and provide prompt information. For example, when it is judged that the user is in a fatigue state, certain measures can be taken to keep the user awake, thereby improving safety. At the same time, the augmented reality device does not need to set a complicated projection system, and has a simple structure.
  • One embodiment of the present disclosure provides an augmented reality device, as shown in FIG. 2, comprising a microdisplay 11 and an augmented reality component 12.
  • the microdisplay 11 is configured to emit light that carries display content, the light comprising a first ray portion DP1 in a first polarization state and a second ray portion DS2 in a second polarization state.
  • the augmented reality element 12 is configured to convert the first light portion DP1 from the first polarization state to the second polarization state, convert the second light portion DS2 from the second polarization state to the first polarization state, and couple out the first light portion DS1 in the second polarization state and the second light portion DP2 in the first polarization state, for example, into the optical waveguide element 13.
  • the first ray portion DS1 in the second polarization state and the second ray portion DP2 in the first polarization state may be coupled to the optical waveguide element 13 through the incident surface 130 of the optical waveguide element 13.
  • the incident surface 130 of the optical waveguide element 13 is a surface on which the optical waveguide element 13 is connected to the augmented reality element 12.
  • the first light portion in the first polarization state is identified as "DP1";
  • the first light portion in the second polarization state is identified as "DS1";
  • the second light portion in the second polarization state is identified as "DS2"; and
  • the second light portion in the first polarization state is identified as "DP2".
  • the light carrying the display content emitted from the microdisplay 11 is, for example, light of a nonlinear polarization state, which may be a circular polarization state, an elliptical polarization state or a partial polarization state.
  • the first polarization state is, for example, a p-polarization state
  • the second polarization state is, for example, an s-polarization state
  • any polarization state of light can be decomposed into linearly polarized light in the p-polarization state and linearly polarized light in the s-polarization state.
  • The following embodiments follow the same convention, which will not be described again.
  • the augmented reality element converts the first light portion in the first polarization state and the second light portion in the second polarization state, among the light emitted by the microdisplay, into a first light portion in the second polarization state and a second light portion in the first polarization state, and couples them out to, for example, an optical waveguide element.
  • the utilization of the light emitted by the microdisplay can be improved, and the light-passing efficiency of the augmented reality device can be improved, so that the brightness of the display of the augmented reality device can be improved.
  • the augmented reality device further includes an optical waveguide element 13 configured to receive and transmit the first light portion and the second light portion coupled out by the augmented reality element 12;
  • for example, the first light portion is in the second polarization state and the second light portion is in the first polarization state.
  • the augmented reality element 12 includes a polarization beam splitter 120, a first concave mirror 121, a second concave mirror 122, a first phase retarder 123, and a second phase retarder 124.
  • the first phase retarder 123 and the second phase retarder 124 are both quarter-wave phase retarders.
  • the polarization beam splitter 120 is configured to receive light from the microdisplay 11, transmit a first ray portion DP1 in a first polarization state, and reflect a second ray portion DS2 in a second polarization state.
  • the first light portion DP1 in the first polarization state transmitted by the polarization beam splitter 120 passes through the first phase retarder 123, is reflected by the first concave mirror 121, and passes through the first phase retarder 123 again.
  • the first light portion DP1 in the first polarization state becomes left-handed (or right-handed) circularly polarized light after passing through the first phase retarder 123, becomes right-handed (or left-handed) circularly polarized light after being reflected by the first concave mirror 121, and becomes linearly polarized light in the second polarization state after passing through the first phase retarder 123 again; that is, under the joint action of the first phase retarder 123 and the first concave mirror 121, the first light portion is converted from the first polarization state to the second polarization state (i.e., the first light portion DP1 in the first polarization state is converted into the first light portion DS1 in the second polarization state).
  • the second light portion DS2 in the second polarization state reflected by the polarization beam splitter 120 passes through the second phase retarder 124, is reflected by the second concave mirror 122, and passes through the second phase retarder 124 again.
  • the second light portion DS2 becomes left-handed (or right-handed) circularly polarized light after passing through the second phase retarder 124, becomes right-handed (or left-handed) circularly polarized light after being reflected by the second concave mirror 122, and becomes linearly polarized light in the first polarization state after passing through the second phase retarder 124 again; that is, under the joint action of the second phase retarder 124 and the second concave mirror 122, the second light portion is converted from the second polarization state to the first polarization state (i.e., the second light portion DS2 in the second polarization state is converted into the second light portion DP2 in the first polarization state).
  • the polarization beam splitter 120 is further configured to reflect the first light portion DS1 in the second polarization state and the second light portion DP2 in the first polarization state, so that both light portions are coupled to the incident surface 130 of the optical waveguide element 13 and enter the optical waveguide element 13 through the incident surface 130.
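The round trip through a quarter-wave retarder described above is equivalent to a single half-wave retardance, which, with the fast axis at 45° to the p direction, swaps the p and s states. This can be checked with Jones calculus. The sketch below folds the concave mirror's reflection into the "double pass equals half-wave" simplification, and the 45° axis orientation is an assumption for illustration; the patent does not state the retarder's axis angle.

```python
import numpy as np

def rot(theta):
    """Rotation of the 2-D Jones basis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

def waveplate(retardance, axis_angle):
    """Jones matrix of a retarder with the given phase retardance (radians)
    and its fast axis at axis_angle from the p direction."""
    return rot(-axis_angle) @ np.diag([1, np.exp(1j * retardance)]) @ rot(axis_angle)

p = np.array([1.0, 0.0])  # p-polarized light (transmitted by the beam splitter)
s = np.array([0.0, 1.0])  # s-polarized light (reflected by the beam splitter)

# Out to the concave mirror and back: two passes through the quarter-wave
# retarder add up to one half-wave retardance; with a 45-degree fast axis
# this swaps the p and s polarization states, as the text describes.
double_pass = waveplate(np.pi, np.pi / 4)  # 2 x (pi/2) retardance

out_p = double_pass @ p  # np.abs(out_p) ≈ [0, 1]: all power now s-polarized
out_s = double_pass @ s  # np.abs(out_s) ≈ [1, 0]: all power now p-polarized
```

The halfway states are circularly polarized, matching the left-handed/right-handed description in the text; the mirror reflection is what flips the handedness between the two passes.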
  • the light coupled into the optical waveguide element 13 propagates by total internal reflection and finally reaches the transflective surface array 16; for example, the transflective surface array 16 is disposed in the optical waveguide element 13 and includes a plurality of half mirrors 161 arranged in an array.
  • the transflective surface array 16 is configured to transmit the light incident on it from the incident surface 130 of the optical waveguide element 13 to the eye 100.
  • after part of the light is reflected by a half mirror 161, the total-internal-reflection condition is no longer satisfied, and that light is coupled out.
  • the other part of the light passes through the half mirror 161 and continues to propagate by total internal reflection in the optical waveguide element 13 until it reaches a half mirror 161 again, where it is split once more; after several such passes through the transflective surface array 16, the light delivered to the eye 100 completes the transmission of the display content.
  • the eye 100 may be, for example, a human eye, but the present disclosure is not limited thereto: any eye that can receive the light coupled out of the optical waveguide element 13 may serve as the eye 100.
  • the eye 100 may also be the eye of an animal; the following embodiments take the eye 100 as a human eye as an example.
  • the microdisplay 11 is disposed at the focus of the first concave mirror 121.
  • for example, if the radius of curvature of the first concave mirror 121 is R,
  • the distance between the microdisplay 11 and the first concave mirror 121 is the focal length f.
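The text states only that the microdisplay sits at the focal distance f from the first concave mirror 121, so light from each display pixel is collimated after reflection before entering the waveguide. For a spherical concave mirror, the paraxial relation f = R/2 ties this distance to the radius R mentioned above. A minimal sketch (the numeric radius is purely illustrative, not from the patent):

```python
def concave_mirror_focal_length(radius_of_curvature):
    """Paraxial focal length of a spherical concave mirror: f = R / 2."""
    return radius_of_curvature / 2

# Placing the microdisplay at this focus collimates the reflected light,
# which is what lets the waveguide relay the image to the eye by total
# internal reflection. Example: R is not given in the patent; 50 mm is
# an illustrative value only.
R_mm = 50.0
f_mm = concave_mirror_focal_length(R_mm)  # 25.0 mm display-to-mirror distance
```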
  • the augmented reality device may further include an infrared ray emitter 14 and an infrared ray detector 15.
  • the infrared emitter 14 is configured to emit infrared light to the augmented reality element 12.
  • the augmented reality element 12 is configured to couple infrared light to the incident face 130 of the optical waveguide element 13.
  • the optical waveguide component 13 is configured to transmit infrared light to the eye 100 through the semi-transflective surface array 16, the eye 100 being, for example, the human eye.
  • the optical waveguide element 13 and the augmented reality element 12 are also configured to transmit reflected infrared light reflected by the eye 100 to the infrared detector 15 along a transmission path opposite to the infrared light.
  • the infrared detector 15 is configured to detect reflected infrared light.
  • the infrared light emitted by the infrared emitter 14 includes a first infrared light portion IP1 in a first polarization state and a second infrared light portion IS2 in a second polarization state.
  • the polarization beam splitter 120 is also configured to transmit the first infrared light portion IP1 in the first polarization state and to reflect the second infrared light portion IS2 in the second polarization state.
  • for ease of description, the first infrared light portion in the first polarization state is identified as "IP1", the first infrared light portion in the second polarization state as "IS1", the second infrared light portion in the second polarization state as "IS2", and the second infrared light portion in the first polarization state as "IP2".
  • the first infrared light portion IP1 in the first polarization state transmitted by the polarization beam splitter 120 passes through the first phase retarder 123, is reflected by the first concave mirror 121, and passes through the first phase retarder 123 again; through this combined action of the first phase retarder 123 and the first concave mirror 121, the first infrared light portion is converted from the first polarization state to the second polarization state (that is, the first infrared light portion IP1 in the first polarization state is converted into the first infrared light portion IS1 in the second polarization state).
  • the second infrared light portion IS2 in the second polarization state reflected by the polarization beam splitter 120 passes through the second phase retarder 124, is reflected by the second concave mirror 122, and passes through the second phase retarder 124 again; through this combined action of the second phase retarder 124 and the second concave mirror 122, the second infrared light portion is converted from the second polarization state to the first polarization state (that is, the second infrared light portion IS2 in the second polarization state is converted into the second infrared light portion IP2 in the first polarization state).
  • the polarization beam splitter 120 is further configured to reflect the first infrared light portion IS1 in the second polarization state and the second infrared light portion IP2 in the first polarization state such that the first infrared light portion and the second infrared light portion Both are coupled to the incident face 130 of the optical waveguide element 13.
  • the optical waveguide component 13 is configured to transmit the coupled first infrared light portion and second infrared light portion to the eye 100 via the semi-reverse semi-permeable array 16 .
  • the semi-transflective surface array 16 can also be configured to transmit infrared light incident on it from the entrance face 130 of the optical waveguide element 13 to the eye 100. It should be noted that for the working principle of the semi-transflective surface array 16, reference can be made to the corresponding description in the above embodiments, and details are not repeated here.
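The double-pass polarization conversion described above can be checked with Jones calculus. The sketch below assumes the phase retarders are quarter-wave plates with their fast axes at 45° (the text does not specify the retardation) and treats the concave mirror as an ideal reflector; `P` stands for the first (transmitted) polarization state and the second component of the Jones vector for the second (reflected) state.

```python
import numpy as np

# Jones vector for the first polarization state (transmitted by the PBS).
P = np.array([1.0, 0.0], dtype=complex)

# Quarter-wave plate with its fast axis at 45 degrees (up to a global phase).
QWP45 = (1 / np.sqrt(2)) * np.array([[1, -1j],
                                     [-1j, 1]])

# Out to the concave mirror and back: two passes through the retarder act
# as a half-wave plate at 45 degrees; the mirror itself is taken as ideal.
double_pass = QWP45 @ QWP45

out = double_pass @ P   # IP1 -> IS1: the energy moves entirely into the
                        # orthogonal (second) polarization component
```

This reproduces the behavior attributed to the first phase retarder 123 and the first concave mirror 121; the IS2 → IP2 conversion at the second retarder is the same calculation with the roles of the components exchanged.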
  • the portion of the infrared light transmitted to the eye 100 that is reflected by the eye 100 is referred to as reflected infrared light; the reflected infrared light is transmitted to the infrared detector 15 through the optical waveguide element 13 and the augmented reality element 12 as described below.
  • the semi-transflective array 16 can also be configured to transmit reflected infrared light reflected by the eye 100 to the incident surface 130 of the optical waveguide element 13 such that reflected infrared light enters the augmented reality element 12.
  • the optical waveguide component 13 is further configured to transmit reflected infrared light reflected by the eye 100 to the polarization beam splitter 120 via the semi-transflective surface array 16, the reflected infrared light comprising a first reflected infrared light portion in the second polarization state and a second reflected infrared light portion in the first polarization state.
  • for ease of description, the first reflected infrared light portion in the first polarization state is identified as "RP1", the first reflected infrared light portion in the second polarization state as "RS1", the second reflected infrared light portion in the second polarization state as "RS2", and the second reflected infrared light portion in the first polarization state as "RP2".
  • the polarization beam splitter 120 is further configured to reflect the first reflected infrared light portion RS1 in the second polarization state and to transmit the second reflected infrared light portion RP2 in the first polarization state.
  • the first reflected infrared light portion RS1 in the second polarization state reflected by the polarization beam splitter 120 passes through the first phase retarder 123, is reflected by the first concave mirror 121, and passes through the first phase retarder 123 again; in this way, the first reflected infrared light portion is converted from the second polarization state to the first polarization state, that is, the first reflected infrared light portion RS1 is converted into the first reflected infrared light portion RP1 in the first polarization state.
  • the second reflected infrared light portion RP2 in the first polarization state transmitted by the polarization beam splitter 120 passes through the second phase retarder 124, is reflected by the second concave mirror 122, and passes through the second phase retarder 124 again; in this way, the second reflected infrared light portion is converted from the first polarization state to the second polarization state, that is, the second reflected infrared light portion RP2 is converted into the second reflected infrared light portion RS2 in the second polarization state.
  • the polarization beam splitter 120 is further configured to transmit the first reflected infrared light portion RP1 in the first polarization state and the second reflected infrared light portion RS2 in the second polarization state, such that both reflected infrared light portions are transmitted to the infrared detector 15 through the polarization beam splitter 120.
  • the number of the infrared detectors 15 is not limited.
  • two infrared detectors 15 may be disposed.
  • in FIGS. 3 and 4, for the sake of clarity, the transmission path of the light emitted by the microdisplay 11 is not marked; for that transmission path, reference can be made to the preceding figures. The following embodiments are the same in this respect and are not described again.
  • the first phase retarder 123 and the first concave mirror 121 are both disposed on the first side S1 of the augmented reality element 12.
  • the second phase retarder 124 and the second concave mirror 122 are both disposed at the first end S31 of the third side S3 of the augmented reality element 12, and the first end S31 of the third side S3 is adjacent to the first side S1.
  • the incident surface 130 of the optical waveguide element 13 is disposed at the first end S41 of the fourth side S4 of the augmented reality element 12; the first end S41 of the fourth side S4 is adjacent to the first side S1 and opposite to the first end S31 of the third side S3.
  • the microdisplay 11, the infrared emitter 14, and the infrared detector 15 may each be disposed on the side of the augmented reality element 12 opposite to the first side S1.
  • since FIGS. 3 and 4 are plan views, the microdisplay 11 is partially blocked by the infrared emitter 14 and the infrared detector 15.
  • Figure 5 shows a side view of the second side S2 of the augmented reality element 12 of Figure 4. It should be noted that FIG. 5 is only an example of the positional relationship among the microdisplay 11, the infrared emitter 14, and the infrared detector 15; other positional relationships may be adopted, and the embodiments of the present disclosure are not limited in this respect.
  • the augmented reality element 12 may further include a first infrared beam splitter 125.
  • the microdisplay 11 is disposed on the second side S2 of the augmented reality element 12 opposite the first side S1.
  • Both the infrared emitter 14 and the infrared detector 15 are disposed at the second end S32 of the third side S3 of the augmented reality element 12, and the second end S32 of the third side S3 is adjacent to the second side S2.
  • the first infrared beam splitter 125 is configured to reflect infrared light from the infrared emitter 14 to the polarization beam splitter 120 and to transmit the light emitted by the microdisplay 11 to the polarization beam splitter 120; that is, the first infrared beam splitter 125 reflects infrared light and transmits light of other wavelengths, so it does not affect the light emitted from the microdisplay 11.
  • the augmented reality device provided by the embodiments of the present disclosure may be embodied as glasses.
  • the augmented reality device may be implemented in monocular form, for example worn on the left or right eye of the user; as shown in FIG. 7, it can also be implemented in binocular form.
  • the embodiments of the present disclosure are not limited thereto, and the following embodiments are the same as those described herein, and are not described again.
  • the infrared emitter 14 and the infrared detector 15 may also be disposed at the second end S42 of the fourth side S4 of the augmented reality element 12, and the second end S42 of the fourth side S4 is adjacent to the second side S2.
  • the microdisplay 11 is disposed on a second side S2 of the augmented reality element 12 opposite the first side S1; the first infrared beam splitter 125 is configured to be from an infrared emitter The infrared light of 14 is reflected to the polarization beam splitter 120, and the light emitted from the microdisplay 11 is transmitted to the polarization beam splitter 120. It should be noted that the setting position of the first infrared beam splitter 125 is correspondingly changed at this time, so that the infrared light emitted by the infrared emitter 14 can be reflected to the polarization beam splitter 120.
  • the augmented reality element 12 may further include a second infrared beam splitter 126. Both the infrared emitter 14 and the infrared detector 15 are disposed on the second side S2 of the augmented reality element 12 opposite the first side S1.
  • the microdisplay 11 is disposed at the second end S32 of the third side S3 of the augmented reality element 12, and the second end S32 of the third side S3 is adjacent to the second side S2.
  • the second infrared beam splitter 126 is configured to transmit infrared light from the infrared emitter 14 to the polarization beam splitter 120 and reflect the light emitted by the microdisplay 11 to the polarization beam splitter 120. That is, the second infrared beam splitter 126 functions to reflect the light in the visible light range emitted by the microdisplay 11 and transmit the infrared light emitted from the infrared emitter 14.
  • the microdisplay 11 may also be disposed at the second end S42 of the fourth side S4 of the augmented reality element 12; the second end S42 of the fourth side S4 is adjacent to the second side S2 and opposite to the second end S32 of the third side S3.
  • both the infrared emitter 14 and the infrared detector 15 are disposed on the second side S2 of the augmented reality element 12 opposite the first side S1; the second infrared beam splitter 126 is The infrared light from the infrared emitter 14 is configured to be transmitted to the polarization beam splitter 120, and the light emitted from the microdisplay 11 is reflected to the polarization beam splitter 120. It should be noted that the setting position of the second infrared beam splitter 126 is changed accordingly, so that the light emitted by the microdisplay 11 can be reflected to the polarization beam splitter 120.
  • the infrared light emitted from the infrared ray emitter 14 can be transmitted to the human eye through the augmented reality element 12 and the optical waveguide element 13 by providing the infrared ray emitter 14 and the infrared ray detector 15.
  • the reflected infrared light reflected by the human eye is transmitted to the infrared detector 15 through a path opposite to the original optical path, and the infrared detector 15 can detect the intensity of the reflected infrared light.
  • when the human eye is open and when it is closed, its reflectance to infrared light differs, that is, the eyeball and the eyelid of the human eye have different reflectances to infrared light; therefore, from the intensity of the reflected infrared light detected by the infrared detector 15, the time or frequency for which the human eye is closed can be judged, and thus it can be determined, for example, whether the user wearing the augmented reality device is in a fatigue state.
  • certain measures can be taken to keep the user awake, thereby improving safety.
  • the augmented reality device can be implemented as glasses (monocular or binocular), and the user can wear the augmented reality device while driving the vehicle, so that assisted driving can be achieved and safety can be improved.
  • the augmented reality device is light in weight and easy to wear, and can be well compatible with various types of vehicles, and does not require modification of the vehicle, and can meet the requirements of vehicle versatility.
  • the augmented reality device can also be worn to monitor whether the user is in a fatigue state.
  • the infrared light emitted by the infrared ray emitter 14 may have a wavelength of, for example, 4 micrometers or more and 13 micrometers or less to avoid damage to the human eye.
  • An embodiment of the present disclosure also provides an augmented reality system, as shown in FIG. 11, including an augmented reality device 10 and a controller 20.
  • the augmented reality device 10 is an augmented reality device including the infrared emitter 14 and the infrared detector 15 provided by the embodiment of the present disclosure, for example, the augmented reality device shown in FIGS. 3, 4, and 6-10 can be used. .
  • the controller 20 is configured to determine whether the user associated with the augmented reality device 10 is in a fatigue state according to the intensity of the reflected infrared light detected by the infrared detector 15, to generate a control signal when determining that the user is in a fatigue state, and to provide prompt information based on the control signal.
  • the data of the reflected infrared light intensity detected by the infrared detector 15 in the augmented reality device 10 may be sent to the controller 20, and the controller 20 may analyze the received reflected infrared light intensity data to determine whether the user associated with the augmented reality device 10 is in a fatigue state.
  • when determining that the user is in a fatigue state, the controller 20 may generate a control signal; for example, according to the control signal, prompt information may be provided to the user to improve safety.
  • the controller 20 may be disposed on the same object as the augmented reality device 10, for example, when the augmented reality device 10 is disposed on the glasses, the controller 20 may also be disposed on the glasses. Upper (for example in the temples of glasses).
  • the controller 20 may also be disposed at a remote end relative to the augmented reality device 10, for example on a server; the augmented reality device 10 may communicate with the server through a network, and the controller 20 may, for example, send the determination result to the augmented reality device 10 through the network.
  • the network described in the embodiments of the present disclosure may be various types of communication networks, including but not limited to a local area network, a wide area network, and the Internet; it may be implemented as an Ethernet, a Token Ring, an FDDI network, an asynchronous transfer mode (ATM) network, and the like; and it may include, but is not limited to, a 3G/4G/5G mobile communication network, a wireless local area network (WiFi), a wired communication network, and the like.
  • the server can be implemented in various forms, including a server installed in a local area network, a server installed in a wide area network, or a server installed in the Internet, such as a cloud server.
  • the cloud service may be provided by a public cloud provider (for example, Amazon, Facebook Cloud, etc.) or by a private cloud.
  • the controller 20 can be configured to determine whether the user is in a fatigue state based on the user's eyelid closing time or closing frequency.
  • the human eye has different reflectances to infrared light when it is open and when it is closed, that is, the eyeball and the eyelid of the human eye have different reflectances to infrared light; therefore, the controller 20 can process the received reflected infrared light intensity data to obtain the user's eyelid closing time or closing frequency, so that it is possible to determine whether the user is in a fatigue state.
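As a hedged illustration of how the controller 20 might turn reflected-intensity samples into a fatigue decision, the Python sketch below uses a PERCLOS-style criterion (fraction of the observation window during which the eye is closed). The thresholds and the assumption that the eyelid reflects more infrared than the eyeball are illustrative choices, not values taken from this disclosure.

```python
def closure_fraction(intensities, closed_threshold):
    """Fraction of samples in which the eye is judged closed.

    Assumes the eyelid reflects *more* infrared than the eyeball, so a
    sample above `closed_threshold` counts as eye-closed (this polarity
    is an assumption and may be inverted for a given detector).
    """
    closed = [i > closed_threshold for i in intensities]
    return sum(closed) / len(closed)

def is_fatigued(intensities, closed_threshold=0.6, perclos_limit=0.3):
    """PERCLOS-style criterion: flag fatigue when the eye is closed for
    more than `perclos_limit` of the observation window."""
    return closure_fraction(intensities, closed_threshold) > perclos_limit

# Mostly-open trace with one long blink in the middle: 8 of 20 samples closed.
samples = [0.2] * 6 + [0.9] * 8 + [0.2] * 6
```

A real controller would also track blink frequency and sustained closure duration; this sketch only shows the intensity-thresholding step the text describes.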
  • the augmented reality system 1 may further include a sensor 30.
  • sensor 30 is configured to collect physiological parameters of the user, for example sensor 30 includes at least one of a blood pressure sensor and a pulse sensor.
  • for example, in one example the sensor 30 can be a blood pressure sensor, so that the user's blood pressure can be collected; in another example the sensor 30 can be a pulse sensor, so that the user's pulse can be acquired; and in yet another example the sensor 30 can include both a blood pressure sensor and a pulse sensor, so that the user's blood pressure and pulse can be acquired simultaneously. It should be noted that the embodiments of the present disclosure do not limit the type of the sensor 30.
  • the augmented reality system 1 provided by the embodiments of the present disclosure may be embodied as glasses, that is, the augmented reality system 1 may be disposed on glasses; in this case the sensor 30 may be disposed on the inside of a temple, so that when the user wears the glasses, the sensor 30 can be close to the user's temple.
  • the augmented reality system 1 is only schematically shown in FIG. 13 , and the augmented reality system 1 may be disposed on one side of the glasses (monocular form) or on both sides of the glasses (binocular form). The embodiment of the present disclosure does not limit this.
  • the controller 20 may be further configured to determine whether the user is in an abnormal state according to the physiological parameters, to generate a control signal when determining that the user is in an abnormal state, and to provide prompt information based on the control signal.
  • the physiological parameters of the user collected by the sensor 30 can be sent to the controller 20, and the controller 20 can analyze the received physiological parameters to determine whether the user is in an abnormal state.
  • the abnormal state may indicate, for example, that the physiological parameter of the user is in an abnormal state.
  • a normal threshold range of a physiological parameter may be preset.
  • if the physiological parameter received by the controller 20 is within the threshold range, it may be determined that the user is in a normal state; otherwise, the user may be determined to be in an abnormal state.
  • the physiological parameter can be blood pressure or pulse.
  • when determining that the user is in an abnormal state, the controller 20 may generate a control signal; for example, according to the control signal, prompt information may be provided to the user to improve safety.
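A minimal sketch of the threshold check described above. The ranges are hypothetical placeholders; a real system would configure them per parameter and per user.

```python
# Hypothetical normal ranges; real thresholds would be configured clinically.
NORMAL_RANGES = {
    "systolic_bp": (90, 140),   # mmHg
    "pulse": (60, 100),         # beats per minute
}

def is_abnormal(readings):
    """Return True if any reading falls outside its preset normal range."""
    for name, value in readings.items():
        low, high = NORMAL_RANGES[name]
        if not (low <= value <= high):
            return True
    return False
```

When `is_abnormal` returns True, the controller would emit the control signal that drives the prompt devices described below.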
  • the augmented reality system 1 may further include an electric pulse generator 40 and a surface electrode 41.
  • the electrical pulse generator 40 is configured to generate an electrical pulse signal in response to the control signal and to transmit the electrical pulse signal to the surface electrode 41.
  • the surface electrode 41 may be disposed near the nose pad of the glasses; in FIG. 13 the surface electrodes 41 are disposed near the nose pads on both sides of the glasses, but the present disclosure is not limited thereto, and, for example, the surface electrode 41 may be provided only near the nose pad on one side of the glasses.
  • the augmented reality system 1 when the augmented reality system 1 is implemented as glasses, the user can wear the glasses while driving the vehicle or performing other activities (such as high-risk work), and when the controller 20 determines that the user is in a fatigue state or an abnormal state, a control signal can be generated.
  • the electric pulse generator 40 can generate an electric pulse signal in response to the control signal and transmit the electric pulse signal to the surface electrode 41. Since the surface electrode 41 is in direct contact with the user's nose, the electrical pulse signal can be transmitted to the user to electrically stimulate the user, thereby alerting the user to safety.
  • the augmented reality system 1 may further include a positioning device 50.
  • the positioning device 50 is configured to acquire location information of the user.
  • the positioning device 50 can adopt a GPS or a Beidou positioning system, which is not limited by the embodiment of the present disclosure.
  • the real-time location information of the user can be obtained by setting the positioning device, and the augmented reality system can further process according to the location information, for example, providing relevant prompt information to the user according to the location information. Or the location information can be sent to a third party associated with the user.
  • the augmented reality system 1 may further include a voice generating device 60.
  • the voice generating device 60 is configured to play the prompt information in response to the control signal, the prompt information including at least one of preset voice information and location information of a nearby hospital or rest place generated based on the location information of the user.
  • the voice generating device 60 may include a speaker; the speaker may be disposed on the glasses, but its specific position is not limited, and it may, for example, be set on a temple.
  • the prompt information played by the voice generating device 60 may be preset voice information; for example, the preset voice information may be stored in the controller 20 in advance, and when the controller 20 determines that the user is in a fatigue state or an abnormal state, the controller 20 can transmit the stored preset voice information to the voice generating device 60 for playing.
  • the preset voice information may also be stored in the voice generating device 60 in advance, in which case the voice generating device 60 directly plays the preset voice information upon receiving the control signal from the controller 20; in this case the voice generating device 60 needs to include a storage medium.
  • the preset voice information may be “Please go to a rest place nearby to take a break” or “Please go to a nearby hospital for examination”, etc., and embodiments of the present disclosure do not limit the content of the preset voice information.
  • the prompt information played by the voice generating device 60 may be at least one of location information of a nearby hospital or a rest place generated based on location information of the user.
  • for example, the controller 20 may generate the location information of a nearby hospital or rest place based on the location information of the user acquired from the positioning device 50, convert it into voice information, and send it to the voice generating device 60 for playback; alternatively, the controller 20 may directly transmit the location information of the nearby hospital or rest place to the voice generating device 60, which converts it into voice before playing.
  • the prompt information can be played when the user is in a fatigue state or an abnormal state to remind the user to improve security.
  • the augmented reality system 1 may further include an image rendering device 70.
  • the image rendering device 70 is configured to render an image corresponding to the prompt information in response to the control signal, the microdisplay being configured to emit light including the prompt information, the prompt information including the preset image information, the neighborhood generated according to the location information of the user At least one of navigation information for a hospital or resting place.
  • when the controller 20 determines that the user is in a fatigue state or an abnormal state, a control signal may be generated and sent to the image rendering device 70, and the image rendering device 70 may render an image corresponding to the prompt information in response to the control signal.
  • the controller 20 can transmit the image rendered by the image rendering device 70 to the microdisplay in the augmented reality device 10, and the microdisplay can emit light including the prompt information to implement a reminder to the user.
  • the image rendering device 70 may be separately provided, or may be integrally configured with the controller 20, that is, may be packaged together in hardware, which is not limited by the embodiment of the present disclosure.
  • the prompt information rendered by the image rendering device 70 may be preset image information; for example, the preset image information may be pre-stored in the controller 20, and when the controller 20 determines that the user is in a fatigue state or an abnormal state, the controller 20 can transmit the stored preset image information to the image rendering device 70 for rendering.
  • the preset image information may also be stored in the image rendering device 70 in advance, in which case the image rendering device 70 directly renders the preset image information upon receiving the control signal from the controller 20; in this case the image rendering device 70 needs to include a storage medium.
  • the preset image information may be an image with a reminder or warning function, for example an image displayed as a red exclamation mark "!", or an image displayed as the red text "Please pay attention to safety"; the embodiments of the present disclosure do not limit the content of the preset image information.
  • the prompt information rendered by the image rendering device 70 may be at least one of navigation information of a nearby hospital or rest place generated based on location information of the user.
  • the controller 20 may generate navigation information of a nearby hospital or rest place according to the position information of the user acquired from the positioning device 50, and send it to the image rendering device 70, and the navigation information is rendered by the image rendering device 70 into a corresponding image.
  • the controller 20 can transmit the image rendered by the image rendering device 70 to the microdisplay in the augmented reality device 10, and the microdisplay can emit light including the navigation information to implement a navigation prompt to the user.
  • the prompt information can be displayed by the augmented reality device when the user is in a fatigue state or an abnormal state, so as to remind the user or navigate the prompt to improve the security. And convenience.
  • the augmented reality system 1 can also include a communication device 80 that is configured to communicate with a preset contact in response to a control signal.
  • when the controller 20 determines that the user is in a fatigue state or an abnormal state, a control signal may be generated, and the communication device 80 may communicate with the preset contact in response to the control signal; the communication can be performed by sending a message or making a call, and, for example, the user's physiological parameter data can be sent to the preset contact.
  • the preset contact may be pre-stored in the controller 20 or the communication device 80, and the preset contact may be a guardian, a doctor, or the like of the user.
  • the communication device 80 can communicate with the preset contact through the network.
  • for the description of the network, refer to the corresponding description in the foregoing embodiments; details are not repeated here.
  • the user can be in communication with the preset contact when the user is in a fatigue state or an abnormal state, and the preset contact can take further measures to ensure the security of the user. Therefore, the user can be fully protected and the security can be improved.
  • the controller 20 and the image rendering device 70 in the augmented reality system 1 may be implemented as an application-specific integrated circuit, hardware (circuitry), firmware, or any combination thereof that achieves the desired functions, and may be embodied, for example, as a digital signal processor or the like.
  • the controller 20 and the image rendering device 70 may also be implemented to include a processor and a storage medium, the storage medium being configured to store computer instructions suitable for execution by the processor; the desired functions can be achieved when the computer instructions are executed by the processor.
  • the embodiments of the present disclosure do not limit this.
  • the processor may be implemented by a general-purpose integrated circuit chip or an application-specific integrated circuit chip.
  • the integrated circuit chip may be disposed on a motherboard, for example, a memory and a power supply circuit may be disposed on the motherboard;
  • the processor may be implemented by circuitry or by software, hardware (circuit), firmware, or any combination thereof.
  • the processor may include various computing structures, such as a Complex Instruction Set Computer (CISC) structure, a Reduced Instruction Set Computer (RISC) structure, or a structure that implements a combination of multiple instruction sets.
  • the processor can also be a microprocessor, such as an X86 processor or an ARM processor, or can be a digital signal processor (DSP) or the like.
  • a storage medium may be disposed, for example, on the above-described main board, and the storage medium may store the instructions and/or data executed by the processor.
  • a storage medium may include one or more computer program products, which may comprise various forms of computer readable memory, such as volatile memory and/or nonvolatile memory.
  • the volatile memory may include, for example, random access memory (RAM) and/or a cache, etc.
  • the nonvolatile memory may include, for example, a read only memory (ROM), a magnetic disk, an optical disk, a semiconductor memory (such as a flash memory, etc.), and the like.
  • One or more computer program instructions can be stored on the computer readable memory, and the processor can execute the program instructions to implement the desired functions in the embodiments of the present disclosure.
  • An embodiment of the present disclosure further provides an information prompting method of an augmented reality system.
  • the information prompting method may include the following operations.
  • Step S100: emitting infrared light to the eye;
  • Step S200: determining whether the user associated with the augmented reality system is in a fatigue state according to the intensity of the reflected infrared light returned by the eye;
  • Step S300: generating a control signal when determining that the user is in a fatigue state; and
  • Step S400: providing prompt information according to the control signal.
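The four steps above can be sketched as a minimal control loop. The intensity scale, the closure threshold, and the timing constants below are illustrative assumptions for demonstration, not values taken from the disclosure:

```python
# Minimal sketch of steps S100-S400: classify each reflected-IR sample as
# eyelid open/closed, flag fatigue on a sufficiently long closure, and emit
# a prompt. Threshold and timing values are assumptions, not from the patent.

CLOSED_THRESHOLD = 0.6    # assumed: a closed eyelid returns the stronger signal
FATIGUE_CLOSURE_S = 2.0   # assumed: a closure this long indicates fatigue

def detect_fatigue(samples, dt):
    """samples: reflected-IR intensities sampled every dt seconds (step S200)."""
    closed_run = 0.0
    for intensity in samples:
        if intensity >= CLOSED_THRESHOLD:
            closed_run += dt
            if closed_run >= FATIGUE_CLOSURE_S:
                return True
        else:
            closed_run = 0.0
    return False

def prompt(samples, dt):
    if detect_fatigue(samples, dt):             # step S200
        control_signal = {"fatigue": True}      # step S300
        return "fatigue detected: please rest"  # step S400 (prompt information)
    return None
```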


Abstract

An augmented reality apparatus, an augmented reality system, and an information prompting method therefor. The augmented reality apparatus comprises a microdisplay (11), an augmented reality element (12), and an optical waveguide element (13). The microdisplay (11) is configured to emit light carrying display content, the light comprising a first light portion (DP1) in a first polarization state and a second light portion (DS2) in a second polarization state; the augmented reality element (12) is configured to convert the first light portion (DP1) from the first polarization state to the second polarization state, convert the second light portion from the second polarization state to the first polarization state, and couple the first light portion (DS1), now in the second polarization state, and the second light portion (DP2), now in the first polarization state, into the optical waveguide element (13); the first polarization state is orthogonal to the second polarization state. This improves the utilization of the light emitted by the display and the light throughput of the augmented reality apparatus, thereby increasing its display brightness.

Description

增强现实装置、增强现实系统及其信息提示方法
本申请要求于2018年1月29日递交的中国专利申请第201810084344.3号的优先权,在此全文引用上述中国专利申请公开的内容以作为本申请的一部分。
技术领域
本公开实施例涉及一种增强现实装置、增强现实系统及其信息提示方法。
背景技术
增强现实(Augmented Reality,AR)是将虚拟信息叠加在真实场景上,并借助感知和显示设备,将虚拟信息和真实场景融为一体,最终呈现给观察者一个感观效果真实的新环境。
发明内容
本公开至少一实施例提供一种增强现实装置,包括微显示器和增强现实元件。所述微显示器被配置为发射携带显示内容的光线,所述光线包括处于第一偏振态的第一光线部分和处于第二偏振态的第二光线部分;以及所述增强现实元件被配置为将所述第一光线部分由所述第一偏振态转换为所述第二偏振态,将所述第二光线部分由所述第二偏振态转换为所述第一偏振态,并将处于所述第二偏振态的所述第一光线部分和处于所述第一偏振态的所述第二光线部分耦合输出;其中,所述第一偏振态与所述第二偏振态正交。
例如,在本公开一实施例提供的增强现实装置中,还包括光波导元件,所述光波导元件被配置为接收并传输所述增强现实元件耦合输出的所述第一光线部分和所述第二光线部分;所述增强现实元件包括偏振分光镜、第一凹面反射镜、第二凹面反射镜、第一相位延迟片和第二相位延迟片。所述偏振分光镜被配置为接收来自所述微显示器的所述光线,透射处于所述第一偏振态的所述第一光线部分,并反射处于所述第二偏振态的所述第二 光线部分;被所述偏振分光镜透射的所述第一光线部分被设置为穿过所述第一相位延迟片,被所述第一凹面反射镜反射,并再次穿过所述第一相位延迟片,其中,在所述第一相位延迟片和所述第一凹面反射镜共同作用下,所述第一光线部分由所述第一偏振态转换为所述第二偏振态;被所述偏振分光镜反射的所述第二光线部分被设置为穿过所述第二相位延迟片,被所述第二凹面反射镜反射,并再次穿过所述第二相位延迟片,其中,在所述第二相位延迟片和所述第二凹面反射镜共同作用下,所述第二光线部分由所述第二偏振态转换为所述第一偏振态;以及所述偏振分光镜还被配置为反射处于所述第二偏振态的所述第一光线部分和透射处于所述第一偏振态的所述第二光线部分,使得所述第一光线部分和所述第二光线部分均被耦合至所述光波导元件的入射面。
例如,本公开一实施例提供的增强现实装置还包括红外线发射器和红外线探测器。所述红外线发射器被配置为发射红外光至所述增强现实元件;所述增强现实元件被配置为将所述红外光耦合至所述光波导元件的入射面;所述光波导元件被配置为经过半反半透面阵列将所述红外光传输至眼睛;所述光波导元件和所述增强现实元件还被配置为将所述眼睛反射的反射红外光沿着与所述红外光相反的传输路径传输至所述红外线探测器;以及所述红外线探测器被配置为探测所述反射红外光。
例如,在本公开一实施例提供的增强现实装置中,所述红外线发射器发射的所述红外光包括处于所述第一偏振态的第一红外光部分和处于所述第二偏振态的第二红外光部分;所述偏振分光镜还被配置为透射处于所述第一偏振态的所述第一红外光部分,并反射处于所述第二偏振态的所述第二红外光部分;被所述偏振分光镜透射的所述第一红外光部分被设置为穿过所述第一相位延迟片,被所述第一凹面反射镜反射,并再次穿过所述第一相位延迟片,其中,在所述第一相位延迟片和所述第一凹面反射镜共同作用下,所述第一红外光部分由所述第一偏振态转换为所述第二偏振态;被所述偏振分光镜反射的所述第二红外光部分被设置为穿过所述第二相位延迟片,被所述第二凹面反射镜反射,并再次穿过所述第二相位延迟片,其中,在所述第二相位延迟片和所述第二凹面反射镜共同作用下,所述第二红外光部分由所述第二偏振态转换为所述第一偏振态;所述偏振分光镜还被配置为反射处于所述第二偏振态的所述第一红外光部分和透射处于所 述第一偏振态的所述第二红外光部分,使得所述第一红外光部分和所述第二红外光部分均被耦合至所述光波导元件的入射面;以及所述光波导元件被配置为经过所述半反半透面阵列将耦合后的所述第一红外光部分和所述第二红外光部分传输至所述眼睛。
例如,在本公开一实施例提供的增强现实装置中,所述光波导元件还被配置为经由所述半反半透面阵列,将所述眼睛反射的反射红外光传输至所述偏振分光镜,所述反射红外光包括处于所述第二偏振态的第一反射红外光部分和处于所述第一偏振态的第二反射红外光部分;所述偏振分光镜还被配置为反射处于所述第二偏振态的所述第一反射红外光部分,并透射处于所述第一偏振态的所述第二反射红外光部分;被所述偏振分光镜反射的处于所述第二偏振态的所述第一反射红外光部分被设置为穿过所述第一相位延迟片,被所述第一凹面反射镜反射,并再次穿过所述第一相位延迟片,其中,在所述第一相位延迟片和所述第一凹面反射镜共同作用下,所述第一反射红外光部分由所述第二偏振态转换为所述第一偏振态;被所述偏振分光镜透射的处于所述第一偏振态的所述第二反射红外光部分被设置为穿过所述第二相位延迟片,被所述第二凹面反射镜反射,并再次穿过所述第二相位延迟片,其中,在所述第二相位延迟片和所述第二凹面反射镜共同作用下,所述第二反射红外光部分由所述第一偏振态转换为所述第二偏振态;以及所述偏振分光镜还被配置为透射处于所述第一偏振态的所述第一反射红外光部分和反射处于所述第二偏振态的所述第二反射红外光部分,使得所述第一反射红外光部分和所述第二反射红外光部分均通过所述偏振分光镜而传输至所述红外线探测器。
例如,在本公开一实施例提供的增强现实装置中,所述第一相位延迟片和所述第一凹面反射镜均设置于所述增强现实元件的第一侧;所述第二相位延迟片和所述第二凹面反射镜均设置于所述增强现实元件的、与所述第一侧相邻的第三侧的第一端,所述第三侧的第一端靠近所述第一侧;以及所述光波导元件的入射面设置于所述增强现实元件的、与所述第一侧相邻的第四侧的第一端,所述第四侧的第一端靠近所述第一侧并与所述第三侧的第一端相对。
例如,在本公开一实施例提供的增强现实装置中,所述微显示器、所述红外线发射器和所述红外线探测器设置于所述增强现实元件的、与所述 第一侧相对的第二侧。
例如,在本公开一实施例提供的增强现实装置中,所述增强现实元件还包括第一红外分光镜。所述微显示器设置于所述增强现实元件的、与所述第一侧相对的第二侧;所述红外线发射器和所述红外线探测器均设置于所述增强现实元件的第三侧的第二端,所述第三侧的第二端靠近所述第二侧;或者,所述红外线发射器和所述红外线探测器均设置于所述增强现实元件的第四侧的第二端,所述第四侧的第二端靠近所述第二侧;以及所述第一红外分光镜被配置为将来自所述红外线发射器的所述红外光反射至所述偏振分光镜,并将所述微显示器发射的所述光线透射至所述偏振分光镜。
例如,在本公开一实施例提供的增强现实装置中,所述增强现实元件还包括第二红外分光镜。所述红外线发射器和所述红外线探测器均设置于所述增强现实元件的、与所述第一侧相对的第二侧;所述微显示器设置于所述增强现实元件的第三侧的第二端,所述第三侧的第二端靠近所述第二侧;或者,所述微显示器设置于所述增强现实元件的第四侧的第二端,所述第四侧的第二端靠近所述第二侧并与所述第三侧的第二端相对;以及所述第二红外分光镜被配置为将来自所述红外线发射器的所述红外光透射至所述偏振分光镜,并将所述微显示器发射的所述光线反射至所述偏振分光镜。
例如,在本公开一实施例提供的增强现实装置中,所述半反半透面阵列设置在所述光波导元件中并包括呈阵列排布的多个半反半透镜;以及所述半反半透面阵列被配置为:将从所述光波导元件的入射面进入的、入射至所述半反半透面阵列上的所述光线和所述红外光传输至所述眼睛;以及将被所述眼睛反射的反射红外光传输至所述光波导元件的入射面,从而使得所述反射红外光进入所述增强现实元件。
例如,在本公开一实施例提供的增强现实装置中,所述第一偏振态为p偏振态,所述第二偏振态为s偏振态,所述第一相位延迟片和所述第二相位延迟片为四分之一波长相位延迟片。
本公开至少一实施例还提供一种增强现实系统,包括如本公开的实施例提供的增强现实装置和控制器。所述控制器被配置为:根据所述红外线探测器探测的反射红外光的强度,判断与所述增强现实装置相关联的用户是否处于疲劳状态;在判断所述用户处于疲劳状态时,生成控制信号;以 及根据所述控制信号提供提示信息。
例如,在本公开一实施例提供的增强现实系统中,所述控制器被配置为根据所述用户的眼睑闭合时间或闭合频率判断所述用户是否处于疲劳状态。
例如,本公开一实施例提供的增强现实系统还包括传感器。所述传感器被配置为采集所述用户的生理参数,所述传感器包括血压传感器和脉搏传感器中的至少一种。
例如,在本公开一实施例提供的增强现实系统中,所述控制器还被配置为:根据所述生理参数判断所述用户是否处于异常状态;在判断所述用户处于异常状态时,生成所述控制信号;以及根据所述控制信号提供所述提示信息。
例如,本公开一实施例提供的增强现实系统还包括电脉冲发生器和表面电极,所述电脉冲发生器被配置为响应于所述控制信号而产生电脉冲信号,并将所述电脉冲信号传输至所述表面电极。
例如,本公开一实施例提供的增强现实系统还包括定位装置,所述定位装置被配置为获取所述用户的位置信息。
例如,本公开一实施例提供的增强现实系统还包括语音生成装置。所述语音生成装置被配置为响应于所述控制信号而播放所述提示信息。
例如,本公开一实施例提供的增强现实系统还包括图像渲染装置。所述图像渲染装置被配置为响应于所述控制信号而渲染出所述提示信息对应的图像,所述微显示器被配置为发射包括所述提示信息的所述光线。
例如,本公开一实施例提供的增强现实系统还包括通信装置,所述通信装置被配置为响应于所述控制信号与预设联系人进行通信。
本公开至少一实施例还提供一种增强现实系统的信息提示方法,包括:发射红外光至眼睛;根据所述眼睛返回的反射红外光的强度,判断与所述增强现实系统相关联的用户是否处于疲劳状态;在判断所述用户处于疲劳状态时,生成控制信号;以及根据所述控制信号提供提示信息。
附图说明
为了更清楚地说明本公开实施例的技术方案,下面将对实施例的附图作简单地介绍,显而易见地,下面描述中的附图仅仅涉及本公开的一些实施例,而非对本公开的限制。
图1为一种示例性的增强现实装置的示意图;
图2为本公开的一个实施例提供的一种增强现实装置的示意图;
图3为本公开的另一个实施例提供的一种增强现实装置的示意图之一;
图4为本公开的另一个实施例提供的一种增强现实装置的示意图之二;
图5为图4中增强现实元件的第二侧的侧视图;
图6为本公开的再一个实施例提供的一种增强现实装置的示意图;
图7为对应图6中的增强现实装置的双目形式的示意图;
图8为本公开的再一个实施例提供的一种增强现实装置的示意图;
图9为本公开的再一个实施例提供的一种增强现实装置的示意图;
图10为本公开的又一个实施例提供的一种增强现实装置的示意图;
图11为本公开的一个实施例提供的一种增强现实系统的示意图;
图12为本公开的另一个实施例提供的一种增强现实系统的示意图;
图13为本公开的实施例提供的一种包括增强现实系统的眼镜的示意图;
图14为本公开的再一个实施例提供的一种增强现实系统的示意图;
图15为本公开的又一个实施例提供的一种增强现实系统的示意图;以及
图16为本公开的一个实施例提供的一种增强现实系统的信息提示方法。
具体实施方式
为使本公开实施例的目的、技术方案和优点更加清楚,下面将结合本公开实施例的附图,对本公开实施例的技术方案进行清楚、完整地描述。显然,所描述的实施例是本公开的一部分实施例,而不是全部的实施例。基于所描述的本公开的实施例,本领域普通技术人员在无需创造性劳动的前提下所获得的所有其他实施例,都属于本公开保护的范围。
除非另外定义,本公开使用的技术术语或者科学术语应当为本公开所属领域内具有一般技能的人士所理解的通常意义。本公开中使用的“第一”、“第二”以及类似的词语并不表示任何顺序、数量或者重要性,而只是用来区分不同的组成部分。同样,“一个”、“一”或者“该”等类似词语也不表示数量限制,而是表示存在至少一个。“包括”或者“包含”等类似的词语意指出现该词前面的元件或者物件涵盖出现在该词后面列举的元件或者物件及其等同,而不排除其他元件或者物件。“连接”或者“相连”等类似的词语并非限定于物理 的或者机械的连接,而是可以包括电性的连接,不管是直接的还是间接的。“上”、“下”、“左”、“右”等仅用于表示相对位置关系,当被描述对象的绝对位置改变后,则该相对位置关系也可能相应地改变。
由于几何光波导增强现实装置具有重量轻,体积小,轻薄化(厚度可以小于2mm)的优势,使其得到广泛的关注,几何光波导是利用平面波导内的反射面和半反半透面阵列实现光线的耦合入射和出射。
例如,图1示出了一种增强现实装置,该增强现实装置主要由光波导元件、投影系统及微显示器三部分组成。微显示器显示的图像经过投影系统放大投射进入光波导元件,并入射到光波导元件的反射面上,光线经反射面反射后满足光的全反射条件,实现了图像信号的耦合输入。
耦合进入光波导元件内的图像信号在光波导元件内通过全反射进行传输,最后入射在半反半透面阵列上,部分光线经半反半透镜反射后不再满足全反射条件,实现图像信号的耦合输出。另一部分光线则透过半反半透镜继续在光波导元件内进行全反射传输,直到下一次入射到半反半透面阵列上再次进行分光,前后几次经过半反半透面阵列耦合出射的光线进入眼睛(例如人眼)完成图像信号的传输。
同时真实场景中的光线可以直接透过光波导元件进入眼睛,显示光路与透射光路在眼睛位置处叠加,实现透射式的近眼显示,即可以实现将虚拟信息和真实场景融为一体并同时呈现给观察者。
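As a numerical aside to the total-internal-reflection transport described above, the critical angle of the waveguide determines which coupled rays stay guided. A minimal sketch follows; the refractive index is an assumed example value, not taken from the disclosure:

```python
import math

def critical_angle_deg(n_guide, n_surround=1.0):
    """Critical angle for total internal reflection, measured from the surface
    normal: rays striking the waveguide surface at a larger angle than this
    are totally internally reflected and remain guided."""
    return math.degrees(math.asin(n_surround / n_guide))

# For an assumed glass waveguide with refractive index n = 1.5 in air:
theta_c = critical_angle_deg(1.5)   # about 41.8 degrees
```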
在图1所示的增强现实装置中,微显示器发射的光线可能包括处于不同偏振态的光线,处于不同偏振态的光线在通过投影系统时,投影系统中的例如偏振分光镜只允许一种偏振态的光线透射(或反射),使得入射到光波导元件的入射面上的光线可能会损失掉其他偏振态的光线,从而使得该增强现实装置的通光效率较低,进而无法实现高亮度的增强现实显示。而且图1中所示的增强现实装置需要投影系统,结构复杂。
本公开至少一实施例提供一种增强现实装置,包括微显示器和增强现实元件。微显示器被配置为发射携带显示内容的光线,光线包括处于第一偏振态的第一光线部分和处于第二偏振态的第二光线部分。增强现实元件被配置为将第一光线部分由第一偏振态转换为第二偏振态,将第二光线部分由第二偏振态转换为第一偏振态,并将处于第二偏振态的第一光线部分和处于第一偏振态的第二光线部分耦合输出。本公开至少一实施例还提供对应于上述增 强现实装置的增强现实系统及其信息提示方法。
本公开的实施例提供的增强现实装置、增强现实系统及其信息提示方法,通过增强现实元件最大效率的利用微显示器所发射的光线,可以提高通光效率,从而可以提高增强现实装置显示的亮度。该增强现实装置还可以用于检测用户的疲劳程度,并提供提示信息。例如,在判断用户处于疲劳状态时,可以进一步地采取一定的措施来使得用户保持清醒,从而提高安全性。同时,该增强现实装置不需要设置复杂的投影系统,结构简单。
下面结合附图对本公开的实施例进行详细说明。
本公开的一个实施例提供一种增强现实装置,如图2所示,该增强现实装置包括微显示器11和增强现实元件12。
微显示器11被配置为发射携带显示内容的光线,光线包括处于第一偏振态的第一光线部分DP1和处于第二偏振态的第二光线部分DS2。增强现实元件12被配置为将第一光线部分DP1由第一偏振态转换为第二偏振态,将第二光线部分DS2由第二偏振态转换为第一偏振态,并将处于第二偏振态的第一光线部分DS1和处于第一偏振态的第二光线部分DP2耦合输出,例如耦合输出至光波导元件13。
例如,处于第二偏振态的第一光线部分DS1和处于第一偏振态的第二光线部分DP2可以通过光波导元件13的入射面130耦合至光波导元件13。例如,光波导元件13的入射面130为光波导元件13与增强现实元件12相连接的面。
在本公开中,为了描述的方便,处于第一偏振态的第一光线部分被标识为“DP1”,处于第二偏振态的第一光线部分被标识为“DS1”,处于第二偏振态的第二光线部分被标识为“DS2”,处于第一偏振态的第二光线部分被标识为“DP2”。
需要说明的是,从微显示器11发射的携带显示内容的光线例如是非线性偏振态的光,其可能为圆偏振态、椭圆偏振态或者部分偏振态。在本公开的实施例中,第一偏振态例如为p偏振态,第二偏振态例如为s偏振态,并且所有偏振态的光都可以被分解为p偏振态的线偏振光和s偏振态的线偏振光。以下各实施例与此相同,不再赘述。
在本公开的实施例提供的增强现实装置中,增强现实元件将微显示器发射的光线中的处于第一偏振态的第一光线部分和处于第二偏振态的第二光 线部分,分别转换为处于第二偏振态的第一光线部分和处于第一偏振态的第二光线部分,并耦合输出至例如光波导元件。通过这种方式可以提高微显示器发射的光线的利用率、提高增强现实装置的通光效率,从而可以提高该增强现实装置显示的亮度。
例如,在本公开的一个实施例中,如图2所示,增强现实装置还包括光波导元件13,光波导元件13被配置为接收并传输增强现实元件12耦合输出的第一光线部分和第二光线部分,例如第一光线部分处于第二偏振态,第二光线部分处于第一偏振态。
如图2所示,增强现实元件12包括偏振分光镜120、第一凹面反射镜121、第二凹面反射镜122、第一相位延迟片123和第二相位延迟片124。例如,第一相位延迟片123和第二相位延迟片124均为四分之一波长相位延迟片。
例如,如图2所示,偏振分光镜120被配置为接收来自微显示器11的光线,透射处于第一偏振态的第一光线部分DP1,并反射处于第二偏振态的第二光线部分DS2。
例如,如图2所示,被偏振分光镜120透射的处于第一偏振态的第一光线部分DP1被设置为穿过第一相位延迟片123,被第一凹面反射镜121反射,并再次穿过第一相位延迟片123。例如,处于第一偏振态的第一光线部分DP1穿过第一相位延迟片123后变为左旋(或右旋)圆偏振光,被第一凹面反射镜121反射后变为右旋(或左旋)圆偏振光,再次穿过第一相位延迟片123后变为第二偏振态的线偏振光,即在第一相位延迟片123和第一凹面反射镜121的共同作用下,第一光线部分由第一偏振态转换为第二偏振态(即,处于第一偏振态的第一光线部分DP1转换为处于第二偏振态的第一光线部分DS1)。
被偏振分光镜120反射的处于第二偏振态的第二光线部分DS2被设置为穿过第二相位延迟片124,被第二凹面反射镜122反射,并再次穿过第二相位延迟片124。例如,处于第二偏振态的第二光线部分DS2穿过第二相位延迟片124后变为左旋(或右旋)圆偏振光,被第二凹面反射镜122反射后变为右旋(或左旋)圆偏振光,再次穿过第二相位延迟片124后变为第一偏振态的线偏振光,即在第二相位延迟片124和第二凹面反射镜122的共同作用下,第二光线部分由第二偏振态转换为第一偏振态(即,处于第二偏振态的 第二光线部分DS2转换为处于第一偏振态的第二光线部分DP2)。
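The polarization conversion described in the two paragraphs above can be verified with Jones calculus: a double pass through a quarter-wave plate with its fast axis at 45° acts as a half-wave plate and swaps the p and s states. A minimal sketch, treating the concave mirror's reflection as the identity in a fixed lab frame and dropping global phases (both simplifications):

```python
import math

def matmul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(A, v):
    """2x2 matrix applied to a Jones vector."""
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

def qwp(theta):
    """Jones matrix of a quarter-wave plate, fast axis at angle theta (rad)."""
    c, s = math.cos(theta), math.sin(theta)
    rot, rot_inv = [[c, s], [-s, c]], [[c, -s], [s, c]]
    retarder = [[1, 0], [0, 1j]]   # quarter-wave retardation, global phase dropped
    return matmul(rot_inv, matmul(retarder, rot))

p_in = [1, 0]                                            # p-polarized part (DP1)
round_trip = matmul(qwp(math.pi / 4), qwp(math.pi / 4))  # QWP -> mirror -> QWP
out = matvec(round_trip, p_in)  # energy ends up in the s component (DS1)
```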
例如,偏振分光镜120还被配置为反射处于第二偏振态的第一光线部分DS1和透射处于第一偏振态的第二光线部分DP2,使得第一光线部分和第二光线部分均被耦合至光波导元件13的入射面130,并穿过入射面130进入光波导元件13。
耦合至光波导元件13内的光线在光波导元件13内通过全反射进行传输,最后入射在半反半透面阵列16上,例如,半反半透面阵列16设置在光波导元件13中并包括呈阵列排布的多个半反半透镜161。例如,半反半透面阵列16被配置为将从光波导元件13的入射面130进入的、入射至半反半透面阵列16上的光线传输至眼睛100。
部分光线经半反半透镜161反射后不再满足全反射条件,实现光线的耦合输出。另一部分光线则透过半反半透镜161后继续在光波导元件13内全反射传输直到下一次入射到半反半透镜161上再次进行分光,前后几次经过半反半透面阵列16耦合出射的光线传输至眼睛100完成显示内容的传输。
需要说明的是,在本公开的实施例中,眼睛100例如可以为人眼,但本公开对此不作限定,只要是可以接收光波导元件13耦合出射的光均可以作为眼睛100。例如,眼睛100还可以包括动物的眼睛,以下各实施例均以眼睛100为人眼为例进行描述。
需要说明的是,如图2所示,微显示器11设置在第一凹面反射镜121的焦点上,例如第一凹面反射镜121的半径为R,则微显示器11与第一凹面反射镜121的距离为焦距f,焦距f满足f=R/2。
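The placement rule above, f = R/2 for a concave mirror, is simple enough to check numerically; the mirror radius used below is an assumed example value:

```python
def microdisplay_distance(mirror_radius):
    """Distance from the first concave mirror to the microdisplay: f = R / 2."""
    return mirror_radius / 2.0

# With an assumed mirror radius R = 50 mm, the display sits 25 mm from the mirror.
d = microdisplay_distance(50.0)
```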
例如,在本公开的一个实施例中,如图3所示,增强现实装置还可以包括红外线发射器14和红外线探测器15。
红外线发射器14被配置为发射红外光至增强现实元件12。增强现实元件12被配置为将红外光耦合至光波导元件13的入射面130。光波导元件13被配置为经过半反半透面阵列16将红外光传输至眼睛100,例如眼睛100为人眼。光波导元件13和增强现实元件12还被配置为将眼睛100反射的反射红外光沿着与红外光相反的传输路径传输至红外线探测器15。红外线探测器15被配置为探测反射红外光。
例如,如图3所示,红外线发射器14发射的红外光包括处于第一偏振态的第一红外光部分IP1和处于第二偏振态的第二红外光部分IS2。偏振分 光镜120还被配置为透射处于第一偏振态的第一红外光部分IP1,并反射处于第二偏振态的第二红外光部分IS2。
在本公开中,为了描述的方便,处于第一偏振态的第一红外光部分被标识为“IP1”,处于第二偏振态的第一红外光部分被标识为“IS1”,处于第二偏振态的第二红外光部分被标识为“IS2”,处于第一偏振态的第二红外光部分被标识为“IP2”。
例如,如图3所示,被偏振分光镜120透射的处于第一偏振态的第一红外光部分IP1被设置为穿过第一相位延迟片123,被第一凹面反射镜121反射,并再次穿过第一相位延迟片123,在第一相位延迟片123和第一凹面反射镜121共同作用下,第一红外光部分由第一偏振态转换为第二偏振态(即,处于第一偏振态的第一红外光部分IP1转换为处于第二偏振态的第一红外光部分IS1)。
被偏振分光镜120反射的处于第二偏振态的第二红外光部分IS2被设置为穿过第二相位延迟片124,被第二凹面反射镜122反射,并再次穿过第二相位延迟片124,在第二相位延迟片124和第二凹面反射镜122的共同作用下,第二红外光部分由第二偏振态转换为第一偏振态(即,处于第二偏振态的第二红外光部分IS2转换为处于第一偏振态的第二红外光部分IP2)。
需要说明的是,关于第一偏振态和第二偏振态之间的转换可以参考上述实施例中关于微显示器发射的光线的描述,这里不再赘述。
例如,偏振分光镜120还被配置为反射处于第二偏振态的第一红外光部分IS1和透射处于第一偏振态的第二红外光部分IP2,使得第一红外光部分和第二红外光部分均被耦合至光波导元件13的入射面130。
例如,光波导元件13被配置为经过半反半透面阵列16将耦合后的第一红外光部分和第二红外光部分传输至眼睛100。例如,半反半透面阵列16还可以被配置为将从光波导元件13的入射面130进入的、入射至半反半透面阵列16上的红外光传输至眼睛100。需要说明的是,关于半反半透面阵列16工作原理的描述可以参考上述实施例中的相应描述,这里不再赘述。
在本公开的实施例中,将传输至眼睛100的红外光并经过眼睛100反射后的光称为反射红外光,下面描述反射红外光经过光波导元件13和增强现实元件12传输至红外线探测器15的过程。
例如,半反半透面阵列16还可以被配置为将被眼睛100反射的反射红 外光传输至光波导元件13的入射面130,从而使得反射红外光进入增强现实元件12。
例如,如图4所示,光波导元件13还被配置为经由半反半透面阵列16,将眼睛100反射的反射红外光传输至偏振分光镜120,反射红外光包括处于第二偏振态的第一反射红外光部分RS1和处于第一偏振态的第二反射红外光部分RP2。
在本公开中,为了描述的方便,处于第一偏振态的第一反射红外光部分被标识为“RP1”,处于第二偏振态的第一反射红外光部分被标识为“RS1”,处于第二偏振态的第二反射红外光部分被标识为“RS2”,处于第一偏振态的第二反射红外光部分被标识为“RP2”。
例如,偏振分光镜120还被配置为反射处于第二偏振态的第一反射红外光部分RS1,并透射处于第一偏振态的第二反射红外光部分RP2。
例如,如图4所示,被偏振分光镜120反射的处于第二偏振态的第一反射红外光部分RS1被设置为穿过第一相位延迟片123,被第一凹面反射镜121反射,并再次穿过第一相位延迟片123。在第一相位延迟片123和第一凹面反射镜121的共同作用下,第一反射红外光部分由第二偏振态转换为第一偏振态,即处于第二偏振态的第一反射红外光部分RS1转换为处于第一偏振态的第一反射红外光部分RP1。
被偏振分光镜120透射的处于第一偏振态的第二反射红外光部分RP2被设置为穿过第二相位延迟片124,被第二凹面反射镜122反射,并再次穿过第二相位延迟片124。在第二相位延迟片124和第二凹面反射镜122的共同作用下,第二反射红外光部分由第一偏振态转换为第二偏振态,即处于第一偏振态的第二反射红外光部分RP2转换为处于第二偏振态的第二反射红外光部分RS2。
例如,偏振分光镜120还被配置为透射处于第一偏振态的第一反射红外光部分RP1和反射处于第二偏振态的第二反射红外光部分RS2,使得第一反射红外光部分和第二反射红外光部分均通过偏振分光镜120而传输至红外线探测器15。
需要说明的是,在本公开的实施例中,对红外线探测器15的设置个数不作限定,例如如图3和图4所示,可以设置两个红外线探测器15,在一些实施例中,还可以仅设置一个红外线探测器15,又例如还可以设置三个或更 多个红外线探测器。另外,在图3和图4中,为了示意清楚,没有标识出微显示器11所发射的光线的传输路径,关于微显示器11所发射的光线的传输路径可以参考图2中所示。以下各实施例与此相同,不再赘述。
例如,如图3和图4所示,第一相位延迟片123和第一凹面反射镜121均设置于增强现实元件12的第一侧S1。第二相位延迟片124和第二凹面反射镜122均设置于增强现实元件12的、与第一侧S1相邻的第三侧S3的第一端S31,第三侧S3的第一端S31靠近第一侧S1。光波导元件13的入射面130设置于增强现实元件12的、与第一侧S1相邻的第四侧S4的第一端S41,第四侧S4的第一端S41靠近第一侧S1并与第三侧S3的第一端S31相对。
例如,在本公开的一个实施例中,如图3和图4所示,可以将微显示器11、红外线发射器14和红外线探测器15均设置于增强现实元件12的、与第一侧S1相对的第二侧S2。
需要说明的是,由于图3和图4是俯视图,微显示器11被红外线发射器14和红外线探测器15部分遮挡。图5示出了图4中增强现实元件12的第二侧S2的侧视图。需要说明的是,图5仅是一种微显示器11和红外线发射器14、红外线探测器15的位置关系的一种示例,还可以采用其他位置关系,本公开的实施例对此不作限定。
例如,在本公开的一个实施例中,如图6所示,增强现实元件12还可以包括第一红外分光镜125。微显示器11设置于增强现实元件12的、与第一侧S1相对的第二侧S2。红外线发射器14和红外线探测器15均设置于增强现实元件12的第三侧S3的第二端S32,第三侧S3的第二端S32靠近第二侧S2。
例如,第一红外分光镜125被配置为将来自红外线发射器14的红外光反射至偏振分光镜120,并将微显示器11发射的光线透射至偏振分光镜120。即第一红外分光镜125的作用是反射红外光而透射其它波长的光,对微显示器11发出的光线不会造成影响。
例如,本公开的实施例提供的增强现实装置可以具体实现为眼镜。例如,如图6所示,可以实现为单目的形式,例如可以佩戴于用户的左眼或右眼。又例如,如图7所示,还可以实现为双目的形式。本公开的实施例对此不作限定,以下各实施例与此相同,不再赘述。
例如,如图8所示,红外线发射器14和红外线探测器15还可以均设置 于增强现实元件12的第四侧S4的第二端S42,第四侧S4的第二端S42靠近第二侧S2。同样地,和图6所示的实施例一致,微显示器11设置于增强现实元件12的、与第一侧S1相对的第二侧S2;第一红外分光镜125被配置为将来自红外线发射器14的红外光反射至偏振分光镜120,并将微显示器11发射的光线透射至偏振分光镜120。需要注意的是,此时第一红外分光镜125的设置位置要相应的改变,以使得可以将红外线发射器14发射的红外光反射至偏振分光镜120。
例如,在本公开的一个实施例中,如图9所示,增强现实元件12还可以包括第二红外分光镜126。红外线发射器14和红外线探测器15均设置于增强现实元件12的、与第一侧S1相对的第二侧S2。微显示器11设置于增强现实元件12的第三侧S3的第二端S32,第三侧S3的第二端S32靠近第二侧S2。
例如,第二红外分光镜126被配置为将来自红外线发射器14的红外光透射至偏振分光镜120,并将微显示器11发射的光线反射至偏振分光镜120。即第二红外分光镜126的作用是反射由微显示器11发出的可见光范围内的光线,并透射红外线发射器14发出的红外光。
例如,如图10所示,微显示器11还可以设置于增强现实元件12的第四侧S4的第二端S42,第四侧S4的第二端S42靠近第二侧S2并与第三侧S3的第二端S32相对。同样地,和图9所示的实施例一致,红外线发射器14和红外线探测器15均设置于增强现实元件12的、与第一侧S1相对的第二侧S2;第二红外分光镜126被配置为将来自红外线发射器14的红外光透射至偏振分光镜120,并将微显示器11发射的光线反射至偏振分光镜120。需要注意的是,此时第二红外分光镜126的设置位置要相应的改变,以使得可以将微显示器11发射的光线反射至偏振分光镜120。
在本公开的实施例提供的增强现实装置中,通过设置红外线发射器14和红外线探测器15,可以将红外线发射器14发射的红外光经增强现实元件12和光波导元件13传输至人眼,经过人眼反射后的反射红外光再经过与原来的光路相反的路径传输至红外线探测器15,红外线探测器15可以探测反射红外光的强度。例如,人眼在睁开和闭合时对红外光的反射率不同,即人眼的眼球和眼睑对红外光的反射率不同,所以通过红外线探测器15所探测到的反射红外光的强度可以判断人眼眼睑闭合的时间或频率,从而可以判断 例如佩戴该增强现实装置的用户是否处于疲劳状态。在判断用户处于疲劳状态时,可以进一步地采取一定的措施来使得用户保持清醒,从而提高安全性。
例如,该增强现实装置可以实现为眼镜(单目或双目),用户可以在驾驶车辆时佩戴该增强现实装置,从而可以实现辅助驾驶,提高安全性。另外,该增强现实装置设计轻巧、方便佩戴,可以很好的兼容各种车型,不需要对车辆进行改造,可以满足车辆通用性的需求。
又例如,当用户在进行高危、需要一直保持清醒状态的工作时,也可以佩戴该增强现实装置,对用户是否处于疲劳状态进行监控。
在本公开的实施例中,红外线发射器14所发射的红外光的波长例如可以大于等于4微米且小于等于13微米,以避免对人眼的伤害。
本公开的一个实施例还提供一种增强现实系统,如图11所示,包括增强现实装置10和控制器20。需要说明的是,该增强现实装置10为本公开的实施例提供的包括红外线发射器14和红外线探测器15的增强现实装置,例如可以采用图3、4、6-10所示的增强现实装置。
例如,控制器20被配置为:根据红外线探测器15探测的反射红外光的强度,判断与增强现实装置10相关联的用户是否处于疲劳状态;在判断用户处于疲劳状态时,生成控制信号;以及根据控制信号提供提示信息。
例如,可以将增强现实装置10中的红外线探测器15探测的反射红外光强度的数据发送至控制器20,控制器20可以对接收到的反射红外光强度的数据进行分析,以判断与增强现实装置10相关联的用户是否处于疲劳状态,例如与增强现实装置10相关联的用户是佩戴该增强现实装置10的用户。控制器20在判断用户处于疲劳状态时,可以生成控制信号,例如根据该控制信号可以向该用户提供提示信息,以提高安全性。
在本公开的实施例提供的增强现实系统中,控制器20可以与增强现实装置10设置在同一个物体上,例如当增强现实装置10设置在眼镜上时,控制器20也可以设置在该眼镜上(例如眼镜的镜腿中)。又例如控制器20也可以设置在相对于增强现实装置10的远端,例如设置在服务器上,增强现实装置10可以通过网络与服务器进行通信,例如控制器20可以通过网络将判断结果发送至增强现实装置10。
需要说明的是,本公开的实施例中所描述的网络可以为各种类型的通信网络,包括但不限于局域网、广域网以及Internet互联网等;可以实现为以 太网(Ethernet)、令牌网(Token Ring)、FDDI网、异步传输模式网(ATM)等;可以包括但不限于3G/4G/5G移动通信网络、无线局域网WIFI、有线通信网络等。相应地,服务器可以实现为多种形式,包括架设于局域网内的服务器、架设于广域网内的服务器或者在架设于互联网内的服务器,例如云端服务器,此时该云服务可以由公有云供应商(典型地例如亚马逊、阿里云等)提供,或者可以由私有云方式提供。
例如,控制器20可以被配置为根据用户的眼睑闭合时间或闭合频率判断用户是否处于疲劳状态。例如,人眼在睁开和闭合时对红外光的反射率不同,即人眼的眼球和眼睑对红外光的反射率不同,所以控制器20可以通过对接收到的反射红外光强度的数据进行处理以得到用户的眼睑闭合时间或闭合频率,从而可以判断该用户是否处于疲劳状态。
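A sketch of how closure time and closure frequency might be derived from the reflected-intensity stream. The threshold value, and the assumption that the closed eyelid returns the stronger infrared signal, are illustrative; the disclosure only states that the two reflectances differ:

```python
# Illustrative eyelid metrics from a stream of reflected-IR intensity samples.
CLOSED_THRESHOLD = 0.6  # assumed scale and direction, for demonstration only

def closure_fraction(samples):
    """Fraction of samples with the eyelid judged closed (a PERCLOS-style figure)."""
    return sum(1 for s in samples if s >= CLOSED_THRESHOLD) / len(samples)

def blink_count(samples):
    """Number of open-to-closed transitions (closure frequency over the window)."""
    count, prev_closed = 0, False
    for s in samples:
        now_closed = s >= CLOSED_THRESHOLD
        if now_closed and not prev_closed:
            count += 1
        prev_closed = now_closed
    return count
```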
例如,在本公开的一个实施例中,如图12所示,增强现实系统1还可以包括传感器30。例如,传感器30被配置为采集用户的生理参数,例如传感器30包括血压传感器和脉搏传感器中的至少一种。
例如,在一个示例中传感器30可以为血压传感器,从而可以采集用户的血压;例如,在另一个示例中传感器30可以为脉搏传感器,从而可以采集用户的脉搏;又例如,在又一个示例中传感器30可以包括血压传感器和脉搏传感器,从而可以同时采集用户的血压和脉搏。需要说明的是,本公开的实施例对传感器30的类型不作限定。
例如,如图13所示,本公开的实施例提供的增强现实系统1可以具体实现为眼镜,即可以将增强现实系统1设置在眼镜上,在这种情形下可以将传感器30设置在镜腿内侧,并且当用户佩戴该眼镜时,传感器30可以靠近用户的太阳穴。需要说明的是,在图13中只是示意性的示出增强现实系统1,增强现实系统1可以设置在眼镜的一侧(单目形式),也可以设置在眼镜的两侧(双目形式),本公开的实施例对此不作限定。
例如,如图12所示,在增强现实系统1包括传感器30时,控制器20还可以被配置为:根据生理参数判断用户是否处于异常状态;在判断用户处于异常状态时,生成控制信号;以及根据控制信号提供提示信息。
例如,可以将传感器30采集的用户的生理参数发送至控制器20,控制器20可以对接收到的生理参数进行分析,以判断用户是否处于异常状态。需要说明的是,在本公开的实施例中,异常状态例如可以表示用户的生理参 数处于异常的状态。例如,可以预先设置一个生理参数的正常的阈值范围,当控制器20接收到的生理参数处于该阈值范围内时,即可以判定用户处于正常状态,否则可以判定用户处于异常状态。例如该生理参数可以为血压或脉搏。控制器20在判断用户处于异常状态时,可以生成控制信号,例如根据该控制信号可以向该用户提供提示信息,以提高安全性。
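The threshold-range check described above can be sketched as follows; the parameter names and numeric limits are hypothetical placeholders, not clinical values from the disclosure:

```python
# Hypothetical normal ranges for the abnormal-state check; illustrative only.
NORMAL_RANGES = {
    "systolic_bp": (90.0, 140.0),   # mmHg (assumed)
    "pulse": (50.0, 100.0),         # beats per minute (assumed)
}

def is_abnormal(readings):
    """readings: mapping of parameter name -> measured value.
    A user is judged abnormal if any reading falls outside its normal range."""
    return any(not (lo <= readings[name] <= hi)
               for name, (lo, hi) in NORMAL_RANGES.items()
               if name in readings)
```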
例如,在本公开的一个实施例中,如图14所示,增强现实系统1还可以包括电脉冲发生器40和表面电极41。电脉冲发生器40被配置为响应于控制信号而产生电脉冲信号,并将电脉冲信号传输至表面电极41。
继续回到图13所示的眼镜,例如,可以将表面电极41设置在眼镜的鼻翼附近,图13中在眼镜两侧的鼻翼附近均设置了表面电极41,本公开对此不作限定,例如还可以仅在眼镜一侧的鼻翼附近设置表面电极41。
例如,当增强现实系统1实现为眼镜时,用户可以在驾驶车辆或进行其他活动(例如高危工作时)时佩戴该眼镜,当控制器20判断用户处于疲劳状态或异常状态时,可以生成控制信号,电脉冲发生器40可以响应于该控制信号而产生电脉冲信号,并将该电脉冲信号传输至表面电极41。由于表面电极41与用户的鼻子直接接触,电脉冲信号可以传导至用户以对用户进行电刺激,从而提醒用户注意安全。
例如,在本公开的一个实施例中,如图15所示,增强现实系统1还可以包括定位装置50。例如,定位装置50被配置为获取用户的位置信息。例如,定位装置50可以采用GPS或北斗定位系统,本公开的实施例对此不作限定。
在本公开的实施例提供的增强现实系统中,通过设置定位装置,可以获取用户的实时位置信息,增强现实系统可以根据该位置信息做进一步的处理,例如根据该位置信息提供相关提示信息给用户,或者可以将该位置信息发送至与用户相关的第三方。
例如,在本公开的一个实施例中,如图15所示,增强现实系统1还可以包括语音生成装置60。例如,语音生成装置60被配置为响应于控制信号而播放提示信息,提示信息包括预设语音信息、根据用户的位置信息生成的附近的医院或休息场所的位置信息中的至少一种。
例如,当用户使用(例如佩戴)该增强现实系统1时,当控制器20判断用户处于疲劳状态或异常状态时,可以生成控制信号并发送至语音生成装 置60,语音生成装置60可以响应于该控制信号而播放提示信息。例如,语音生成装置60可以包括扬声器,当增强现实系统1例如采用图13所示的眼镜结构时,该扬声器可以设置于眼镜上,但对其具体设置位置不作限定,例如可以设置在镜腿上。
例如,在一个示例中,语音生成装置60播放的提示信息可以是预设语音信息,例如该预设语音信息可以预先存储在控制器20中,当控制器20判断用户处于疲劳状态或异常状态时,控制器20可以将存储的该预设语音信息发送至语音生成装置60进行播放。又例如该预设语音信息也可以预先存储在语音生成装置60中,当语音生成装置60接收到控制器20的控制信号时直接播放该预设语音信息。在这种情形下,语音生成装置60需要包括存储介质。
例如,该预设语音信息可以为“请前往附近的休息场所进行休息”或者“请前往附近的医院进行检查”等,本公开的实施例对预设语音信息的内容不作限定。
又例如,在另一个示例中,语音生成装置60播放的提示信息可以是根据用户的位置信息生成的附近的医院或休息场所的位置信息中的至少一种。例如,控制器20可以根据从定位装置50获取的用户的位置信息生成附近的医院或休息场所的位置信息,并转换成语音发送至语音生成装置60,由语音生成装置60进行播放。或者控制器20可以直接将附近的医院或休息场所的位置信息发送至语音生成装置60,由语音生成装置60转换成语音后再进行播放。
在本公开的实施例提供的增强现实系统中,通过设置语音生成装置,可以在用户处于疲劳状态或异常状态时播放提示信息,以对用户进行提醒,提高安全性。
例如,在本公开的一个实施例中,如图15所示,增强现实系统1还可以包括图像渲染装置70。例如,图像渲染装置70被配置为响应于控制信号而渲染出提示信息对应的图像,微显示器被配置为发射包括提示信息的光线,提示信息包括预设图像信息、根据用户的位置信息生成的附近的医院或休息场所的导航信息中的至少一种。
例如,当用户使用(例如佩戴)该增强现实系统1时,当控制器20判断用户处于疲劳状态或异常状态时,可以生成控制信号并发送至图像渲染装 置70,图像渲染装置70可以响应于该控制信号而渲染出提示信息对应的图像。控制器20可以将图像渲染装置70渲染出的图像发送至增强现实装置10中的微显示器,微显示器可以发射包括该提示信息的光线,从而实现对用户的提醒。
例如,在本公开的实施例中,图像渲染装置70可以单独设置,也可以和控制器20一体设置,即在硬件上可以封装在一起,本公开的实施例对此不作限定。
例如,在一个示例中,图像渲染装置70渲染的提示信息可以是预设图像信息,例如该预设图像信息可以预先存储在控制器20中,当控制器20判断用户处于疲劳状态或异常状态时,控制器20可以将存储的该预设图像信息发送至图像渲染装置70进行渲染。又例如该预设图像信息也可以预先存储在图像渲染装置70中,当图像渲染装置70接收到控制器20的控制信号时直接对该预设图像信息进行渲染。在这种情形下,图像渲染装置70需要包括存储介质。
例如,该预设图像信息可以是一种具有提醒或警示功能的图像,例如为一副显示为红色惊叹号“!”的图像,又例如可以为一副显示为红色字样“请注意安全”的图像,本公开的实施例对预设图像信息的内容不作限定。
又例如,在另一个示例中,图像渲染装置70所渲染的提示信息可以是根据用户的位置信息生成的附近的医院或休息场所的导航信息中的至少一种。例如,控制器20可以根据从定位装置50获取的用户的位置信息生成附近的医院或休息场所的导航信息,并发送至图像渲染装置70,由图像渲染装置70将该导航信息渲染成对应的图像。控制器20可以将图像渲染装置70渲染出的图像发送至增强现实装置10中的微显示器,微显示器可以发射包括该导航信息的光线,从而实现对用户的导航提示。
在本公开的实施例提供的增强现实系统中,通过设置图像渲染装置,可以在用户处于疲劳状态或异常状态时通过增强现实装置显示提示信息,以对用户进行提醒或者进行导航提示,提高安全性和便利性。
例如,在本公开的一个实施例中,如图15所示,增强现实系统1还可以包括通信装置80,通信装置80被配置为响应于控制信号与预设联系人进行通信。
例如,当用户使用(例如佩戴)该增强现实系统1时,当控制器20判 断用户处于疲劳状态或异常状态时,可以生成控制信号,通信装置80可以响应于该控制信号而与预设联系人进行通信。例如,可以通过发送信息或拨打电话的方式与预设联系人进行通信,例如可以将用户的生理参数数据发送至预设联系人。例如该预设联系人可以预先存储在控制器20或者通信装置80中,该预设联系人可以为用户的监护人、医生等。
例如,通信装置80可以通过网络与预设联系人进行通信,关于网络的描述可以参考上述实施例中相应描述,这里不再赘述。
在本公开的实施例提供的增强现实系统中,通过设置通信装置,可以在用户处于疲劳状态或异常状态时与预设联系人进行通信,预设联系人可以采取进一步的措施来保证用户的安全,从而可以对用户进行全方位的保护,可以提高安全性。
需要说明的是,本公开的实施例中提供的增强现实系统1中的控制器20和图像渲染装置70可以实现为包括专用集成电路、硬件(电路)、固件或其他任意组合方式,以实现期望的功能,例如可以具体实现为数字信号处理器等。
或者,控制器20和图像渲染装置70也可以实现为包括处理器和存储介质,该存储介质配置为存储有可适于处理器执行的计算机指令,且计算机指令被处理器执行时可以实现各自期望的功能。本公开的实施例对此不作限定。
在本公开的实施例中,处理器可以由通用集成电路芯片或专用集成电路芯片实现,例如该集成电路芯片可以设置在一个主板上,例如在该主板上还可以设置有存储器以及电源电路等;此外,处理器也可以由电路或者采用软件、硬件(电路)、固件或其任意组合方式实现。在本公开的实施例中,处理器可以包括各种计算结构,例如复杂指令集计算机(CISC)结构、精简指令集计算机(RISC)结构或者一种实行多种指令集组合的结构。在一些实施例中,处理器也可以是微处理器,例如X86处理器或ARM处理器,或者可以是数字处理器(DSP)等。
在本公开的实施例中,存储介质例如可以设置在上述主板上,存储介质可以保存处理器执行的指令和/或数据。例如,存储介质可以包括一个或多个计算机程序产品,所述计算机程序产品可以包括各种形式的计算机可读存储器,例如易失性存储器和/或非易失性存储器。所述易失性存储器例如可以包 括随机存取存储器(RAM)和/或高速缓冲存储器(cache)等。所述非易失性存储器例如可以包括只读存储器(ROM)、磁盘、光盘、半导体存储器(例如闪存等)等。在所述计算机可读存储器上可以存储一个或多个计算机程序指令,处理器可以运行所述程序指令,以实现本公开实施例中(由处理器实现)期望的功能。
本公开的一个实施例还提供一种增强现实系统的信息提示方法,如图16所示,该信息提示方法可以包括如下操作。
步骤S100:发射红外光至眼睛;
步骤S200:根据眼睛返回的反射红外光的强度,判断与增强现实系统相关联的用户是否处于疲劳状态;
步骤S300:在判断用户处于疲劳状态时,生成控制信号;以及
步骤S400:根据控制信号提供提示信息。
关于该信息提示方法的详细描述及其技术效果可以参考上述实施例中的相应描述,这里不再赘述。
以上,仅为本公开的具体实施方式,但本公开的保护范围并不局限于此,本公开的保护范围应以权利要求的保护范围为准。

Claims (21)

  1. 一种增强现实装置,包括微显示器和增强现实元件,其中,
    所述微显示器被配置为发射携带显示内容的光线,所述光线包括处于第一偏振态的第一光线部分和处于第二偏振态的第二光线部分;以及
    所述增强现实元件被配置为将所述第一光线部分由所述第一偏振态转换为所述第二偏振态,将所述第二光线部分由所述第二偏振态转换为所述第一偏振态,并将处于所述第二偏振态的所述第一光线部分和处于所述第一偏振态的所述第二光线部分耦合输出;
    其中,所述第一偏振态与所述第二偏振态正交。
  2. 根据权利要求1所述的增强现实装置,还包括光波导元件,所述光波导元件被配置为接收并传输所述增强现实元件耦合输出的所述第一光线部分和所述第二光线部分;
    其中,所述增强现实元件包括偏振分光镜、第一凹面反射镜、第二凹面反射镜、第一相位延迟片和第二相位延迟片;
    所述偏振分光镜被配置为接收来自所述微显示器的所述光线,透射处于所述第一偏振态的所述第一光线部分,并反射处于所述第二偏振态的所述第二光线部分;
    被所述偏振分光镜透射的所述第一光线部分被设置为穿过所述第一相位延迟片,被所述第一凹面反射镜反射,并再次穿过所述第一相位延迟片,其中,在所述第一相位延迟片和所述第一凹面反射镜共同作用下,所述第一光线部分由所述第一偏振态转换为所述第二偏振态;
    被所述偏振分光镜反射的所述第二光线部分被设置为穿过所述第二相位延迟片,被所述第二凹面反射镜反射,并再次穿过所述第二相位延迟片,其中,在所述第二相位延迟片和所述第二凹面反射镜共同作用下,所述第二光线部分由所述第二偏振态转换为所述第一偏振态;以及
    所述偏振分光镜还被配置为反射处于所述第二偏振态的所述第一光线部分和透射处于所述第一偏振态的所述第二光线部分,使得所述第一光线部分和所述第二光线部分均被耦合至所述光波导元件的入射面。
  3. 根据权利要求2所述的增强现实装置,还包括红外线发射器和红外线探测器;其中,
    所述红外线发射器被配置为发射红外光至所述增强现实元件;
    所述增强现实元件被配置为将所述红外光耦合至所述光波导元件的入射面;
    所述光波导元件被配置为经过半反半透面阵列将所述红外光传输至眼睛;
    所述光波导元件和所述增强现实元件还被配置为将所述眼睛反射的反射红外光沿着与所述红外光相反的传输路径传输至所述红外线探测器;以及
    所述红外线探测器被配置为探测所述反射红外光。
  4. 根据权利要求3所述的增强现实装置,其中,
    所述红外线发射器发射的所述红外光包括处于所述第一偏振态的第一红外光部分和处于所述第二偏振态的第二红外光部分;
    所述偏振分光镜还被配置为透射处于所述第一偏振态的所述第一红外光部分,并反射处于所述第二偏振态的所述第二红外光部分;
    被所述偏振分光镜透射的所述第一红外光部分被设置为穿过所述第一相位延迟片,被所述第一凹面反射镜反射,并再次穿过所述第一相位延迟片,其中,在所述第一相位延迟片和所述第一凹面反射镜共同作用下,所述第一红外光部分由所述第一偏振态转换为所述第二偏振态;
    被所述偏振分光镜反射的所述第二红外光部分被设置为穿过所述第二相位延迟片,被所述第二凹面反射镜反射,并再次穿过所述第二相位延迟片,其中,在所述第二相位延迟片和所述第二凹面反射镜共同作用下,所述第二红外光部分由所述第二偏振态转换为所述第一偏振态;
    所述偏振分光镜还被配置为反射处于所述第二偏振态的所述第一红外光部分和透射处于所述第一偏振态的所述第二红外光部分,使得所述第一红外光部分和所述第二红外光部分均被耦合至所述光波导元件的入射面;以及
    所述光波导元件被配置为经过所述半反半透面阵列将耦合后的所述第一红外光部分和所述第二红外光部分传输至所述眼睛。
  5. 根据权利要求4所述的增强现实装置,其中,
    所述光波导元件还被配置为经由所述半反半透面阵列,将所述眼睛反射的反射红外光传输至所述偏振分光镜,所述反射红外光包括处于所述第 二偏振态的第一反射红外光部分和处于所述第一偏振态的第二反射红外光部分;
    所述偏振分光镜还被配置为反射处于所述第二偏振态的所述第一反射红外光部分,并透射处于所述第一偏振态的所述第二反射红外光部分;
    被所述偏振分光镜反射的处于所述第二偏振态的所述第一反射红外光部分被设置为穿过所述第一相位延迟片,被所述第一凹面反射镜反射,并再次穿过所述第一相位延迟片,其中,在所述第一相位延迟片和所述第一凹面反射镜共同作用下,所述第一反射红外光部分由所述第二偏振态转换为所述第一偏振态;
    被所述偏振分光镜透射的处于所述第一偏振态的所述第二反射红外光部分被设置为穿过所述第二相位延迟片,被所述第二凹面反射镜反射,并再次穿过所述第二相位延迟片,其中,在所述第二相位延迟片和所述第二凹面反射镜共同作用下,所述第二反射红外光部分由所述第一偏振态转换为所述第二偏振态;以及
    所述偏振分光镜还被配置为透射处于所述第一偏振态的所述第一反射红外光部分和反射处于所述第二偏振态的所述第二反射红外光部分,使得所述第一反射红外光部分和所述第二反射红外光部分均通过所述偏振分光镜而传输至所述红外线探测器。
  6. 根据权利要求3-5任一项所述的增强现实装置,其中,
    所述第一相位延迟片和所述第一凹面反射镜均设置于所述增强现实元件的第一侧;
    所述第二相位延迟片和所述第二凹面反射镜均设置于所述增强现实元件的、与所述第一侧相邻的第三侧的第一端,所述第三侧的第一端靠近所述第一侧;以及
    所述光波导元件的入射面设置于所述增强现实元件的、与所述第一侧相邻的第四侧的第一端,所述第四侧的第一端靠近所述第一侧并与所述第三侧的第一端相对。
  7. 根据权利要求6所述的增强现实装置,其中,
    所述微显示器、所述红外线发射器和所述红外线探测器设置于所述增强现实元件的、与所述第一侧相对的第二侧。
  8. 根据权利要求6所述的增强现实装置,其中,所述增强现实元件 还包括第一红外分光镜;
    所述微显示器设置于所述增强现实元件的、与所述第一侧相对的第二侧;
    所述红外线发射器和所述红外线探测器均设置于所述增强现实元件的第三侧的第二端,所述第三侧的第二端靠近所述第二侧;或者,所述红外线发射器和所述红外线探测器均设置于所述增强现实元件的第四侧的第二端,所述第四侧的第二端靠近所述第二侧;以及
    所述第一红外分光镜被配置为将来自所述红外线发射器的所述红外光反射至所述偏振分光镜,并将所述微显示器发射的所述光线透射至所述偏振分光镜。
  9. 根据权利要求6所述的增强现实装置,其中,所述增强现实元件还包括第二红外分光镜;
    所述红外线发射器和所述红外线探测器均设置于所述增强现实元件的、与所述第一侧相对的第二侧;
    所述微显示器设置于所述增强现实元件的第三侧的第二端,所述第三侧的第二端靠近所述第二侧;或者,所述微显示器设置于所述增强现实元件的第四侧的第二端,所述第四侧的第二端靠近所述第二侧并与所述第三侧的第二端相对;以及
    所述第二红外分光镜被配置为将来自所述红外线发射器的所述红外光透射至所述偏振分光镜,并将所述微显示器发射的所述光线反射至所述偏振分光镜。
  10. 根据权利要求3-9任一项所述的增强现实装置,其中,
    所述半反半透面阵列设置在所述光波导元件中并包括呈阵列排布的多个半反半透镜;以及
    所述半反半透面阵列被配置为:
    将从所述光波导元件的入射面进入的、入射至所述半反半透面阵列上的所述光线和所述红外光传输至所述眼睛;以及
    将被所述眼睛反射的反射红外光传输至所述光波导元件的入射面,从而使得所述反射红外光进入所述增强现实元件。
  11. 根据权利要求2-10任一项所述的增强现实装置,其中,所述第一偏振态为p偏振态,所述第二偏振态为s偏振态,所述第一相位延迟片和 所述第二相位延迟片为四分之一波长相位延迟片。
  12. 一种增强现实系统,包括如权利要求3-11任一项所述的增强现实装置和控制器,其中,
    所述控制器被配置为:
    根据所述红外线探测器探测的反射红外光的强度,判断与所述增强现实装置相关联的用户是否处于疲劳状态;
    在判断所述用户处于疲劳状态时,生成控制信号;以及
    根据所述控制信号提供提示信息。
  13. 根据权利要求12所述的增强现实系统,其中,所述控制器被配置为根据所述用户的眼睑闭合时间或闭合频率判断所述用户是否处于疲劳状态。
  14. 根据权利要求12或13所述的增强现实系统,还包括传感器,其中,所述传感器被配置为采集所述用户的生理参数,所述传感器包括血压传感器和脉搏传感器中的至少一种。
  15. 根据权利要求14所述的增强现实系统,其中,所述控制器还被配置为:
    根据所述生理参数判断所述用户是否处于异常状态;
    在判断所述用户处于异常状态时,生成所述控制信号;以及
    根据所述控制信号提供所述提示信息。
  16. 根据权利要求12-15任一项所述的增强现实系统,还包括电脉冲发生器和表面电极;其中,
    所述电脉冲发生器被配置为响应于所述控制信号而产生电脉冲信号,并将所述电脉冲信号传输至所述表面电极。
  17. 根据权利要求12-16任一项所述的增强现实系统,还包括定位装置,其中,所述定位装置被配置为获取所述用户的位置信息。
  18. 根据权利要求12-17任一项所述的增强现实系统,还包括语音生成装置,其中,所述语音生成装置被配置为响应于所述控制信号而播放所述提示信息。
  19. 根据权利要求12-18任一项所述的增强现实系统,还包括图像渲染装置,其中,所述图像渲染装置被配置为响应于所述控制信号而渲染出所述提示信息对应的图像,所述微显示器被配置为发射包括所述提示信息 的所述光线。
  20. 根据权利要求12-19任一项所述的增强现实系统,还包括通信装置,其中,所述通信装置被配置为响应于所述控制信号与预设联系人进行通信。
  21. 一种用于如权利要求12-20任一项所述的增强现实系统的信息提示方法,包括:
    发射红外光至眼睛;
    根据所述眼睛返回的反射红外光的强度,判断与所述增强现实系统相关联的用户是否处于疲劳状态;
    在判断所述用户处于疲劳状态时,生成控制信号;以及
    根据所述控制信号提供提示信息。
PCT/CN2018/107323 2018-01-29 2018-09-25 增强现实装置、增强现实系统及其信息提示方法 WO2019144634A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18857419.8A EP3748416B1 (en) 2018-01-29 2018-09-25 Augmented reality device, augmented reality system and information prompting method therefor
US16/337,536 US11460699B2 (en) 2018-01-29 2018-09-25 Augmented reality device, augmented reality system and information prompt method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810084344.3A CN110095866B (zh) 2018-01-29 2018-01-29 增强现实装置、增强现实系统及其信息提示方法
CN201810084344.3 2018-01-29

Publications (1)

Publication Number Publication Date
WO2019144634A1 true WO2019144634A1 (zh) 2019-08-01

Family

ID=67394861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/107323 WO2019144634A1 (zh) 2018-01-29 2018-09-25 增强现实装置、增强现实系统及其信息提示方法

Country Status (4)

Country Link
US (1) US11460699B2 (zh)
EP (1) EP3748416B1 (zh)
CN (1) CN110095866B (zh)
WO (1) WO2019144634A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022069403A1 (en) * 2020-09-30 2022-04-07 tooz technologies GmbH Human vital signs measuring by smart glasses

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
JP7456583B2 (ja) * 2018-09-28 2024-03-27 ライト フィールド ラボ、インコーポレイテッド ライトフィールドディスプレイ用ホログラフィック対象物中継部
US11592684B2 (en) * 2018-11-14 2023-02-28 BRELYON Inc System and method for generating compact light-field displays through varying optical depths
JP2022554404A (ja) 2019-11-12 2022-12-28 ライト フィールド ラボ、インコーポレイテッド 中継システム
US11726326B1 (en) * 2020-06-11 2023-08-15 Meta Platforms Technologies, Llc Wedge light guide
CN112099239B (zh) * 2020-11-19 2021-06-08 北京亮亮视野科技有限公司 一种紧凑型波导显示光学系统及ar眼镜
US20220334392A1 (en) * 2021-04-16 2022-10-20 Nvidia Corporation Holographic virtual reality display
CN115144952B (zh) * 2022-09-06 2022-12-02 北京灵犀微光科技有限公司 光波导器件及近眼显示装置
CN117064343B (zh) * 2023-10-11 2023-12-19 汉达科技发展集团有限公司 一种可检测生命体征的智能ar偏振探测数据处理方法

Citations (6)

Publication number Priority date Publication date Assignee Title
CN104395815A (zh) * 2012-05-21 2015-03-04 鲁姆斯有限公司 头戴式显示器眼球追踪器集成系统
CN105683812A (zh) * 2013-08-30 2016-06-15 英特尔公司 用于头戴式显示器的恶心和发病检测、预测和缓解
US9523852B1 (en) * 2012-03-28 2016-12-20 Rockwell Collins, Inc. Micro collimator system and method for a head up display (HUD)
CN107229119A (zh) * 2016-03-23 2017-10-03 北京三星通信技术研究有限公司 近眼显示设备及近眼显示的方法
CN107329273A (zh) * 2017-08-29 2017-11-07 京东方科技集团股份有限公司 一种近眼显示装置
CN206684389U (zh) * 2017-04-28 2017-11-28 歌尔科技有限公司 一种光学模组及增强现实眼镜

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JPH06138432A (ja) * 1992-10-26 1994-05-20 Olympus Optical Co Ltd 頭部装着式ディスプレイ装置
JP5906692B2 (ja) * 2011-11-29 2016-04-20 セイコーエプソン株式会社 表示装置
US20140347736A1 (en) 2013-05-23 2014-11-27 Omnivision Technologies, Inc. Systems And Methods For Aligning A Near-Eye Display Device
US10302944B2 (en) * 2013-06-28 2019-05-28 Seiko Epson Corporation Head-mount type display device and method of controlling head-mount type display device
KR102246310B1 (ko) * 2013-12-31 2021-04-29 아이플루언스, 인크. 시선-기반 미디어 선택 및 편집을 위한 시스템들 및 방법들
US10473933B2 (en) * 2016-02-19 2019-11-12 Microsoft Technology Licensing, Llc Waveguide pupil relay
CN109116556A (zh) * 2017-06-23 2019-01-01 芋头科技(杭州)有限公司 一种成像显示系统

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US9523852B1 (en) * 2012-03-28 2016-12-20 Rockwell Collins, Inc. Micro collimator system and method for a head up display (HUD)
CN104395815A (zh) * 2012-05-21 2015-03-04 鲁姆斯有限公司 头戴式显示器眼球追踪器集成系统
CN105683812A (zh) * 2013-08-30 2016-06-15 英特尔公司 用于头戴式显示器的恶心和发病检测、预测和缓解
CN107229119A (zh) * 2016-03-23 2017-10-03 北京三星通信技术研究有限公司 近眼显示设备及近眼显示的方法
CN206684389U (zh) * 2017-04-28 2017-11-28 歌尔科技有限公司 一种光学模组及增强现实眼镜
CN107329273A (zh) * 2017-08-29 2017-11-07 京东方科技集团股份有限公司 一种近眼显示装置

Non-Patent Citations (1)

Title
See also references of EP3748416A4


Also Published As

Publication number Publication date
CN110095866A (zh) 2019-08-06
EP3748416A1 (en) 2020-12-09
EP3748416A4 (en) 2021-11-10
US11460699B2 (en) 2022-10-04
CN110095866B (zh) 2020-07-28
US20210356742A1 (en) 2021-11-18
EP3748416B1 (en) 2022-12-28

Similar Documents

Publication Publication Date Title
WO2019144634A1 (zh) 增强现实装置、增强现实系统及其信息提示方法
US10682055B1 (en) Fluorescent imaging on a head-mountable device
JP2022031715A (ja) 視標追跡及び走査レーザ投影をウェアラブルヘッドアップディスプレイに統合するシステム、デバイス、及び方法
TWI759395B (zh) 基於經由光導光學元件之視網膜成像的眼動追蹤器
US9958680B2 (en) Near-eye display device and methods with coaxial eye imaging
KR102277893B1 (ko) 증강 현실을 위한 방법 및 시스템
US8866702B1 (en) Use of optical display system as a visual indicator for a wearable computing device
US10345903B2 (en) Feedback for optic positioning in display devices
US11693243B2 (en) Polarizing beam splitting system
JP2023518421A (ja) 網膜結像および追跡のためのシステムおよび方法
CN105008981A (zh) 用于近眼显示器的光学系统
JP2022542750A (ja) アイトラッキング撮像における迷光抑制
JP2023513869A (ja) 二重波長による眼撮像
JP2007322769A (ja) 映像表示システム
WO2022199580A1 (zh) 电子设备及其控制方法
US20170261750A1 (en) Co-Aligned Retinal Imaging And Display System
CN107765435A (zh) 头戴显示装置
US11543662B2 (en) Augmented reality device including reflective polarizer and wearable device including the same
CN112051671B (zh) 一种近眼显示光机及其方法和近眼显示设备
CN214151260U (zh) 近眼显示系统及增强现实设备
JP5467287B2 (ja) ディスプレイユニット
US11205069B1 (en) Hybrid cornea and pupil tracking
WO2022130372A1 (en) Optical systems and methods for eye tracking based on eye imaging via collimating element and light-guide optical element
CN114063286A (zh) 用于增强现实的光学系统及头戴式增强现实设备
CN219162490U (zh) 一种近眼光学显示系统以及ar显示设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18857419

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018857419

Country of ref document: EP

Effective date: 20200831