CN110673340A - Augmented reality device and control method thereof - Google Patents


Info

Publication number
CN110673340A
CN110673340A (application CN201910906702.9A)
Authority
CN
China
Prior art keywords
polarization state
virtual image
ambient light
display
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910906702.9A
Other languages
Chinese (zh)
Inventor
任红恩
姜滨
迟小羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Optical Technology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN201910906702.9A
Publication of CN110673340A
Legal status: Pending

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a control method of an augmented reality device and an augmented reality device. The method comprises the following steps: receiving ambient light in a first polarization state from the polarizer, wherein the first-polarization-state ambient light is obtained by the polarizer filtering natural light; acquiring an occlusion-corresponding position on the transparent screen, wherein the occlusion-corresponding position is the position on the transparent screen corresponding to a first portion of the first-polarization-state ambient light; controlling the transparent screen to convert the first portion of the first-polarization-state ambient light into second-polarization-state ambient light and emit it to the beam splitter, and to transmit a second portion of the first-polarization-state ambient light to the beam splitter; and controlling the display to display a virtual image and emit virtual image light in the first polarization state; wherein the first portion of the first-polarization-state ambient light would present a specific environment object to the user, and the specific environment object is intended to be occluded by a specific virtual image presented to the user by a portion of the virtual image light.

Description

Augmented reality device and control method thereof
Technical Field
The invention relates to the technical field of displays, and in particular to a control method of an augmented reality device and an augmented reality device.
Background
Currently, augmented reality (AR) technology can be implemented by means of video see-through or optical see-through to achieve the effect of superimposing a virtual world on the real environment.
When AR is implemented with video see-through, an image of the real environment is first captured by a camera; the processor then converts it into a digital image and superimposes a virtual image of the virtual world onto it during the conversion; finally, the superimposed image is shown on a display. With this approach, however, the processor's computational load is very large, so the display cannot show the superimposed image in time. In addition, the display effect depends on the camera's capture and on the processor's rendering and restoration capability: the camera inevitably introduces some distortion during capture, and further distortion arises during rendering and restoration, so the real-environment portion of the finally displayed image differs considerably from the real environment itself.
To address the above problems of video see-through, optical see-through is often employed to implement AR. This is because, with optical see-through, the real environment is seen directly by the human eye, and there is no need to convert and render an image of it. However, with optical see-through the virtual image of the virtual world is directly superimposed on the real environment the user sees, so the user can clearly tell the virtual image apart from the real environment, and the user experience is poor.
Disclosure of Invention
The invention aims to provide a new technical solution for controlling an augmented reality device.
According to a first aspect of the present invention, there is provided a method for controlling an augmented reality device, wherein the augmented reality device comprises a polarizer, a transparent screen, a beam splitter, a display, a polarization conversion reflector and a processor, and the control method comprises:
receiving ambient light in a first polarization state from the polarizer, wherein the first-polarization-state ambient light is obtained by the polarizer filtering natural light;
acquiring an occlusion-corresponding position on the transparent screen, wherein the occlusion-corresponding position is the position on the transparent screen corresponding to a first portion of the first-polarization-state ambient light;
controlling the transparent screen to convert the first portion of the first-polarization-state ambient light into second-polarization-state ambient light and emit it to the beam splitter, and to transmit a second portion of the first-polarization-state ambient light to the beam splitter; and controlling the display to display a virtual image and emit virtual image light in the first polarization state; wherein the first portion of the first-polarization-state ambient light would present a specific environment object to the user, and the specific environment object is intended to be occluded by a specific virtual image presented to the user by a portion of the virtual image light;
the beam splitter is configured to transmit the second portion of the first-polarization-state ambient light, transmit the virtual image light in the first polarization state, and reflect the second-polarization-state ambient light.
Optionally, the method further includes:
determining an occlusion region on the display;
controlling the region of the display other than the occlusion region to display a virtual image;
wherein the virtual image light that would come from the occlusion region would present a set virtual image to the user, and the set virtual image is intended to be occluded by a set environment object presented to the user, the set environment object being presented by a portion of the second portion of the first-polarization-state ambient light.
Optionally, acquiring the occlusion-corresponding position on the transparent screen includes:
acquiring an environment image containing depth information;
and acquiring the occlusion-corresponding position on the transparent screen according to the environment image.
Optionally, the method further includes:
determining the object type of the environment object according to the environment image containing the depth information;
and generating the specific virtual image according to the object type.
Optionally, the method further includes:
acquiring the head motion state information of the user in real time;
and adjusting the display position and/or the display state of the specific virtual image according to the motion state information.
According to a second aspect of the invention, there is provided an augmented reality device comprising:
the polarizer is used for filtering natural light to obtain ambient light in a first polarization state;
the transparent screen is arranged parallel to the polarizer on the same side of the human eye, and is used for transmitting a second portion of the first-polarization-state ambient light and converting a first portion of the first-polarization-state ambient light into second-polarization-state ambient light;
the beam splitter forms a first preset included angle with the transparent screen on the same side of the human eye, and is configured to transmit the second portion of the first-polarization-state ambient light from the transparent screen and to reflect second-polarization-state ambient light;
the display forms a second preset included angle with the beam splitter, and is used for displaying a virtual image and emitting virtual image light in the first polarization state;
the polarization conversion reflector is arranged on the other side of the beam splitter, so that it and the display lie on opposite sides of the beam splitter, and is used for converting the first-polarization-state virtual image light into second-polarization-state virtual image light and reflecting the second-polarization-state virtual image light to the beam splitter;
the beam splitter is further configured to transmit the first-polarization-state virtual image light from the display; and the augmented reality device further includes:
a processor, configured to acquire an occlusion-corresponding position on the transparent screen, where the occlusion-corresponding position is the position on the transparent screen corresponding to the first portion of the first-polarization-state ambient light; to control the transparent screen to convert the first portion of the first-polarization-state ambient light into second-polarization-state ambient light; and to control the display to display a virtual image and emit virtual image light in the first polarization state; wherein the first portion of the first-polarization-state ambient light would present a specific environment object to the user, and the specific environment object is intended to be occluded by a specific virtual image presented to the user by a portion of the virtual image light.
Optionally, the processor is further configured to determine an occlusion region on the display, and to control the region of the display other than the occlusion region to display a virtual image; wherein the virtual image light that would come from the occlusion region would present a set virtual image to the user, and the set virtual image is intended to be occluded by a set environment object presented to the user, the set environment object being presented by a portion of the second portion of the first-polarization-state ambient light.
Optionally, the augmented reality device further includes:
an image acquisition unit for acquiring an environment image containing depth-of-field information;
the processor is further configured to acquire the occlusion-corresponding position on the transparent screen according to the environment image.
Optionally, the processor is further configured to:
determine the object type of the specific environment object according to the environment image containing the depth information;
and generate the specific virtual image according to the object type.
Optionally, the processor is further configured to:
acquire the user's head motion state information in real time;
and adjust the display position and/or the display state of the specific virtual image according to the motion state information.
In this embodiment, the occlusion-corresponding position on the transparent screen can be acquired, and the transparent screen is then controlled to convert the first portion of the first-polarization-state ambient light at the occlusion-corresponding position into second-polarization-state ambient light and emit it to the beam splitter. The second-polarization-state ambient light does not enter the human eye; that is, the specific environment object that the first portion of the first-polarization-state ambient light would present is not visible to the user. On this basis, the display is controlled to display the virtual image and emit virtual image light in the first polarization state. The virtual image is then highly fused with the real environment objects (in this embodiment, the objects presented by the first-polarization-state ambient light), giving the user the experience that the virtual image occludes a real environment object. The user therefore does not readily distinguish the virtual image from the real environment objects, and the user experience is greatly improved. The display effect of the method provided by this embodiment can be as shown in fig. 4.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments will be briefly described below. It is appreciated that the following drawings depict only certain embodiments of the invention and are therefore not to be considered limiting of its scope. For a person skilled in the art, it is possible to derive other relevant figures from these figures without inventive effort.
Fig. 1 is a schematic flowchart of a control method for an augmented reality device according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an augmented reality device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of another augmented reality device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a display effect of an augmented reality device according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a display effect of another augmented reality device according to an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< control method of augmented reality device >
As shown in fig. 1, an embodiment of the present invention provides a method for controlling an augmented reality device; the augmented reality device is shown in fig. 2 and includes a polarizer, a transparent screen, a beam splitter, a display, a polarization conversion reflector, and a processor.
Specifically, the transparent screen and the polarizer are arranged in parallel on the same side of the human eye, with the polarizer on the side of the transparent screen away from the human eye; the beam splitter forms a first preset included angle α with the transparent screen on the same side of the human eye, with the beam splitter between the transparent screen and the human eye; the display forms a second preset included angle β with the beam splitter; and the polarization conversion reflector is arranged parallel to the display, the two being placed on opposite sides of the beam splitter.
Note that the processor is not shown in fig. 2. It will be appreciated that the processor may be located at various positions in the structure shown in fig. 2, as long as it does not block any of the light paths involved in this embodiment. The control method of the augmented reality device provided by this embodiment is implemented by the processor.
In addition, the augmented reality device may be: AR glasses, AR helmets, and the like.
Based on the augmented reality device, the method for controlling the augmented reality device provided by this embodiment includes the following steps S101 to S103:
s101, receiving first polarization state ambient light from the polarizer, wherein the first polarization state ambient light is obtained by filtering natural light through the polarizer.
In one embodiment, the polarizer may be a P-type polarizer, in which case it filters natural light to obtain ambient light in the P polarization state; the first polarization state in S101 is then the P polarization state.
In another embodiment, the polarizer may be an S-type polarizer, in which case it filters natural light to obtain ambient light in the S polarization state; the first polarization state in S101 is then the S polarization state.
It should be noted that the polarizer filters natural light without changing the real environment seen by the human eye. Thus, based on the first-polarization-state ambient light, the user can still see the real environment.
S102, acquiring an occlusion-corresponding position on the transparent screen, wherein the occlusion-corresponding position is the position on the transparent screen corresponding to the first portion of the first-polarization-state ambient light.
Here, the first portion of the first-polarization-state ambient light would present a specific environment object to the user, and the specific environment object is intended to be occluded by a specific virtual image presented to the user by a portion of the virtual image light.
It should be noted that the specific environment object is an entity in the environment.
In one embodiment, the processor may acquire the occlusion-corresponding position directly from a terminal with an augmented reality function.
In another embodiment, the occlusion-corresponding position on the transparent screen may also be obtained by the processor itself, through the following steps S1021 and S1022:
S1021, acquiring an environment image containing depth information.
In one embodiment, S1021 may be implemented as follows: an environment image is captured by an image acquisition unit (such as a binocular camera or a TOF camera), and the environment image carries corresponding depth information. It can be understood that the depth information of the environment image reflects the distance from the transparent screen to the entities captured in the environment image. Note that the environment image and its corresponding depth-of-field information are acquired along the direction of the user's line of sight.
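As an aside, the way a binocular camera recovers this depth information can be sketched with the standard stereo relation Z = f·B/d; the focal length, baseline, and disparity values below are illustrative assumptions, not parameters from the patent.

```python
# Hedged sketch: depth from a rectified binocular (stereo) camera pair.
# f_px: focal length in pixels; baseline_m: camera separation in metres;
# disparity_px: horizontal pixel offset of the same point in both views.

def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a scene point in metres: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px

# Example: a point seen 40 px apart with f = 800 px and B = 0.06 m
# lies 1.2 m in front of the rig.
depth_m = stereo_depth(800.0, 0.06, 40.0)
```

A TOF camera returns such depths directly, so this computation is only needed for the binocular option.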
S1022, acquiring the occlusion-corresponding position on the transparent screen according to the environment image.
In this embodiment, S1022 may be implemented as follows: acquire the display position of the specific virtual image on the display; obtain the position of the specific virtual image on the transparent screen based on the conversion relation between the display's coordinate system and the transparent screen's coordinate system; and, when the apparent distance between the specific virtual image (as represented by its corresponding depth information) and the transparent screen is smaller than or equal to the distance between the specific environment object and the transparent screen, determine the position of the specific virtual image on the transparent screen as the occlusion-corresponding position.
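The steps above can be sketched as follows; the 2x3 affine matrix standing in for the display-to-screen coordinate conversion, and all names, are illustrative assumptions rather than the patent's actual calibration.

```python
import numpy as np

def occlusion_position(display_xy, display_to_screen, virtual_depth_m, object_depth_m):
    """Map a display position into transparent-screen coordinates and
    return it as an occlusion-corresponding position when the virtual
    image's apparent distance is <= the environment object's distance
    (the condition in S1022); otherwise return None."""
    x, y = display_xy
    sx, sy = display_to_screen @ np.array([x, y, 1.0])  # affine coordinate conversion
    if virtual_depth_m <= object_depth_m:
        return (float(sx), float(sy))
    return None

# Example calibration: scale display coordinates by 2, shift by (10, 20).
DISPLAY_TO_SCREEN = np.array([[2.0, 0.0, 10.0],
                              [0.0, 2.0, 20.0]])
```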
In one example, the particular virtual image may be an image of a virtual monster.
In one embodiment, in the implementations shown in S1021 and S1022, the specific virtual image may be obtained directly from a terminal having an augmented reality function. Of course, it may also be derived by the processor based on the environment image. On the basis of S1021 and S1022 above, the control method provided by this embodiment then further includes the following steps S1023 and S1024:
S1023, determining the object type of the environment object according to the environment image containing the depth information.
S1024, generating the specific virtual image according to the object type of the environment object.
In this embodiment, S1023 and S1024 may be implemented as follows: the processor acquires the environment image containing depth information, identifies the object type of each environment object in the image based on a recognition algorithm (such as a deep learning algorithm), and submits the identified object types to a terminal with an augmented reality function. Based on its own functions, the terminal selects the object type of a specific environment object, obtains the specific virtual image corresponding to that object type together with its depth information from a pre-stored mapping between object types and specific virtual images, and sends the specific virtual image and the corresponding depth information to the augmented reality device.
For example, when the terminal having the augmented reality function is a game terminal, it may, based on its own game function, select a sofa in the environment as the specific environment object, and then generate a specific virtual image, such as a virtual monster, based on the sofa.
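As a minimal sketch of S1023/S1024, assuming the pre-stored mapping is a simple lookup table (the table contents and names below are invented for illustration, not data from the patent):

```python
# object type -> (virtual image asset, apparent depth in metres);
# entries are illustrative assumptions only.
VIRTUAL_IMAGE_TABLE = {
    "sofa": ("virtual_monster", 1.5),
    "table": ("virtual_treasure_chest", 1.0),
}

def virtual_image_for(object_type):
    """Return the (virtual image, depth) pair for a recognised object
    type, or None when the terminal's function defines nothing for it."""
    return VIRTUAL_IMAGE_TABLE.get(object_type)
```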
In one example, the terminal with augmented reality function may be a mobile phone, a notebook computer, a personal digital computer, a tablet computer, and the like.
In an embodiment, on the basis of any of the above embodiments, the method for controlling an augmented reality device according to the embodiment of the present invention further includes the following steps S1025 and S1026:
and S1025, acquiring the head motion state information of the user in real time.
In this embodiment, it can be understood that the head motion state information of the user is motion state information of the augmented reality device worn by the user.
And S1026, adjusting the display position and/or the display state of the specific virtual image according to the motion state information.
In this embodiment, the display position mentioned above refers to the position of the specific virtual image on the display; it is understood that when this display position changes, the position at which the specific virtual image is presented to the human eye changes accordingly.
The display state refers to the specific virtual image being presented at different angles relative to the user, i.e., relative to the augmented reality device.
In this embodiment, the display position and/or display state of the specific virtual image is adjusted according to the user's head motion state information acquired in real time, which improves the intelligence of the control method provided by this embodiment.
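A minimal sketch of the adjustment in S1026, assuming head motion is reported as yaw/pitch angles and the virtual image is counter-shifted on the display so that it appears anchored in the environment; the pixels-per-degree gain and the sign conventions are invented for the example:

```python
def adjust_display_position(pos_xy, yaw_deg, pitch_deg, px_per_deg=10.0):
    """Shift the on-display position of the specific virtual image
    opposite to the reported head rotation (illustrative gain)."""
    x, y = pos_xy
    return (x - yaw_deg * px_per_deg, y + pitch_deg * px_per_deg)

new_pos = adjust_display_position((100.0, 100.0), 2.0, -1.0)
```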
S103, controlling the transparent screen to convert the first portion of the first-polarization-state ambient light into second-polarization-state ambient light and emit it to the beam splitter, and to transmit the second portion of the first-polarization-state ambient light to the beam splitter; and controlling the display to display a virtual image and emit virtual image light in the first polarization state. The first portion of the first-polarization-state ambient light would present a specific environment object to the user, and the specific environment object is intended to be occluded by a specific virtual image presented to the user by a portion of the virtual image light.
In this embodiment, the transparent screen may be an LCD screen. An LCD transparent screen is composed of a liquid crystal matrix, and the liquid crystal can be deflected under the control of the processor. When part of the liquid crystal on the transparent screen is deflected under the control of the processor, the polarization state of the light transmitted through that part changes. For example, when part of the liquid crystal is deflected by 90° under the control of the processor, the first-polarization-state ambient light obtained through the polarizer is converted at that part into second-polarization-state ambient light. On this basis, in S103, controlling the transparent screen to convert the first portion of the first-polarization-state ambient light into second-polarization-state ambient light and emit it to the beam splitter amounts to: controlling the liquid crystal at the occlusion-corresponding position on the transparent screen to deflect by 90°.
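The control just described can be sketched as a boolean mask over the liquid-crystal matrix: cells inside the occlusion-corresponding position are driven to rotate the polarization by 90°, and the rest are left alone. The mask representation and numpy are assumptions for illustration:

```python
import numpy as np

def drive_transparent_screen(occlusion_mask):
    """Per-cell polarization rotation (degrees) applied to incoming
    first-polarization-state ambient light: 90 inside the
    occlusion-corresponding position, 0 elsewhere."""
    return np.where(occlusion_mask, 90.0, 0.0)

# Example: occlude a single cell of a 2x3 liquid-crystal matrix.
mask = np.zeros((2, 3), dtype=bool)
mask[0, 1] = True
rotation_deg = drive_transparent_screen(mask)
```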
In this embodiment, the beam splitter may be a PBS (polarizing beam splitter). As shown in fig. 2, the first preset included angle is denoted α; α may be 45°, but other angles are also possible. The beam splitter is configured to reflect second-polarization-state ambient light and to transmit the second portion of the first-polarization-state ambient light. The human eye therefore does not receive the second-polarization-state ambient light, i.e., the human eye does not see the specific environment object.
In this embodiment, the display is typically a microdisplay. In one example, the display may be an LCOS reflective device; alternatively, it may be an actively emitting microdisplay such as a micro-OLED or micro-LED. Note that both LCOS reflective devices and actively emitting microdisplays include optical structures that emit light in a predetermined polarization state; in this embodiment, the predetermined polarization state is the first polarization state. As shown in fig. 2, the second preset included angle is denoted β; β may be 90°, and other angles are also possible.
In addition, taking fig. 2 as an example, the optical path of the first-polarization-state virtual image light emitted by the display in this embodiment is: the first-polarization-state virtual image light passes through the beam splitter and reaches the polarization conversion reflector directly; the polarization conversion reflector converts it into second-polarization-state virtual image light and reflects that light to the beam splitter; the beam splitter then reflects the second-polarization-state virtual image light to the human eye.
Taking fig. 3 as an example, the optical path of the first-polarization-state virtual image light emitted by the display in this embodiment is: the light passes through the beam splitter; on its first pass through the 1/4 wave plate, its vibration direction is rotated by 45°, leaving the polarization state between the P and S polarization states; the light then reaches the mirror, is reflected back through the 1/4 wave plate, and its vibration direction is rotated by a further 45°, so that after this second rotation it has become second-polarization-state virtual image light; this light reaches the beam splitter, which reflects it to the human eye.
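The fig. 3 path can be checked with Jones calculus (a supplementary sketch, not notation used in the patent): neglecting global phases and the mirror's own Jones matrix, a double pass through a quarter-wave plate whose fast axis is at 45° acts as a half-wave plate, swapping the P and S linear polarization states.

```python
import numpy as np

def rot(theta):
    """2-D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def quarter_wave_plate(theta):
    """Jones matrix of a quarter-wave plate, fast axis at angle theta."""
    return rot(theta) @ np.diag([1.0, 1.0j]) @ rot(-theta)

p_state = np.array([1.0, 0.0])  # first polarization state (P), Jones vector
double_pass = quarter_wave_plate(np.pi / 4) @ quarter_wave_plate(np.pi / 4)
out = double_pass @ p_state     # wave plate -> mirror -> wave plate
# All the energy ends up in the S component: the first polarization
# state has been converted into the second.
```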
It should be noted that, in this embodiment, the virtual image includes the specific virtual image, and every part of the virtual image other than the specific virtual image is transparent. In fig. 2 and 3, light in one of the two polarization states is indicated by thin lines with arrows, and light in the other polarization state by thick lines with arrows.
In this embodiment, the occlusion-corresponding position on the transparent screen can be obtained, and the transparent screen is then controlled to convert the first portion of the ambient light in the first polarization state at that position into ambient light in the second polarization state and emit it toward the beam splitter. This ambient light in the second polarization state does not enter the human eye, so the specific environment object that the first portion of the first-polarization-state ambient light would otherwise present is not visible to the user. On this basis, the display is controlled to show the virtual image and emit virtual image light in the first polarization state. The virtual image is then highly fused with the real environment (in this embodiment, the objects presented by the ambient light in the first polarization state), giving the user the experience that the virtual image occludes the real environment object. The user can thus clearly distinguish the virtual image from the real environment object, which greatly improves the user experience. For example, the display effect of the method provided by this embodiment can be as shown in fig. 4.
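In control terms, the step above amounts to maintaining a per-pixel mask over the transparent screen that selects which first-polarization-state ambient light gets converted. A hedged sketch (the function name and the rectangle-based layout are assumptions for illustration; a real device would drive the screen's pixel cells directly):

```python
import numpy as np

def occlusion_mask(screen_shape, occluded_rects):
    """Boolean mask over transparent-screen pixels.

    True  -> convert first-polarization-state ambient light to the second
             polarization state (the real object there is hidden from the eye);
    False -> transmit the first-polarization-state ambient light unchanged.
    occluded_rects: iterable of (row, col, height, width) screen regions.
    """
    mask = np.zeros(screen_shape, dtype=bool)
    for r, c, h, w in occluded_rects:
        mask[r:r + h, c:c + w] = True
    return mask

# One occlusion-corresponding region of 50 x 80 pixels on a 480 x 640 screen.
mask = occlusion_mask((480, 640), [(100, 200, 50, 80)])
print(int(mask.sum()))  # -> 4000 converted pixels
```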
On the basis of any of the above embodiments, the method for controlling an augmented reality device according to an embodiment of the present invention further includes the following steps:
and S104, determining an occlusion region on the display.
In one embodiment, the occlusion region on the display may be obtained directly from a terminal with augmented reality functionality.
In another embodiment, the occlusion region on the display may be determined as follows: acquire the display position of the specific virtual image on the display, and obtain the corresponding position on the transparent screen from the conversion relation between the display's coordinate system and the transparent screen's coordinate system. When the apparent distance between the specific virtual image (given by the depth information corresponding to it) and the transparent screen is greater than the distance between the transparent screen and the specific environment object at that position, the position of the specific virtual image on the transparent screen is determined to be the occlusion-corresponding position.
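The conversion-plus-comparison logic of that paragraph can be sketched as follows. The homography `H` standing in for the display-to-screen coordinate conversion, and the helper names, are assumptions for illustration, not the patent's API; the depth test mirrors the text literally:

```python
import numpy as np

def occlusion_corresponding_position(display_xy, H, virtual_depth, env_depth_at):
    """Map the specific virtual image's display position into the transparent
    screen's coordinate system via the homography H, then apply the text's
    depth test: the screen position is an occlusion-corresponding position
    when the virtual image's apparent distance from the screen exceeds the
    real environment object's distance at that position.
    env_depth_at: callable giving the real object's distance at a screen point.
    Returns the screen position, or None if the test fails."""
    x, y = display_xy
    u, v, w = H @ np.array([x, y, 1.0])
    screen_xy = (u / w, v / w)
    if virtual_depth > env_depth_at(screen_xy):
        return screen_xy
    return None

# Toy usage: identity conversion, virtual image 2.0 m away, real wall at 1.5 m.
H = np.eye(3)
pos = occlusion_corresponding_position((320, 240), H, virtual_depth=2.0,
                                       env_depth_at=lambda p: 1.5)
print(pos)  # -> (320.0, 240.0)
```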
And S105, controlling the region of the display other than the occlusion region to display the virtual image.
The virtual image light that would come from the occlusion region in S104 is intended to present a set virtual image, and the set virtual image is intended to be occluded by a set environment object presented to the user, where the set environment object is presented by a part of the second portion of the first-polarization-state ambient light. That is, the set environment object is distinct from the specific environment object, and the set virtual image is distinct from the specific virtual image.
In this embodiment, an occlusion region on the display may be determined, and the region of the display other than the occlusion region is controlled to show the virtual image. This achieves the effect of the set virtual image being occluded by the set environment object. The virtual image is then highly fused with the real environment (in this embodiment, the objects presented by the second portion of the first-polarization-state ambient light), giving the user the experience that the virtual image is blocked by a real environment object. The user can thus clearly distinguish the virtual image from the real environment object, which further improves the user experience. For example, the display effect of the method provided by this embodiment can be as shown in fig. 5; the remaining second portion of the first-polarization-state ambient light referred to there is the second portion of the first-polarization-state ambient light excluding the part that presents the set environment object.
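Steps S104 and S105 together can be sketched as masking the display frame before it is shown (the function and frame layout are hypothetical, for illustration only):

```python
import numpy as np

def apply_occlusion_region(virtual_frame, occlusion_region):
    """S105: display the virtual image everywhere except the occlusion
    region determined in S104. Blanked pixels emit no virtual image light,
    so the set environment object appears in front of the set virtual image.
    occlusion_region: (row, col, height, width) on the display."""
    out = virtual_frame.copy()
    r, c, h, w = occlusion_region
    out[r:r + h, c:c + w] = 0   # not displayed / transparent
    return out

# Toy usage: a fully lit 480 x 640 frame with a 10 x 10 occlusion region.
frame = np.full((480, 640), 255, dtype=np.uint8)
shown = apply_occlusion_region(frame, (0, 0, 10, 10))
print(int((shown == 0).sum()))  # -> 100 blanked pixels
```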
< augmented reality apparatus >
An embodiment of the invention provides an augmented reality device, which may be augmented reality glasses, an augmented reality helmet, or the like. The augmented reality device comprises a polarizer, a transparent screen, a beam splitter, a display, a polarization conversion reflector, and a processor. Wherein:
the polarizer is used for filtering natural light to obtain ambient light in a first polarization state;
the transparent screen is arranged parallel to the polarizer on the side facing the human eye, and is used for transmitting a second portion of the ambient light in the first polarization state and converting a first portion of the ambient light in the first polarization state into ambient light in the second polarization state;
the beam splitter forms a first preset included angle with the transparent screen and is positioned on the human-eye side; it is configured to transmit the second portion of the first-polarization-state ambient light from the transparent screen and to reflect ambient light in the second polarization state;
the display forms a second preset included angle with the beam splitter and is used for displaying a virtual image and emitting virtual image light in the first polarization state;
the polarization conversion reflector is arranged on the other side of the beam splitter, so that the reflector and the display lie on opposite sides of the beam splitter; it is used for converting the virtual image light in the first polarization state into virtual image light in the second polarization state and reflecting the virtual image light in the second polarization state to the beam splitter;
the beam splitter is further configured to transmit virtual image light of the first polarization state from the display; the augmented reality device further includes:
a processor, configured to obtain an occlusion-corresponding position on the transparent screen, where the occlusion-corresponding position is the position on the transparent screen corresponding to the first portion of the ambient light in the first polarization state, and to control the transparent screen to convert the first portion of the ambient light in the first polarization state into ambient light in the second polarization state; and to control the display to display a virtual image and emit virtual image light in the first polarization state; wherein the first portion of the ambient light in the first polarization state is used to present a specific environment object to the user, and a specific virtual image, presented to the user by a portion of the virtual image light, is intended to occlude the specific environment object.
In one embodiment, the processor is further configured to determine an occlusion region on the display and control the region of the display other than the occlusion region to display a virtual image; the virtual image light that would come from the occlusion region is intended to present a set virtual image to the user, and the set virtual image is intended to be occluded by a set environment object presented to the user, the set environment object being presented by a part of the second portion of the first-polarization-state ambient light.
In one embodiment, the augmented reality device further includes:
the image acquisition unit is used for acquiring an environment image containing depth of field information;
the processor is further configured to obtain the occlusion-corresponding position on the transparent screen according to the environment image.
In one embodiment, the processor is further configured to determine the object type of the specific environment object according to the environment image containing depth information, and to generate the specific virtual image according to the object type.
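A minimal sketch of the type-to-virtual-image step just described; the type labels, file names, and lookup scheme are purely illustrative assumptions:

```python
# Hypothetical mapping from a detected object type to the specific virtual
# image shown in its place; depth information from the environment image
# would drive both the detection and the placement.
VIRTUAL_IMAGE_FOR_TYPE = {
    "table": "virtual_tabletop_ui.png",
    "wall": "virtual_poster.png",
    "person": "virtual_name_tag.png",
}

def generate_specific_virtual_image(object_type):
    """Pick the specific virtual image for the detected object type,
    returning None (no virtual image) for unknown types."""
    return VIRTUAL_IMAGE_FOR_TYPE.get(object_type)

print(generate_specific_virtual_image("wall"))  # -> virtual_poster.png
```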
In one embodiment, the processor is further configured to acquire the user's motion state information in real time, and to adjust the display position and/or display state of the specific virtual image according to the motion state information.
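The motion-based adjustment can be sketched as a simple registration correction that shifts the specific virtual image opposite to the head motion; the pixels-per-degree scale and the sign conventions are assumptions about the optics, not values from the patent:

```python
def adjust_display_position(pos, yaw_delta_deg, pitch_delta_deg, px_per_deg):
    """Shift the specific virtual image opposite to the user's head motion
    so it stays registered to the real environment object.
    px_per_deg: display pixels per degree of field of view (assumed known
    from the optics' calibration)."""
    x, y = pos
    return (x - yaw_delta_deg * px_per_deg,
            y + pitch_delta_deg * px_per_deg)

# Head turns 2 deg right and 1 deg down -> image shifts left and up on screen.
print(adjust_display_position((320, 240), yaw_delta_deg=2.0,
                              pitch_delta_deg=-1.0, px_per_deg=10.0))
# -> (300.0, 230.0)
```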
It should be noted that, the specific structure of the augmented reality device related to this embodiment can refer to the structure shown in fig. 2 and fig. 3. In addition, for specific functions implemented by the processor in the augmented reality device provided by this embodiment, reference may be made to the control method of the augmented reality device provided by the above embodiment, and details are not described here.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (10)

1. A control method of an augmented reality device, characterized in that the augmented reality device comprises a polarizer, a transparent screen, a beam splitter, a display, a polarization conversion reflector, and a processor, and the control method comprises the following steps:
receiving ambient light in a first polarization state from the polarizer, wherein the ambient light in the first polarization state is obtained by filtering natural light by the polarizer;
acquiring an occlusion-corresponding position on the transparent screen, wherein the occlusion-corresponding position is the position on the transparent screen corresponding to a first portion of the ambient light in the first polarization state;
controlling the transparent screen to convert the first portion of the ambient light in the first polarization state into ambient light in the second polarization state and emit it to the beam splitter, and to transmit the second portion of the first-polarization-state ambient light to the beam splitter; and controlling the display to display a virtual image and emit virtual image light in the first polarization state; wherein the first portion of the first-polarization-state ambient light is used to present a specific environment object to a user, and a specific virtual image, presented to the user by a portion of the virtual image light, is intended to occlude the specific environment object;
the beam splitter is configured to transmit the second portion of the ambient light with the first polarization state, transmit the virtual image light with the first polarization state, and reflect the ambient light with the second polarization state.
2. The control method according to claim 1, characterized in that the method further comprises:
determining an occlusion region on a display;
controlling an area of the display other than the occluded area to display a virtual image;
wherein the virtual image light that would come from the occlusion region is intended to present a set virtual image to the user, and the set virtual image is intended to be occluded by a set environment object presented to the user, the set environment object being presented by a part of the second portion of the first-polarization-state ambient light.
3. The control method according to claim 1, wherein the acquiring of the occlusion-corresponding position on the transparent screen comprises:
acquiring an environment image containing depth information;
and acquiring the occlusion-corresponding position on the transparent screen according to the environment image.
4. The control method according to claim 3, characterized in that the method further comprises:
determining an object type of the specific environment object according to the environment image containing the depth information;
generating the specific virtual image according to the object type.
5. The control method according to claim 4, characterized in that the method further comprises:
acquiring the head motion state information of the user in real time;
and adjusting the display position and/or the display state of the specific virtual image according to the motion state information.
6. An augmented reality device, comprising:
the polarizer is used for filtering natural light to obtain ambient light in a first polarization state;
the transparent screen is arranged parallel to the polarizer on the side facing the human eye, and is used for transmitting a second portion of the ambient light in the first polarization state and converting a first portion of the ambient light in the first polarization state into ambient light in the second polarization state;
the beam splitter forms a first preset included angle with the transparent screen and is positioned on the human-eye side; it is configured to transmit the second portion of the first-polarization-state ambient light from the transparent screen and to reflect ambient light in the second polarization state;
the display forms a second preset included angle with the beam splitter and is used for displaying a virtual image and emitting virtual image light in the first polarization state;
the polarization conversion reflector is arranged on the other side of the beam splitter, so that the reflector and the display lie on opposite sides of the beam splitter; it is used for converting the virtual image light in the first polarization state into virtual image light in the second polarization state and reflecting the virtual image light in the second polarization state to the beam splitter;
the beam splitter is further configured to transmit virtual image light of the first polarization state from the display; the augmented reality device further includes:
a processor, configured to obtain an occlusion-corresponding position on the transparent screen, where the occlusion-corresponding position is the position on the transparent screen corresponding to the first portion of the ambient light in the first polarization state, and to control the transparent screen to convert the first portion of the ambient light in the first polarization state into ambient light in the second polarization state; and to control the display to display a virtual image and emit virtual image light in the first polarization state; wherein the first portion of the ambient light in the first polarization state is used to present a specific environment object to a user, and a specific virtual image, presented to the user by a portion of the virtual image light, is intended to occlude the specific environment object.
7. Augmented reality device according to claim 6,
the processor is further configured to determine an occlusion region on the display and control the region of the display other than the occlusion region to display a virtual image; wherein the virtual image light that would come from the occlusion region is intended to present a set virtual image to the user, and the set virtual image is intended to be occluded by a set environment object presented to the user, the set environment object being presented by a part of the second portion of the first-polarization-state ambient light.
8. The augmented reality device of claim 6, further comprising:
the image acquisition unit is used for acquiring an environment image containing depth of field information;
the processor is further configured to obtain the occlusion-corresponding position on the transparent screen according to the environment image.
9. Augmented reality device according to claim 8,
the processor is also used for
Determining the object type of the specific environment object according to the environment image containing the depth information;
generating the specific virtual image according to the object type.
10. Augmented reality device according to claim 9,
the processor is also used for
Acquiring the head motion state information of the user in real time;
and adjusting the display position and/or the display state of the specific virtual image according to the motion state information.
CN201910906702.9A 2019-09-24 2019-09-24 Augmented reality device and control method thereof Pending CN110673340A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910906702.9A CN110673340A (en) 2019-09-24 2019-09-24 Augmented reality device and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910906702.9A CN110673340A (en) 2019-09-24 2019-09-24 Augmented reality device and control method thereof

Publications (1)

Publication Number Publication Date
CN110673340A true CN110673340A (en) 2020-01-10

Family

ID=69077502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910906702.9A Pending CN110673340A (en) 2019-09-24 2019-09-24 Augmented reality device and control method thereof

Country Status (1)

Country Link
CN (1) CN110673340A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111458881A (en) * 2020-05-13 2020-07-28 歌尔科技有限公司 Display system and head-mounted display equipment
CN113703582A (en) * 2021-09-06 2021-11-26 联想(北京)有限公司 Image display method and device
CN113747626A (en) * 2020-05-29 2021-12-03 北京小米移动软件有限公司 Ambient light determination method, ambient light determination device, terminal equipment and medium
CN114002869A (en) * 2021-12-30 2022-02-01 南昌虚拟现实研究院股份有限公司 Optical adjusting system applied to virtual reality display
WO2022111668A1 (en) * 2020-11-30 2022-06-02 华为技术有限公司 Virtual-reality fusion display device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103809687A (en) * 2012-11-06 2014-05-21 索尼电脑娱乐公司 Head mounted display, motion detector, motion detection method, image presentation system and program
CN107065196A (en) * 2017-06-16 2017-08-18 京东方科技集团股份有限公司 A kind of augmented reality display device and augmented reality display methods
US20170255017A1 (en) * 2016-03-03 2017-09-07 Disney Enterprises, Inc. Increasing returned light in a compact augmented reality / virtual reality display
CN108333773A (en) * 2018-04-13 2018-07-27 深圳鸿鑫晶光电有限公司 AR head-mounted display apparatus
CN208092341U (en) * 2017-12-21 2018-11-13 成都理想境界科技有限公司 A kind of optical system for wearing display equipment
WO2019135165A2 (en) * 2018-01-03 2019-07-11 Khan Sajjad Ali Method and system for occlusion capable compact displays

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103809687A (en) * 2012-11-06 2014-05-21 索尼电脑娱乐公司 Head mounted display, motion detector, motion detection method, image presentation system and program
US20170255017A1 (en) * 2016-03-03 2017-09-07 Disney Enterprises, Inc. Increasing returned light in a compact augmented reality / virtual reality display
CN107065196A (en) * 2017-06-16 2017-08-18 京东方科技集团股份有限公司 A kind of augmented reality display device and augmented reality display methods
CN208092341U (en) * 2017-12-21 2018-11-13 成都理想境界科技有限公司 A kind of optical system for wearing display equipment
WO2019135165A2 (en) * 2018-01-03 2019-07-11 Khan Sajjad Ali Method and system for occlusion capable compact displays
CN108333773A (en) * 2018-04-13 2018-07-27 深圳鸿鑫晶光电有限公司 AR head-mounted display apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Hong et al., "Research on the virtual-real occlusion problem in optical see-through augmented reality display systems," Journal of Image and Graphics *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111458881A (en) * 2020-05-13 2020-07-28 歌尔科技有限公司 Display system and head-mounted display equipment
CN113747626A (en) * 2020-05-29 2021-12-03 北京小米移动软件有限公司 Ambient light determination method, ambient light determination device, terminal equipment and medium
CN113747626B (en) * 2020-05-29 2023-08-29 北京小米移动软件有限公司 Ambient light determining method, device, terminal equipment and medium
WO2022111668A1 (en) * 2020-11-30 2022-06-02 华为技术有限公司 Virtual-reality fusion display device
CN114578554A (en) * 2020-11-30 2022-06-03 华为技术有限公司 Display equipment for realizing virtual-real fusion
CN114578554B (en) * 2020-11-30 2023-08-22 华为技术有限公司 Display equipment for realizing virtual-real fusion
CN113703582A (en) * 2021-09-06 2021-11-26 联想(北京)有限公司 Image display method and device
CN114002869A (en) * 2021-12-30 2022-02-01 南昌虚拟现实研究院股份有限公司 Optical adjusting system applied to virtual reality display

Similar Documents

Publication Publication Date Title
CN110673340A (en) Augmented reality device and control method thereof
US11838518B2 (en) Reprojecting holographic video to enhance streaming bandwidth/quality
US9928655B1 (en) Predictive rendering of augmented reality content to overlay physical structures
JP6596510B2 (en) Stereo rendering system
US9581820B2 (en) Multiple waveguide imaging structure
US10394317B2 (en) Interaction with holographic image notification
US10228564B2 (en) Increasing returned light in a compact augmented reality/virtual reality display
US9696798B2 (en) Eye gaze direction indicator
US10715791B2 (en) Virtual eyeglass set for viewing actual scene that corrects for different location of lenses than eyes
US11281003B2 (en) Near eye dynamic holography
US10482666B2 (en) Display control methods and apparatuses
US11353955B1 (en) Systems and methods for using scene understanding for calibrating eye tracking
KR20170081244A (en) Preventing display leakage in see-through displays
US11574389B2 (en) Reprojection and wobulation at head-mounted display device
US11367226B2 (en) Calibration techniques for aligning real-world objects to virtual objects in an augmented reality environment
US20200117007A1 (en) Wide field of view occlusion capable augmented reality pancake optics head mounted display
CN111095348A (en) Transparent display based on camera
CN114442814A (en) Cloud desktop display method, device, equipment and storage medium
Itoh et al. OST Rift: Temporally consistent augmented reality with a consumer optical see-through head-mounted display
JP6534972B2 (en) Image display apparatus, image display method and image display program
US11010865B2 (en) Imaging method, imaging apparatus, and virtual reality device involves distortion
US10802281B2 (en) Periodic lenses systems for augmented reality
US20130278629A1 (en) Visual feedback during remote collaboration
WO2019061884A1 (en) Display device and method
Fukuda et al. Head mounted display implementations for use in industrial augmented and virtual reality applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201013

Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Applicant before: GOERTEK TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200110