CN113625866A - Augmented reality system and method for displaying virtual screen by using augmented reality glasses - Google Patents
- Publication number
- CN113625866A (application CN202010380858.0A)
- Authority
- CN
- China
- Prior art keywords
- augmented reality
- display module
- image
- virtual screen
- processing circuit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
An augmented reality system includes a portable electronic device, augmented reality glasses, and a processing circuit. The portable electronic device has a display module and a positioning component, and the positioning component is disposed on the display module. The augmented reality glasses include an image capturing module configured to capture at least one image of the display module. The processing circuit is configured to determine a position of an image center point of the at least one image relative to the positioning component according to the at least one image. When the processing circuit determines that the position of the image center point has moved from inside the display module to beyond the positioning component, the processing circuit controls the augmented reality glasses to display a virtual screen, and the virtual screen extends from a first side edge of the display module. The present disclosure also relates to a method of displaying a virtual screen using augmented reality glasses.
Description
Technical Field
The present disclosure relates to an augmented reality system and method, and more particularly, to an augmented reality system and method for displaying a virtual screen using augmented reality glasses.
Background
With the development of technology, tablet computers, desktop computers, and notebook computers have become very popular products.
Generally, a notebook computer has only a single display screen. However, as the performance of notebook computers improves, more and more programs can be executed at once, so a single display screen no longer meets users' requirements. A physical expansion screen can be attached to the notebook computer, but it increases the weight, reduces the mobility of the notebook computer, and also increases the cost to the user.
Therefore, how to add display area without reducing mobility is a problem worth discussing and solving.
Disclosure of Invention
In view of the above, the present disclosure provides an augmented reality system to solve the above problems.
The present disclosure provides an augmented reality system including a portable electronic device, augmented reality glasses, and a processing circuit. The portable electronic device has a display module and a positioning component, and the positioning component is disposed on the display module. The augmented reality glasses include an image capturing module configured to capture at least one image of the display module. The processing circuit is configured to determine a position of an image center point of the at least one image relative to the positioning component according to the at least one image. When the processing circuit determines that the position of the image center point has moved from inside the display module to beyond the positioning component, the processing circuit controls the augmented reality glasses to display a virtual screen, and the virtual screen extends from a first side edge of the display module.
The present disclosure provides a method for displaying a virtual screen using augmented reality glasses, comprising: arranging a positioning component on a display module; capturing at least one image of the display module; determining a position of an image center point of the at least one image relative to the positioning component according to the at least one image; and when the position of the image center point moves from inside the display module to beyond the positioning component, controlling the augmented reality glasses to display a virtual screen, wherein the virtual screen extends from one side of the display module.
An augmented reality system includes a portable electronic device, augmented reality glasses, and an expansion device. The user may wear the augmented reality glasses and use the portable electronic device at the same time. The portable electronic device is provided with a positioning component, and the processing circuit can determine the position of the center point of the user's field of view relative to the positioning component according to the images provided by the image capturing module on the augmented reality glasses, and then control the optical display module of the augmented reality glasses to display the corresponding virtual screen.
Therefore, based on the design of the present disclosure, a user does not need to purchase a physical screen to install on the portable electronic device; the mobility of the portable electronic device is maintained while the benefit of an expanded screen is still achieved.
Drawings
The present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings. It is emphasized that, in accordance with the standard practice in the industry, various features are not drawn to scale and are used for illustrative purposes only. In fact, the dimensions of the various features may be arbitrarily expanded or reduced for clarity of presentation.
Fig. 1 is a schematic diagram of an augmented reality system 50 according to an embodiment of the present disclosure.
Fig. 1A is a block diagram of an augmented reality system 50 according to an embodiment of the present disclosure.
Fig. 2 is an image of the portable electronic device 100 captured by the image capturing module 204 of the augmented reality glasses 200 according to an embodiment of the disclosure.
Fig. 3 is another image captured by the image capturing module 204 when the user turns his head to the right according to an embodiment of the disclosure.
Fig. 4 shows another image captured by the image capturing module 204 when the user turns his head to the left according to an embodiment of the disclosure.
Fig. 5 is another image captured by the image capturing module 204 when the user raises his head according to an embodiment of the disclosure.
Fig. 6 is another image of the portable electronic device 100 captured by the image capturing module 204 of the augmented reality glasses 200 according to an embodiment of the disclosure.
Fig. 7 is another image of the portable electronic device 100 captured by the image capturing module 204 according to an embodiment of the disclosure.
Fig. 8 is a schematic diagram of a display module 102 and a positioning assembly 150 according to another embodiment of the disclosure.
Fig. 9 is a flowchart of a method S400 of displaying a virtual screen using augmented reality glasses 200 according to an embodiment of the present disclosure.
The reference numerals are as follows:
50: augmented reality system
100: portable electronic device
102: display module
1020: entity screen
1021: the first side edge
1022: second side edge
1023: third side edge
104: host module
1041: arithmetic circuit
150: positioning assembly
151: positioning element
152: positioning element
153: positioning element
154 to 157: positioning element
200: augmented reality glasses
202: lens
204: image acquisition module
205: optical display module
206: optical projector
208: optical guiding element
200C: processing circuit
200M: memory circuit
200FL: Left frame buffer
200FR: Right frame buffer
200S: sensing element
300: expansion device
301: connecting end
AG: acute angle
CV: center point
FOV: field of view
H1: first height
H2: second height
HD: hand (W.E.)
L1: first length
L2: second length
L3: third length
TX: direction of transverse axis
VS1: First virtual screen
VS2: Second virtual screen
VS3: Third virtual screen
S400: Method
S402, S404, S406, S408, S410: Steps
X: x axis
Y: y-axis
Detailed Description
In order to make the objects, features, and advantages of the present disclosure more comprehensible, embodiments are described in detail below with reference to the accompanying figures. The arrangement of the elements in the embodiments is for illustration only and is not intended to limit the disclosure. Reference numbers are partially repeated across the embodiments to simplify the description; this does not indicate any relation between the different embodiments. Directional terms used in the following embodiments, such as up, down, left, right, front, or rear, refer only to the directions in the drawings. Accordingly, the directional terminology is used for description rather than limitation.
Referring to fig. 1, fig. 1 is a schematic diagram of an augmented reality system 50 according to an embodiment of the present disclosure. The augmented reality system 50 may include a portable electronic device 100, augmented reality glasses 200, and an expansion device 300. The portable electronic device 100 may be a notebook computer, and includes a display module 102 and a host module 104. In this embodiment, the portable electronic device 100 may further include a positioning assembly 150 disposed on the display module 102.
The host module 104 may include a computing circuit 1041, such as a central processing unit (CPU), configured to control the operation of the portable electronic device 100. The central processing unit provides the processing power necessary to execute the operating system (OS), programs, graphical user interfaces, software, modules, applications, and functions of the portable electronic device 100.
In this embodiment, the augmented reality glasses 200 are electrically connected to the host module 104 through the expansion device 300, and the augmented reality glasses 200 may include two lenses 202, an image capturing module 204, and an optical display module 205. The lenses 202 may be transparent lenses, and the image capturing module 204 is configured to continuously capture images of the display module 102 and transmit the image signals to a processing circuit 200C (fig. 1A).
The optical display module 205 may include two optical projectors 206 and two optical guiding elements 208. In this embodiment, the optical projector 206 may be a micro projector configured to receive the projection signal provided by the processing circuit 200C and emit a projection image to the optical guiding element 208. The optical guiding element 208 may be a reflective prism configured to project the projection image to the pupil of the user.
It is noted that embodiments of the optical display module 205 are not so limited. For example, in other embodiments of the present disclosure, the lens 202 may be a half mirror, and the optical guiding element 208 projects the projection image onto the lens 202.
Referring to fig. 1 and 1A, fig. 1A is a block diagram of an augmented reality system 50 according to an embodiment of the disclosure. Specifically, the augmented reality glasses 200 may further include a processing circuit 200C, a memory circuit 200M, a left frame buffer 200FL, and a right frame buffer 200FR. The processing circuit 200C may be a micro-controller unit (MCU), an integrated chip in which a central processing unit, a memory, a timer/counter, an input/output interface, and the like are integrated, offering a simple input/output interface and a small volume. The memory circuit 200M may be, for example, a random access memory (RAM), a flash memory, a read-only memory (ROM), and so on. The left frame buffer 200FL and the right frame buffer 200FR are electrically connected to the optical projectors 206 on the left and right sides of fig. 1, respectively, and are used to temporarily store the image frames provided by the processing circuit 200C before providing them to the optical projectors 206.
The expansion device 300 may include a connection end 301, and the processing circuit 200C may determine the position of the image center point relative to the positioning assembly 150 according to the images provided by the image capturing module 204, and transmit the related position data to the host module 104 through the connection end 301 of the expansion device 300. The connection end 301 may be, but is not limited to, a Universal Serial Bus (USB) port or a DisplayPort (DP). In other embodiments, the augmented reality glasses 200 may also communicate with the host module 104 wirelessly (e.g., over Wi-Fi).
In addition, one or more sensing elements 200S, such as motion sensing elements or six-degree-of-freedom (6DoF) tracking elements, may further be disposed on the augmented reality glasses 200 and electrically connected to the processing circuit 200C, so that the position data calculated by the processing circuit 200C is more accurate. In other embodiments, the processing circuit 200C can also be integrated into the computing circuit 1041.
Referring to fig. 1 and fig. 2, fig. 2 is an image of the portable electronic device 100 captured by the image capturing module 204 of the augmented reality glasses 200 according to an embodiment of the disclosure. The field of view FOV represents the user's field of view when wearing the augmented reality glasses 200, the center point CV is the center point of the user's field of view, and the size of the field of view FOV is not limited to that depicted in fig. 2. It is noted that, as shown in fig. 1, the image capturing module 204 of the present disclosure is disposed at a central position between the two lenses 202, so that the center point of the image captured by the image capturing module 204 overlaps the center point CV.
Furthermore, as shown in fig. 2, in the embodiment, the display module 102 has a first side 1021, a second side 1022 and a third side 1023, and the positioning assembly 150 includes three positioning elements 151, 152 and 153. The positioning elements 151, 152, and 153 have a strip structure, and the positioning elements 151, 152, and 153 are respectively disposed on the first side 1021, the second side 1022, and the third side 1023. Furthermore, the positioning elements have different colors. For example, the positioning element 151 may be orange, the positioning element 152 may be green, and the positioning element 153 may be yellow, but is not limited thereto.
The image capturing module 204 can capture a color image of the portable electronic device 100. Therefore, the processing circuit 200C can determine the position (position data) of the image center point of the color image (i.e., the center point CV of the user's field of view FOV) relative to the positioning assembly 150 according to the color image captured by the image capturing module 204. As shown in fig. 2, the center point CV is located within a physical screen 1020 of the display module 102. The processing circuit 200C then transmits the position data to the computing circuit 1041 of the host module 104.
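For illustration only, the color-based determination above could be prototyped along the following lines. This is a minimal sketch, not the claimed implementation: the use of OpenCV, the HSV color ranges, the function names, and the assignment of elements 151, 153, and 152 to the right, left, and top sides (following figs. 2-5) are all assumptions.

```python
import cv2
import numpy as np

# Hypothetical HSV ranges for the orange (151), green (152) and yellow (153)
# strip elements; real ranges depend on the actual strip colors and lighting.
STRIP_COLORS = {
    151: ((5, 100, 100), (20, 255, 255)),    # orange
    152: ((45, 100, 100), (75, 255, 255)),   # green
    153: ((25, 100, 100), (40, 255, 255)),   # yellow
}

def locate_strips(frame_bgr):
    """Return a bounding box (x, y, w, h) for each strip found by color."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    boxes = {}
    for elem, (lo, hi) in STRIP_COLORS.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            boxes[elem] = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return boxes

def center_beyond(frame_bgr):
    """Report which strips, if any, the image center point CV lies beyond."""
    h, w = frame_bgr.shape[:2]
    cx, cy = w // 2, h // 2            # image center point = center point CV
    beyond = {}
    for elem, (x, y, bw, bh) in locate_strips(frame_bgr).items():
        if elem == 151:                # right-hand side: beyond = right of strip
            beyond[elem] = cx > x + bw
        elif elem == 153:              # left-hand side: beyond = left of strip
            beyond[elem] = cx < x
        else:                          # 152, top side: beyond = above strip
            beyond[elem] = cy < y
    return beyond
```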
Referring to fig. 2 and 3, fig. 3 is another image captured by the image capturing module 204 when the user turns his head to the right according to an embodiment of the disclosure. When the user turns his head to look to the right, the center point CV moves from the position of fig. 2 to the position of fig. 3. At this time, the processing circuit 200C determines that the position of the image center point (center point CV) has moved from inside the display module 102 to beyond (across) the orange positioning element 151 of the positioning assembly 150, and the processing circuit 200C transmits the position data to the computing circuit 1041.
Then, the computing circuit 1041 receives the position data, calculates projection data (including the size and position of the image to be projected), and transmits the projection data to the processing circuit 200C. The processing circuit 200C controls the optical projector 206 and the optical guiding element 208 to project a first virtual screen VS1 onto the retina of the user according to the projection data, as shown in fig. 3. The first virtual screen VS1 extends from the first side edge 1021 of the display module 102.
Specifically, as shown in fig. 3, the first side 1021 and the third side 1023 are arranged along a long axis direction (X axis) of the display module 102, and the first virtual screen VS1 extends along the long axis direction. In this embodiment, the first virtual screen VS1 has a first length L1 in the long axis direction, the physical screen 1020 of the display module 102 has a second length L2 in the long axis direction, and the first length L1 is less than or equal to one half of the second length L2, but is not limited thereto. In some embodiments, the second length L2 may also be the length of the display module 102 in the long axis direction.
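As a purely illustrative aside, the sizing rule above (L1 being at most one half of L2, with an analogous height rule described later for the third virtual screen) can be sketched geometrically as follows; the Rect type, the coordinate convention, and the default ratio of 0.5 are assumptions, not part of the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float   # left edge in image coordinates
    y: float   # top edge
    w: float   # width along the long-axis (X) direction
    h: float   # height along the Y direction

def virtual_screen(physical: Rect, side: str, ratio: float = 0.5) -> Rect:
    """Virtual screen extending from one side of the physical screen 1020.

    ratio = 0.5 mirrors the embodiment where L1 <= L2 / 2 (and the analogous
    height constraint for the third virtual screen); other ratios are possible.
    """
    if side == "first":    # VS1, extends to the right of the first side edge 1021
        return Rect(physical.x + physical.w, physical.y,
                    physical.w * ratio, physical.h)
    if side == "third":    # VS2, extends to the left of the third side edge 1023
        return Rect(physical.x - physical.w * ratio, physical.y,
                    physical.w * ratio, physical.h)
    if side == "second":   # VS3, extends above the second side edge 1022
        return Rect(physical.x, physical.y - physical.h * ratio,
                    physical.w, physical.h * ratio)
    raise ValueError(f"unknown side: {side}")

# Example: a physical screen occupying a 320 x 180 region of the image.
vs1 = virtual_screen(Rect(100, 60, 320, 180), "first")   # 160 x 180, to the right
```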
Conversely, when the center point CV moves from the position of fig. 3 back to the position of fig. 2, the processing circuit 200C determines that the center point CV has moved from outside the display module 102 back to inside the display module 102 across the positioning element 151. Thus, the computing circuit 1041 and the processing circuit 200C can control the augmented reality glasses 200 to close the first virtual screen VS1.
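The display and close behaviour of the last two paragraphs amounts to a small piece of state tracking. The sketch below assumes a hypothetical glasses object with show/close methods; these are not interfaces defined by the present disclosure.

```python
class FirstScreenController:
    """Track whether the first virtual screen VS1 is shown, based on where
    the center point CV sits relative to positioning element 151."""

    def __init__(self, glasses):
        self.glasses = glasses      # object with hypothetical show/close methods
        self.shown = False

    def update(self, cv_inside_display: bool, cv_beyond_151: bool):
        if not self.shown and cv_beyond_151:
            # CV moved from inside the display module to beyond element 151
            self.glasses.show_virtual_screen("VS1")    # hypothetical call
            self.shown = True
        elif self.shown and cv_inside_display:
            # CV moved back across element 151 into the display module
            self.glasses.close_virtual_screen("VS1")   # hypothetical call
            self.shown = False
```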
Referring to fig. 2 and 4, fig. 4 is another image captured by the image capturing module 204 when the user turns his head to the left according to an embodiment of the disclosure. When the user turns his head to look to the left, the center point CV moves from the position of fig. 2 to the position of fig. 4. At this time, the processing circuit 200C determines that the image center point (center point CV) has moved from inside the display module 102 to beyond the yellow positioning element 153 of the positioning assembly 150, and the processing circuit 200C transmits the position data to the computing circuit 1041.
Then, the computing circuit 1041 receives the position data, calculates projection data, and transmits the projection data to the processing circuit 200C. The processing circuit 200C controls the optical projector 206 and the optical guiding element 208 to project a second virtual screen VS2 onto the retina of the user according to the projection data, as shown in fig. 4. The second virtual screen VS2 extends from the third side 1023 of the display module 102.
As shown in fig. 4, the second virtual screen VS2 extends along the long axis direction, and a third length L3 of the second virtual screen VS2 in the long axis direction is less than or equal to one half of the second length L2.
Referring to fig. 2 and 5, fig. 5 is another image captured by the image capturing module 204 when the user raises his head according to an embodiment of the disclosure. When the user looks up, the center point CV moves from the position of fig. 2 to the position of fig. 5. At this time, the processing circuit 200C determines that the image center point (center point CV) has moved from inside the display module 102 to beyond the green positioning element 152 of the positioning assembly 150, and the processing circuit 200C transmits the position data to the computing circuit 1041.
Then, the computing circuit 1041 receives the position data, calculates projection data, and transmits the projection data to the processing circuit 200C. The processing circuit 200C controls the optical projector 206 and the optical guiding element 208 to project a third virtual screen VS3 onto the retina of the user according to the projection data, as shown in fig. 5. The third virtual screen VS3 extends from the second side 1022.
As shown in fig. 5, the third virtual screen VS3 has a first height H1 in the Y-axis direction, the physical screen 1020 of the display module 102 has a second height H2, and the first height H1 is less than or equal to one half of the second height H2, but is not limited thereto. In some embodiments, the second height H2 may also be the height of the display module 102 in the Y-axis direction.
Referring to fig. 6, fig. 6 is another image of the portable electronic device 100 captured by the image capturing module 204 of the augmented reality glasses 200 according to an embodiment of the disclosure. When the user's head is tilted, the field of view FOV is also tilted. In this embodiment, the augmented reality glasses 200 may define a transverse axis TX, so the image captured by the image capturing module 204 also has the transverse axis TX. When the center point CV moves beyond the positioning element 151, the processing circuit 200C controls the first virtual screen VS1 to extend along the transverse axis direction TX. In this embodiment, an acute angle AG is formed between the first virtual screen VS1 and the first side edge 1021.
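As an illustrative sketch only, the acute angle AG between the first virtual screen VS1 (which follows the transverse axis TX of the image) and the first side edge 1021 could be estimated from two points of that edge detected in the captured image; the point format and the geometry below are assumptions.

```python
import math

def angle_vs1_to_first_side(edge_p1, edge_p2):
    """Angle between VS1 (extending along the transverse axis TX, i.e. the
    image x-axis) and the first side edge 1021 as it appears in the image.
    Returns 90 degrees when the head is level and an acute angle AG when
    the head is tilted."""
    dx = edge_p2[0] - edge_p1[0]
    dy = edge_p2[1] - edge_p1[1]
    edge_angle = abs(math.degrees(math.atan2(dy, dx))) % 180.0
    return min(edge_angle, 180.0 - edge_angle)

# Example: the first side edge appears tilted 15 degrees from vertical.
print(angle_vs1_to_first_side(
    (300, 50),
    (300 + 50 * math.sin(math.radians(15)), 50 + 50 * math.cos(math.radians(15)))))
# ~75.0
```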
Referring to fig. 7, fig. 7 is another image of the portable electronic device 100 captured by the image capturing module 204 according to an embodiment of the disclosure. When the user's hand HD makes a gesture within the field of view FOV, the processing circuit 200C further determines the gesture according to the image. For example, as shown in fig. 7, the processing circuit 200C determines that the gesture is an upward gesture according to the image, so the processing circuit 200C and the computing circuit 1041 can control the augmented reality glasses 200 to display the third virtual screen VS3 according to the gesture. The remaining gestures actuate the corresponding virtual screens in the same way.
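Only the upward gesture is described explicitly above, so the mapping below is merely one way the remaining gestures could be handled; the directions other than "up" and the method name are assumptions.

```python
# Hypothetical mapping from a recognized gesture direction to the virtual
# screen the processing circuit would ask the glasses to display.
GESTURE_TO_SCREEN = {
    "up": "VS3",       # upward gesture -> third virtual screen (as in fig. 7)
    "right": "VS1",    # assumed: rightward gesture -> first virtual screen
    "left": "VS2",     # assumed: leftward gesture -> second virtual screen
}

def handle_gesture(glasses, gesture: str) -> None:
    screen = GESTURE_TO_SCREEN.get(gesture)
    if screen is not None:
        glasses.show_virtual_screen(screen)   # hypothetical call, as above
```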
It is to be noted that the height of the third virtual screen VS3 in the Y-axis direction does not exceed the field of view FOV. In addition, in other embodiments, the processing circuit 200C and the computing circuit 1041 may also control the augmented reality glasses 200 to display the virtual screen according to an input signal of a mouse device (not shown) or an input signal of a keyboard of the host module 104.
Referring to fig. 8, fig. 8 is a schematic diagram of a display module 102 and a positioning assembly 150 according to another embodiment of the disclosure. In this embodiment, the positioning assembly 150 includes four positioning elements 154-157 respectively disposed at the four corners of the display module 102 and outside the physical screen 1020. The positioning elements 154-157 can be dots, and the positioning element 155 has a mark different from the positioning elements 154, 156, and 157, such as an X-shaped mark. In other embodiments, the positioning element 155 may have a shape or color different from the positioning elements 154, 156, and 157.
Based on the design of the positioning elements 154-157, the computing circuit 1041 and the processing circuit 200C can determine whether to display the first virtual screen VS1, the second virtual screen VS2, or the third virtual screen VS3 according to the relative position of the center point CV. For example, when the center point CV moves from inside the display module 102 beyond the line connecting the positioning elements 154 and 155, the first virtual screen VS1 is displayed.
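With the corner markers of fig. 8, moving beyond the line connecting elements 154 and 155 reduces to a side-of-line test. The sketch below uses the sign of a 2D cross product; the sign convention and the example coordinates (elements 154 and 155 assumed at the two right-hand corners) are assumptions for illustration only.

```python
def crossed_line(p154, p155, cv_prev, cv_now):
    """True when the center point CV crosses the line through positioning
    elements 154 and 155 between two consecutive frames (detected as a sign
    change of a 2D cross product)."""
    def side(p):
        return ((p155[0] - p154[0]) * (p[1] - p154[1])
                - (p155[1] - p154[1]) * (p[0] - p154[0]))
    return side(cv_prev) * side(cv_now) < 0

# Example: 154 and 155 assumed at the two right-hand corners of the display;
# CV moves from inside the display to the right of the corner line.
print(crossed_line((300, 50), (300, 250), (200, 150), (340, 150)))  # True
```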
It is noted that, in other embodiments, the positioning elements 154-157 can also be pixels displayed on the physical screen 1020 and located at the four corners of the physical screen 1020.
In addition, in some embodiments, the display module 102 may also be provided with a camera lens (not shown) configured to capture images of the user. Thus, the computing circuit 1041 can obtain more accurate position data from the images captured by both the camera lens and the image capturing module 204.
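How the two image sources would be combined is not specified here; the naive weighted average below is only an assumed illustration of refining the position data, not a method taken from this disclosure.

```python
def fuse_positions(pos_from_glasses, pos_from_camera, weight_glasses=0.5):
    """Blend the CV position estimated from the glasses' image capturing
    module with the one estimated from the display module's camera lens.
    Equal weighting is an arbitrary assumption, not taken from the patent."""
    w = weight_glasses
    return tuple(w * g + (1.0 - w) * c
                 for g, c in zip(pos_from_glasses, pos_from_camera))

print(fuse_positions((320.0, 240.0), (316.0, 244.0)))   # (318.0, 242.0)
```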
Next, referring to fig. 9, fig. 9 is a flowchart of a method S400 for displaying a virtual screen by using the augmented reality glasses 200 according to an embodiment of the present disclosure. In step S402, the positioning assembly 150 is disposed on the display module 102. The positioning assembly 150 may be implemented by the positioning elements 151-153 of FIG. 2 or the positioning elements 154-157 of FIG. 8, but is not limited thereto. In step S404, at least one image of the display module 102 is captured by the image capturing module 204.
In step S406, the processing circuit 200C determines a position of an image center point (center point CV) of the at least one image relative to the positioning assembly 150 according to the at least one image. In step S408, when the position of the image center point moves from inside the display module 102 to beyond the positioning assembly 150, the augmented reality glasses 200 are controlled to display a virtual screen, and the virtual screen extends from one side of the display module 102 (e.g., the first side 1021).
In addition, in step S410, when the position of the center point (center point CV) of the image moves from the outside of the display module 102 to the inside of the display module 102, the augmented reality glasses 200 are controlled to close the virtual screen.
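Putting steps S404 through S410 together, the method can be read as the control loop sketched below; the capture, analyze_frame, and glasses helpers are hypothetical placeholders rather than interfaces defined by the present disclosure.

```python
def run_method_s400(capture, analyze_frame, glasses):
    """Control-loop sketch of steps S404-S410 (runs until interrupted).

    analyze_frame(frame) is assumed to return ("inside", None) when the
    center point CV is inside the display module, or ("beyond", side) when
    it has moved beyond the positioning assembly toward a given side.
    """
    shown_side = None
    while True:
        frame = capture()                              # S404: capture an image
        state, side = analyze_frame(frame)             # S406: locate CV vs. assembly 150
        if state == "beyond" and shown_side is None:
            glasses.show_virtual_screen(side)          # S408: display virtual screen
            shown_side = side
        elif state == "inside" and shown_side is not None:
            glasses.close_virtual_screen(shown_side)   # S410: close virtual screen
            shown_side = None
```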
The present disclosure provides an augmented reality system 50 including a portable electronic device 100, augmented reality glasses 200, and an expansion device 300. The user may wear the augmented reality glasses 200 and use the portable electronic device 100 at the same time. The portable electronic device 100 is provided with a positioning assembly 150, and the processing circuit 200C can determine the position of the center point CV of the user's field of view FOV relative to the positioning assembly 150 according to the images provided by the image capturing module 204 of the augmented reality glasses 200, and then control the optical display module 205 to display the corresponding virtual screen.
Therefore, based on the design of the present disclosure, the user does not need to purchase a physical screen to install on the portable electronic device 100; the mobility of the portable electronic device 100 is maintained while the benefit of an expanded screen is still achieved.
Although embodiments of the present disclosure and their advantages have been disclosed above, it should be understood that various changes, substitutions, and alterations can be made herein by those skilled in the art without departing from the spirit and scope of the disclosure. Moreover, the scope of the present disclosure is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods, and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the present disclosure, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized. Accordingly, the scope of the present disclosure includes such processes, machines, manufacture, compositions of matter, means, methods, and steps. In addition, each claim constitutes a separate embodiment, and the scope of protection of the present disclosure also includes combinations of the respective claims and embodiments.
Claims (10)
1. An augmented reality system comprising:
the portable electronic device is provided with a display module and a positioning component, wherein the positioning component is arranged on the display module;
the augmented reality glasses comprise an image capturing module, wherein the image capturing module is configured to capture at least one image of the display module; and
a processing circuit configured to determine a position of an image center point of the at least one image relative to the positioning component according to the at least one image;
when the processing circuit judges that the position of the image center point moves from the display module to exceed the positioning component, the processing circuit controls the augmented reality glasses to display a virtual screen, wherein the virtual screen extends from a first side edge of the display module.
2. The augmented reality system of claim 1, wherein the display module has a first side, a second side and a third side, the positioning assembly comprises three positioning elements having a strip structure, the positioning elements are respectively disposed on the first, second and third sides, and the positioning elements have different colors.
3. The augmented reality system of claim 2, wherein the first side and the third side are arranged along a long axis direction, the virtual screen extends along the long axis direction, the virtual screen has a first length in the long axis direction, the display module has a second length in the long axis direction, and the first length is less than or equal to one half of the second length.
4. The augmented reality system of claim 2, wherein the at least one image defines a transverse axis, the virtual screen extends along the transverse axis, and an acute angle is formed between the virtual screen and the first side.
5. The augmented reality system of claim 1, wherein the positioning assembly comprises four positioning elements respectively disposed at four corners of the display module, and one of the positioning elements has a mark, shape or color different from the other three positioning elements.
6. The augmented reality system of claim 1, wherein the processing circuit is further configured to determine a gesture according to the at least one image, and the processing circuit controls the augmented reality glasses to display the virtual screen according to the gesture.
7. The augmented reality system of claim 1, wherein the processing circuit controls the augmented reality glasses to close the virtual screen when the processing circuit determines that the position of the center point of the image moves from outside the display module to inside the display module.
8. The augmented reality system of claim 1, wherein the augmented reality glasses further comprise an optical display module electrically connected to the processing circuit, and the optical display module comprises:
an optical guiding element; and
an optical projector configured to emit a projection image to the optical guiding element according to a projection signal provided by the processing circuit.
9. A method of displaying a virtual screen with augmented reality glasses, comprising:
arranging a positioning component on a display module;
capturing at least one image of the display module;
judging a position of an image center point of the at least one image relative to the positioning component according to the at least one image; and
when the position of the image center point moves from the display module to exceed the positioning component, the augmented reality glasses are controlled to display a virtual screen, wherein the virtual screen extends from one side of the display module.
10. The method of displaying a virtual screen using augmented reality glasses according to claim 9, further comprising:
and when the position of the image center point moves from the outside of the display module to the inside of the display module, controlling the augmented reality glasses to close the virtual screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010380858.0A CN113625866A (en) | 2020-05-08 | 2020-05-08 | Augmented reality system and method for displaying virtual screen by using augmented reality glasses |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010380858.0A CN113625866A (en) | 2020-05-08 | 2020-05-08 | Augmented reality system and method for displaying virtual screen by using augmented reality glasses |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113625866A (en) | 2021-11-09 |
Family
ID=78377126
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010380858.0A Withdrawn CN113625866A (en) | 2020-05-08 | 2020-05-08 | Augmented reality system and method for displaying virtual screen by using augmented reality glasses |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113625866A (en) |
2020-05-08: Application CN202010380858.0A filed in China (CN); published as CN113625866A; status: not active (withdrawn).
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104049363A (en) * | 2013-03-11 | 2014-09-17 | Seiko Epson Corporation | Image display system and head-mounted display device |
CN106133645A (en) * | 2014-01-17 | 2016-11-16 | Sony Interactive Entertainment America LLC | Using a second screen as a private tracking heads-up display |
CN107168513A (en) * | 2017-03-22 | 2017-09-15 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic equipment |
TW201937462A (en) * | 2018-02-22 | 2019-09-16 | Oriental Institute of Technology | Augmented reality interactive language learning device |
CN110208947A (en) * | 2019-06-03 | 2019-09-06 | Goertek Inc. | Display device and display method based on human eye tracking |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11366516B2 (en) | Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device | |
US11307659B2 (en) | Low-power eye tracking system | |
US11249315B2 (en) | Augmented reality system and method of displaying virtual screen using augmented reality glasses | |
EP3195595B1 (en) | Technologies for adjusting a perspective of a captured image for display | |
US10698475B2 (en) | Virtual reality interaction method, apparatus and system | |
WO2022022141A1 (en) | Image display method and apparatus, and computer device and storage medium | |
US20200293258A1 (en) | Layout identification method for display wall, and electronic device using the same | |
US12019797B2 (en) | Gaze tracking for a retinal projection display system | |
US11682183B2 (en) | Augmented reality system and anchor display method thereof | |
US20230054450A1 (en) | Retinal projection display system | |
CN113625866A (en) | Augmented reality system and method for displaying virtual screen by using augmented reality glasses | |
KR20240090407A (en) | Dual system on chip eyewear | |
CN114201028B (en) | Augmented reality system and method for anchoring display virtual object thereof | |
CN113703161B (en) | Augmented reality system and anchoring display method thereof | |
US11380071B2 (en) | Augmented reality system and display method for anchoring virtual object thereof | |
CN114253389B (en) | Augmented reality system integrating motion sensor and augmented reality display method | |
US20240333900A1 (en) | Display device and operating method thereof | |
CN107015708A (en) | Optical touch control apparatus and optical touch control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WW01 | Invention patent application withdrawn after publication | Application publication date: 20211109 |