CN108038916B - Augmented reality display method - Google Patents

Augmented reality display method

Info

Publication number
CN108038916B
Authority
CN
China
Prior art keywords
screen
augmented reality
input
display screen
virtual display
Prior art date
Legal status
Active
Application number
CN201711443850.9A
Other languages
Chinese (zh)
Other versions
CN108038916A
Inventor
李树欣
张锐
Current Assignee
Shanghai Laini Intelligent Technology Co ltd
Original Assignee
Shanghai Laini Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Laini Intelligent Technology Co ltd
Priority to CN201711443850.9A
Publication of CN108038916A
Application granted
Publication of CN108038916B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Abstract

The embodiment of the invention relates to the field of augmented reality and discloses an augmented reality display method. The method comprises the following steps: acquiring the two-dimensional plane in which the screen of an input device lies; acquiring a target spatial position according to the two-dimensional plane; and using the target spatial position as the spatial position at which a virtual display screen is superimposed on the real environment, the virtual display screen displaying the content input through the input device. The position of the virtual display screen can thus be acquired quickly, and an input device with display capability interacts directly with the augmented reality glasses, which improves the applicability of augmented reality.

Description

Augmented reality display method
Technical Field
Embodiments of the invention relate to the field of augmented reality, and in particular to an augmented reality display method.
Background
Augmented reality refers to presenting the real-world environment through an image display device and adding computer-generated sensory input such as sound, video, and graphics. As a novel human-computer interface and simulation tool, augmented reality is attracting increasing attention; it plays an important role in many fields, provides a powerful means for extending human capability, and has a profound influence on production and social life. Augmented reality glasses, as a new class of computing device, are already widely used in military, industrial, medical, scientific, and other fields. They combine computer graphics, simulation, and display technologies to superimpose computer-generated virtual three-dimensional graphics on the real scene, so that digital content can be displayed and controlled intuitively. Because the digital content must be superimposed on the current real environment, the glasses need to acquire the spatial coordinates of that environment through a sensor system. When the glasses scan the environment, an augmented reality environment must be created by combining computer graphics, simulation, and display technologies, and this environment then interacts with the real environment.
However, the inventors found that the prior art has at least the following problems: generating the augmented reality environment requires scanning the real environment many times with structured light and building a model, which is technically complex; moreover, when a virtual display screen is loaded it often falls outside the user's field of view, so the user must search for its position, which degrades the user experience considerably. In other words, in the prior art it is difficult to acquire the position of the virtual display screen quickly, and the technical implementation is complicated.
Another problem augmented reality glasses must solve is information interaction with the user. As wearable devices, they generally rely on physical keys, a touch pad, or a wireless remote control integrated on the glasses as the input device, together with a separately running set of application software to drive that interactive device.
Disclosure of Invention
The embodiment of the invention aims to provide an augmented reality display method that can quickly acquire the position of a virtual display screen and uses an input device with display capability to interact directly with augmented reality glasses, thereby improving the applicability of augmented reality.
In order to solve the above technical problem, an embodiment of the present invention provides an augmented reality display method, comprising:
acquiring the two-dimensional plane in which the screen of an input device lies;
acquiring a target spatial position according to the two-dimensional plane;
using the target spatial position as the spatial position at which a virtual display screen is superimposed on the real environment; wherein the virtual display screen displays the content input by the input device.
Compared with the prior art, embodiments of the invention acquire the two-dimensional plane in which the screen of the input device lies, acquire a target spatial position according to that plane, and use the target spatial position as the spatial position at which a virtual display screen is superimposed on the real environment, the virtual display screen displaying the content input by the input device. By acquiring the screen plane directly, the spatial position of the virtual display screen in the real environment can be obtained quickly; this avoids scanning a complex real environment and then superimposing the virtual display screen based on the recognition result, which greatly reduces the technical workload. At the same time, the content the user enters through the input device can be displayed directly on the virtual display screen. Because the input device has a screen with display capability, relatively complex information interaction can take place directly with the augmented reality glasses, which improves the applicability of augmented reality.
In addition, acquiring the two-dimensional plane in which the screen of the input device lies specifically includes: scanning a feature image displayed on the screen of the input device; extracting feature points from the feature image; and acquiring the two-dimensional plane of the screen from the extracted feature points. Because the feature points of the feature image are distinctive, the extracted points are relatively stable, and a two-dimensional plane computed from stable feature points is itself stable, so the resulting spatial position of the virtual display screen is stable and the user experience improves markedly.
In addition, after the target spatial position is used as the spatial position at which the virtual display screen is superimposed on the real environment, the method further includes: receiving control information input by the input device; and controlling the display state of the virtual display screen according to the control information. The control information can change display states such as the size and shape of the virtual display screen, so the screen can be adjusted to actual requirements, which is convenient for the user.
In addition, after controlling the display state of the virtual display screen, the method further includes: determining whether a screen locking signal sent by the input device has been received; and, if so, locking the virtual display screen. Once the virtual display screen is locked, the user can treat the input device as an everyday device such as a keyboard, mouse, or remote controller and operate the locked virtual display screen intuitively and conveniently, making the augmented reality display method more flexible.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals refer to similar elements and which are not drawn to scale unless otherwise specified.
Fig. 1 is a flowchart illustrating a display method of augmented reality according to a first embodiment of the present invention;
fig. 2 (a) is a schematic diagram of a feature image displayed on a screen of an input device according to a first embodiment of the present invention;
fig. 2 (b) is a schematic diagram of a feature image displayed on a screen of an input device according to a first embodiment of the present invention;
fig. 2 (c) is a schematic diagram of a feature image displayed on a screen of an input device according to a first embodiment of the present invention;
fig. 3 is a schematic diagram of a practical application of the augmented reality display method according to the first embodiment of the present invention;
fig. 4 is a flowchart illustrating an augmented reality display method according to a second embodiment of the present invention;
FIG. 5 is a schematic view of a setup parameter interface of an input device according to a second embodiment of the invention;
FIG. 6 is a schematic illustration of the variation in size and position of a virtual display screen according to a second embodiment of the present invention;
FIG. 7 is a schematic diagram of an input device according to a second embodiment of the present invention used as a mouse;
FIG. 8 is a schematic diagram of an input device according to a second embodiment of the present invention used as a keyboard;
fig. 9 is a flowchart illustrating an augmented reality display method according to a third embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention more apparent, the embodiments are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the various embodiments in order to provide a better understanding of the present application; the technical solution claimed in the present application can, however, be implemented without these technical details, and with various changes and modifications based on the following embodiments.
A first embodiment of the present invention relates to an augmented reality display method. Its core is to acquire the two-dimensional plane in which the screen of the input device lies; acquire a target spatial position according to the two-dimensional plane; and use the target spatial position as the spatial position at which a virtual display screen is superimposed on the real environment, the virtual display screen displaying the content input by the input device. The position of the virtual display screen can thus be acquired quickly, and an input device with display capability interacts directly with the augmented reality glasses, improving the applicability of augmented reality. Implementation details of the method are described below; they are provided only for ease of understanding and are not essential to implementing this embodiment.
Fig. 1 is a schematic flowchart of the augmented reality display method according to the first embodiment. The method may be applied to augmented reality glasses and specifically includes the following steps.
Step 101: Acquire the two-dimensional plane in which the screen of the input device lies.
Specifically, the augmented reality glasses acquire the two-dimensional plane in which the screen of the input device lies. The input device may be any intelligent device with a screen capable of displaying images, such as a personal computer, mobile phone, or tablet computer. In this embodiment the input device is described as a mobile phone by way of example, although practical applications are not limited to this.
The augmented reality glasses can scan the screen of the mobile phone and extract feature points from it. A feature point is a point that is distinguishable from the other points in an image: a pure white image, for example, has no feature points, but if a few black dots appear on it, those dots can serve as feature points because they differ from every other point in the image. Feature points on the mobile phone screen can be points on the phone's frame or points at the black-white boundary around the screen. At least the coordinates of four points are needed to determine a two-dimensional plane, so at least four feature points are required to obtain the plane in which the phone screen lies; the glasses therefore extract the coordinates of at least four points on the screen, and the more feature points extracted, the more accurate the acquired plane. For example, if the extracted feature points have screen coordinates (u1, v1), (u2, v2), …, (un, vn), the two-dimensional plane in which the phone screen lies can be obtained from these coordinates.
The augmented reality glasses can also scan a feature image displayed on the phone screen and extract feature points from that image. The feature image may be as shown in fig. 2 (a), 2 (b), or 2 (c); this embodiment gives only three possible feature images, and in practice the feature image may be any pattern, number, letter, and so on. Because the feature points of such a feature image are prominent, the glasses can easily extract many stable feature points, and the two-dimensional plane acquired from many stable feature points is comparatively accurate and stable.
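By way of illustration only, the following Python sketch (using the OpenCV library) shows one way the feature points of a displayed feature image could be matched against the camera view of the glasses and used to recover the phone-screen plane as a homography. The helper names, the ORB detector, and the matching strategy are assumptions made for this sketch and are not prescribed by the patent.

```python
# A minimal sketch, assuming OpenCV and NumPy are available; the helper names
# and the ORB-based matching are illustrative, not the patented method itself.
import cv2
import numpy as np

def match_feature_image(reference_img, camera_frame, min_matches=4):
    """Match feature points of the known feature image against the camera frame.

    Returns pixel coordinates (u1, v1) ... (un, vn) of matched points in both
    images; at least four are needed to constrain the screen plane.
    """
    orb = cv2.ORB_create(nfeatures=500)
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)
    kp_cam, des_cam = orb.detectAndCompute(camera_frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_ref, des_cam), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise RuntimeError("not enough feature points to determine the screen plane")
    ref_pts = np.float32([kp_ref[m.queryIdx].pt for m in matches])
    cam_pts = np.float32([kp_cam[m.trainIdx].pt for m in matches])
    return ref_pts, cam_pts

def estimate_screen_plane(ref_pts, cam_pts):
    """Fit the two-dimensional screen plane as a homography from >= 4 point pairs."""
    H, inlier_mask = cv2.findHomography(ref_pts, cam_pts, cv2.RANSAC, 5.0)
    return H, inlier_mask
```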
It should be noted that this embodiment provides only two methods of acquiring the two-dimensional plane in which the screen of the input device lies; practical applications are not limited to these, and any method by which the augmented reality glasses acquire that plane falls within the protection scope of this embodiment.
Step 102: Acquire the target spatial position according to the two-dimensional plane.
Specifically, the augmented reality glasses can acquire a first spatial position of the mobile phone from the two-dimensional plane in which its screen lies, and then acquire the target spatial position from that first spatial position, the target spatial position being understood as the spatial position of the virtual display screen. For example, when the camera in the glasses scans the phone screen, a camera coordinate system (Xc, Yc, Zc) centred on the camera can be established, with its origin at the optical centre of the lens, its X and Y axes parallel to the two sides of the image plane, and its Z axis along the optical axis, perpendicular to the image plane; coordinates in this camera coordinate system describe the first spatial position of the phone. At the same time a world coordinate system (Xw, Yw, Zw) referenced to the phone screen is established, with one of the extracted feature points as its origin, its XY plane coincident with the plane of the phone, and its Z axis perpendicular to that plane; coordinates in this world coordinate system describe the target spatial position of the virtual display screen. The transformation between the two systems can be established algorithmically to obtain the rotation matrix R and translation matrix T from the world coordinate system to the camera coordinate system: the homography between the world plane and the camera plane is computed with a direct linear transformation algorithm, from which R and T are solved. Alternatively, R and T can be computed by nonlinear optimization. In this way the target spatial position is obtained from the first spatial position of the phone.
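As a hedged illustration of the coordinate-system relationship described above, the sketch below decomposes a plane-to-image homography into candidate rotation matrices R and translation vectors T using OpenCV. The example camera-matrix values and the way a candidate is finally selected are assumptions of the sketch, and the patent leaves open whether a direct linear transformation or nonlinear optimization is used.

```python
# A minimal sketch, assuming a calibrated camera matrix K (placeholder values
# below) and the homography H from the previous sketch; choosing among the
# candidate decompositions is left to the caller and is not the patent's
# prescribed algorithm.
import cv2
import numpy as np

def pose_candidates_from_homography(H, K):
    """Decompose the world-plane-to-image homography into candidate (R, T, n)
    triples mapping the phone-screen (world) frame into the camera frame."""
    num_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    # Up to four candidates are returned; the physically valid one is normally
    # selected by requiring that the screen's points lie in front of the camera
    # (positive depth) and that the plane normal matches the observed screen.
    return list(zip(rotations, translations, normals))

# Placeholder intrinsics (focal lengths and principal point are assumptions):
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
```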
To obtain the spatial position of the virtual display screen, the coordinate transformation may use the four or more extracted feature points. Given the coordinates (u1, v1), (u2, v2), …, (un, vn) of the feature points in the two-dimensional plane of the phone screen, these can be converted into camera coordinates (X1c, Y1c, Z1c), (X2c, Y2c, Z2c), …, (Xnc, Ync, Znc) using the camera's intrinsic parameters. If the phone screen, or the feature image displayed on it, has been measured in advance (its size and similar characteristics), the position of each corresponding feature point in the world coordinate system, (X1w, Y1w, Z1w), (X2w, Y2w, Z2w), …, (Xnw, Ynw, Znw), can also be found.
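The conversion from pixel coordinates to a pose in the world coordinate system, assuming the physical size of the screen (or feature image) is known in advance as the description suggests, could look like the following sketch; the corner ordering, units, and distortion model are illustrative assumptions.

```python
# A minimal sketch, assuming the physical screen (or feature-image) size is
# known in advance; corner ordering, metre units, and zero distortion are
# illustrative assumptions.
import cv2
import numpy as np

def target_pose_from_known_size(corner_px, screen_w, screen_h, K, dist=None):
    """Compute R and T of the screen plane relative to the camera.

    corner_px: pixel coordinates (u, v) of the four screen corners, ordered
               top-left, top-right, bottom-right, bottom-left.
    """
    if dist is None:
        dist = np.zeros(5)
    # World coordinates of the corners: XY plane on the screen, origin at the
    # top-left feature point, Z = 0 on the screen plane.
    world_pts = np.array([[0.0, 0.0, 0.0],
                          [screen_w, 0.0, 0.0],
                          [screen_w, screen_h, 0.0],
                          [0.0, screen_h, 0.0]])
    ok, rvec, tvec = cv2.solvePnP(world_pts,
                                  np.asarray(corner_px, dtype=np.float64),
                                  K, dist)
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix R; tvec is the translation T
    return ok, R, tvec
```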
It should be noted that this embodiment provides only two ways of obtaining the target spatial position from the two-dimensional plane; practical applications are not limited to these, and any method of obtaining the target spatial position based on the two-dimensional plane falls within the protection scope of the embodiment of the present invention.
Step 103: Project the virtual display screen to the target spatial position.
Specifically, the target spatial position obtained in step 102 is used as the spatial position at which the virtual display screen is superimposed on the real environment; that is, the augmented reality glasses project the virtual display screen to the target spatial position, and the virtual display screen can display the content input by the input device. Once the spatial position is determined, the size and shape of the virtual display screen may take default values or values set through the mobile phone. For example, the relationship among the augmented reality glasses 301, the virtual display screen 302, and the mobile phone 303 may be as shown in fig. 3: the content input by the phone 303 is displayed on the virtual display screen 302. The user sees the virtual display screen 302 above the phone, showing the content of the phone's screen, and the virtual display screen 302 moves as the phone 303 moves, which is equivalent to giving the phone 303 a portable large screen.
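A minimal sketch of this projection step is given below: given the pose (R, T) recovered earlier and the camera intrinsics, the corners of a virtual screen placed in the phone's world coordinate system are projected into the glasses' image. The screen size, its offset above the phone, and the axis convention are assumptions of the sketch, not values taken from the patent.

```python
# A minimal sketch of placing the virtual display screen: the corners of a
# rectangle hovering above the phone (expressed in the phone's world frame)
# are projected into the glasses' camera image using the recovered pose.
# The size, offset, and axis sign are illustrative assumptions.
import cv2
import numpy as np

def project_virtual_screen(R, T, K, width=0.30, height=0.20, offset=0.15):
    """Return image-space corner coordinates of the superimposed virtual screen."""
    half_w, half_h = width / 2.0, height / 2.0
    # Virtual-screen corners in the phone's world frame, lifted off the phone
    # plane along the Z axis (sign depends on the chosen axis convention).
    corners_world = np.array([[-half_w, -half_h, -offset],
                              [ half_w, -half_h, -offset],
                              [ half_w,  half_h, -offset],
                              [-half_w,  half_h, -offset]])
    rvec, _ = cv2.Rodrigues(R)
    img_pts, _ = cv2.projectPoints(corners_world, rvec, T, K, np.zeros(5))
    return img_pts.reshape(-1, 2)
```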
Compared with the prior art, this embodiment acquires the two-dimensional plane in which the screen of the input device lies, acquires a target spatial position from that plane, and uses the target spatial position as the spatial position at which the virtual display screen is superimposed on the real environment, the virtual display screen displaying the content input by the input device. Acquiring the screen plane directly makes it possible to obtain the spatial position of the virtual display screen in the real environment quickly, avoiding the relatively complex approach of scanning the real environment and then superimposing the virtual display screen based on the scan result, and thereby greatly reducing the technical workload. At the same time, the virtual display screen can directly display the content the user enters through the input device; because the input device has a screen with display capability, complex information interaction can take place directly with the augmented reality glasses, which improves the applicability of augmented reality.
A second embodiment of the present invention relates to an augmented reality display method. It is a further refinement of the first embodiment, the main improvement being that, after the target spatial position is used as the spatial position at which the virtual display screen is superimposed on the real environment, the method further comprises: receiving control information input by the input device; and controlling the display state of the virtual display screen according to the control information.
The flowchart of the augmented reality display method according to the second embodiment of the present invention is shown in fig. 4; the method specifically includes the following steps.
step 401: and acquiring a two-dimensional plane where a screen of the input device is located.
Step 402: and acquiring the space position of the target according to the two-dimensional plane.
Step 403: the virtual display screen is projected to a target spatial location.
Since steps 401 to 403 in this embodiment are substantially the same as steps 101 to 103 in the first embodiment, they are not repeated again.
Step 404: Receive control information input by the input device.
Specifically, the control information input by the input device may describe the size or shape of the virtual display screen, or its distance from the augmented reality glasses. For example, if the user feels the current virtual display screen is too small, the desired size can be entered on the input device and sent to the glasses; the glasses and the input device may be connected wirelessly or by wire, so the glasses can receive the control information. The user can also set the shape of the virtual display screen according to preference, such as a circle, a square, or any other shape. If the screen feels uncomfortably close, the distance can be adjusted until it is satisfactory.
For example, fig. 5 shows a schematic view of a parameter-setting interface on the input device. The input device may be a mobile phone, and display states such as the size and position of the virtual display screen are adjusted by entering parameters on the phone. To accommodate the user's various needs, the size, shape, and distance of the virtual display screen can all be changed. Button 501 on the phone screen moves the virtual display screen up, down, left, or right, or selects its shape; button 502 moves it nearer or farther; button 503 adjusts its size; and button 504 pages the smartphone interface forward and backward. This embodiment provides only one schematic parameter-setting interface, and the interface used in practice is not limited to it.
Step 405: Control the display state of the virtual display screen according to the control information.
Specifically, if the control information asks for a larger virtual display screen, the augmented reality glasses enlarge the screen to the size indicated by the control information; the position of the virtual display screen can likewise be changed according to control information entered on the phone. For example, fig. 6 illustrates a change of size and position: the user feels that neither the current position nor the current size of virtual display screen 601 meets his needs, enters control information on the phone to change them, and after receiving that information the glasses display the updated virtual display screen 602 at the desired position. The virtual display screen in this embodiment is drawn as a quadrilateral, but practical applications are not limited to that shape.
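One possible shape for the control-information handling described in steps 404 and 405 is sketched below; the message fields ("size", "shape", "distance", "move") and default values are hypothetical and only illustrate how control information from the input device could update the display state of the virtual screen.

```python
# A minimal sketch of handling control information; the message fields and
# value ranges are hypothetical and only illustrate the idea of steps 404-405.
from dataclasses import dataclass

@dataclass
class VirtualScreenState:
    width: float = 0.30          # metres
    height: float = 0.20
    distance: float = 0.50       # distance from the glasses
    shape: str = "rectangle"     # e.g. "rectangle", "circle"
    offset: tuple = (0.0, 0.0)   # up/down, left/right displacement

def apply_control_info(state: VirtualScreenState, control: dict) -> VirtualScreenState:
    """Update the virtual screen's display state from a control message."""
    if "size" in control:                       # e.g. resize via button 503
        state.width, state.height = control["size"]
    if "shape" in control:                      # shape chosen via button 501
        state.shape = control["shape"]
    if "distance" in control:                   # nearer/farther via button 502
        state.distance = max(0.2, control["distance"])
    if "move" in control:                       # up/down/left/right via button 501
        dx, dy = control["move"]
        state.offset = (state.offset[0] + dx, state.offset[1] + dy)
    return state
```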
Step 406: Determine whether a screen locking signal sent by the input device has been received; if so, execute step 407, otherwise end the process.
For example, if the user finds the current size, shape, and other display states of the virtual display screen appropriate and does not want its position to change when the mobile phone (the input device) moves, a screen locking signal can be sent to the augmented reality glasses through the phone. The glasses determine whether such a signal has been received; if it has, step 407 is executed, otherwise the process ends.
Step 407: Lock the virtual display screen.
Specifically, once the virtual display screen is locked, its position no longer changes when the input device moves. The input device can then be treated as a mouse, as shown in fig. 7: the phone takes on the function of a mouse and can operate on the content shown on the virtual display screen. The input device can equally be treated as a keyboard, as shown in fig. 8: the phone takes on the function of a keyboard and the user can type the content to be shown on the virtual display screen. The input device could also serve as a remote controller, a handle, and so on; this embodiment describes only the mouse and keyboard cases, and practical applications are not limited to them.
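A small sketch of the locking behaviour follows; the signal and event names are hypothetical. It only illustrates the idea that, once locked, the virtual screen's pose stops tracking the phone and the phone's input is instead forwarded as mouse/keyboard-style events.

```python
# A minimal sketch of the screen-locking behaviour described in step 407; the
# signal and event names are hypothetical.
class VirtualScreenSession:
    def __init__(self, initial_pose):
        self.pose = initial_pose      # (R, T) of the virtual screen
        self.locked = False

    def on_device_pose(self, new_pose):
        """Called whenever the input device (phone) moves."""
        if not self.locked:
            self.pose = new_pose      # screen follows the phone while unlocked

    def on_signal(self, signal):
        if signal == "lock_screen":
            self.locked = True        # freeze the screen at its current position
        elif signal == "unlock_screen":
            self.locked = False

    def on_input_event(self, event):
        """While locked, phone input acts like a mouse/keyboard for the screen."""
        if self.locked and event.get("type") in ("pointer_move", "click", "key"):
            return event              # forward to the virtual screen's UI layer
        return None
```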
In this embodiment the input device is a mobile phone only by way of example; practical applications are not limited to it.
Compared with the prior art, this embodiment lets the control information entered on the input device change display states such as the size and shape of the virtual display screen, so the screen can be adjusted to actual requirements, which is convenient for the user. Locking the virtual display screen lets the user treat the input device as an everyday device such as a keyboard, mouse, or remote controller, and use it to operate the locked virtual display screen intuitively and conveniently, making the augmented reality display method more flexible.
A third embodiment of the present invention relates to an augmented reality display method. It is a further refinement of the first embodiment, the main improvement being that, after the target spatial position is used as the spatial position at which the virtual display screen is superimposed on the real environment, the method further comprises: acquiring display content according to information input by the input device; and the virtual display screen displaying the content input by the input device then specifically means displaying the acquired display content.
Fig. 9 shows a schematic flow chart of an augmented reality display method according to a third embodiment of the present invention, which specifically includes:
step 901: and acquiring a two-dimensional plane where a screen of the input device is located.
Step 902: and acquiring the space position of the target according to the two-dimensional plane.
Step 903: the virtual display screen is projected to a target spatial location.
Since steps 901 to 903 in this embodiment are substantially the same as steps 101 to 103 in the first embodiment, they are not repeated again.
Step 904: Acquire display content according to the information input by the input device.
Specifically, the augmented reality glasses may acquire the information input by the input device by scanning the screen of the input device and then acquire display content from the scanned information. The input device may be a mobile phone, and the input information may be a two-dimensional code (QR code) shown on the phone screen: the glasses scan the code and obtain the content to be displayed, namely the content the code refers to. The input information may also be a web address shown on the phone screen in text or picture form: the glasses obtain the content to be displayed by scanning that address, and the content may be the corresponding web page or the address itself. If the input is just an ordinary picture, the display content acquired by scanning is the picture shown on the phone. In practice the phone can also be paired with the glasses over Bluetooth and send the input information to them directly, in which case the glasses acquire the information by receiving what the phone sends.
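As an illustration of acquiring display content by scanning the phone screen, the sketch below uses OpenCV's QR-code detector; the returned dictionary format and the fallback of mirroring the scanned picture are assumptions made for the example.

```python
# A minimal sketch of acquiring display content from a scanned camera frame,
# using OpenCV's QR-code detector; the returned dictionary format and the
# image fallback are assumptions made for this example.
import cv2

def acquire_display_content(camera_frame):
    """Return content for the virtual screen from a frame showing the phone."""
    detector = cv2.QRCodeDetector()
    text, points, _ = detector.detectAndDecode(camera_frame)
    if text:
        # The decoded payload may be a URL or other text; the caller decides
        # whether to show it directly or fetch the page it refers to.
        return {"type": "qr", "payload": text}
    # No QR code detected: fall back to mirroring the scanned picture itself.
    return {"type": "image", "payload": camera_frame}
```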
It is worth mentioning that when the augmented reality glasses receive account information entered by the user they can also perform operations such as logging in, adding, logging out, or deleting. For example, the content corresponding to the two-dimensional code may itself be a user name (account information); if the user logs in with that code, the corresponding user name can be shown on the virtual display screen together with an interface indicating whether the login succeeded. The glasses can add the account identified by the code as a friend, delete it from the friend list, or log it out. The account information can be obtained by scanning the code displayed on the input device, or the user can enter it directly on the input device, which then sends it to the glasses. In short, the information input by the input device may correspond to several display states, and the content shown on the virtual display screen may be the content represented by the image on the input device's screen or other content corresponding to the input information.
It should be noted that this embodiment provides only two ways for the augmented reality glasses to acquire the content to be displayed; practical applications are not limited to these, and any method by which the glasses acquire display content falls within the protection scope of the embodiment of the present invention.
Step 905: Display the display content on the virtual display screen.
For example, if the display content acquired by the glasses is the web page corresponding to the two-dimensional code, that page is shown on the virtual display screen. If it is the content of the site corresponding to a web address, that content is shown. If it is a video corresponding to a video link, the video can be played on the virtual display screen. If it is the user's account information, the virtual display screen can show the related login or logout interface, friend-adding interface, friend-deleting interface, account-cancellation interface, and so on. This embodiment lists only a few possible kinds of content; practical applications are not limited to them. It is also worth mentioning that the input device can send data in the form of abstract characters, images, sounds, and the like to the glasses over a wired or wireless channel, which extends the input capability of the input device and greatly improves the output capability of the glasses.
Compared with the prior art, the third embodiment acquires the display content from the information input by the input device, so the display content changes with the input: the user can enter different information as needed and see the desired content on the virtual display screen. This greatly improves the input and output capabilities of the augmented reality glasses, creates a flexibly operable augmented reality environment, and is convenient for the user.
The steps of the above methods are divided only for clarity of description; in implementation they may be merged into a single step or a step may be split into several, and as long as the same logical relationship is preserved such variants fall within the protection scope of this patent. Adding insignificant modifications to, or introducing insignificant designs into, an algorithm or process without changing its core design also falls within the protection scope of the patent.
Those skilled in the art will understand that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing the relevant hardware; the program is stored in a storage medium and includes instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples of practicing the invention, and that various changes in form and detail may be made therein without departing from the spirit and scope of the invention in practice.

Claims (8)

1. An augmented reality display method, comprising:
acquiring a two-dimensional plane in which a screen of an input device lies;
acquiring a target spatial position according to the two-dimensional plane;
using the target spatial position as a spatial position at which a virtual display screen is superimposed on a real environment; wherein the virtual display screen displays content input by the input device;
wherein, after superimposing the virtual display screen at the target spatial position in the real environment, the method further comprises:
receiving control information input by the input device;
controlling a display state of the virtual display screen according to the control information;
after the controlling the display state of the virtual display screen, the method further comprises:
determining whether a screen locking signal sent by the input device has been received; the screen locking signal being a signal sent by the input device after it is determined that the display state of the virtual display screen no longer needs to be adjusted;
if the screen locking signal is received, locking the virtual display screen; after the virtual display screen is locked, the input device functions as a mouse, a keyboard, a remote controller, or a handle, and the position of the virtual display screen does not change with movement of the input device.
2. The augmented reality display method according to claim 1, wherein the acquiring a target spatial position according to the two-dimensional plane specifically comprises:
acquiring a first spatial position of the input device according to the two-dimensional plane;
and acquiring the target spatial position according to the first spatial position.
3. The augmented reality display method according to claim 1, wherein the acquiring a two-dimensional plane in which a screen of the input device lies specifically comprises:
scanning the screen of the input device;
extracting feature points on the screen of the input device;
and acquiring the two-dimensional plane in which the screen of the input device lies according to the extracted feature points.
4. The augmented reality display method according to claim 1, wherein the acquiring a two-dimensional plane in which a screen of the input device lies specifically comprises:
scanning a feature image displayed on the screen of the input device;
extracting feature points on the feature image;
and acquiring the two-dimensional plane in which the screen of the input device lies according to the extracted feature points.
5. The augmented reality display method according to claim 1, wherein, after superimposing the virtual display screen at the target spatial position in the real environment, the method further comprises:
acquiring display content according to the information input by the input device;
and the virtual display screen displaying the content input by the input device specifically comprises: the virtual display screen displaying the acquired display content.
6. The augmented reality display method according to claim 5, wherein the acquiring display content according to the information input by the input device specifically comprises:
acquiring the information input by the input device by scanning the screen of the input device;
and acquiring the display content according to the scanned information.
7. The augmented reality display method according to claim 5, wherein the acquiring display content according to the information input by the input device specifically comprises:
acquiring the information input by the input device by receiving the information sent by the input device;
and acquiring the display content according to the sent information.
8. The augmented reality display method according to claim 5, wherein, after superimposing the virtual display screen at the target spatial position in the real environment, the method further comprises:
determining whether account information provided by the input device has been received;
and, if the account information is received, logging in, logging out, adding, or deleting the account information.
CN201711443850.9A 2017-12-27 2017-12-27 Augmented reality display method Active CN108038916B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711443850.9A CN108038916B (en) 2017-12-27 2017-12-27 Augmented reality display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711443850.9A CN108038916B (en) 2017-12-27 2017-12-27 Augmented reality display method

Publications (2)

Publication Number Publication Date
CN108038916A CN108038916A (en) 2018-05-15
CN108038916B true CN108038916B (en) 2022-12-02

Family

ID=62097980

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711443850.9A Active CN108038916B (en) 2017-12-27 2017-12-27 Augmented reality display method

Country Status (1)

Country Link
CN (1) CN108038916B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109085924A (en) * 2018-08-03 2018-12-25 百度在线网络技术(北京)有限公司 The management method and device of smart machine based on AR
JP2021174047A (en) * 2020-04-20 2021-11-01 キヤノンメディカルシステムズ株式会社 Medical image display device, medical image output device, medium for medical image, and medical image display program
CN112506463A (en) * 2020-12-04 2021-03-16 歌尔光学科技有限公司 Display method, device and equipment based on head-mounted equipment
CN116932119B (en) * 2023-09-15 2024-01-02 深圳市其域创新科技有限公司 Virtual screen display method, device, equipment and computer readable storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105026982A (en) * 2012-11-20 2015-11-04 微软技术许可有限责任公司 Head mount display and method for controlling the same

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8547401B2 (en) * 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
US20130215230A1 (en) * 2012-02-22 2013-08-22 Matt Miesnieks Augmented Reality System Using a Portable Device
GB2501567A (en) * 2012-04-25 2013-10-30 Christian Sternitzke Augmented reality information obtaining system
KR102182161B1 (en) * 2014-02-20 2020-11-24 엘지전자 주식회사 Head mounted display and method for controlling the same
CN104134229A (en) * 2014-08-08 2014-11-05 李成 Real-time interaction reality augmenting system and method
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
CN106355153B (en) * 2016-08-31 2019-10-18 上海星视度科技有限公司 A kind of virtual objects display methods, device and system based on augmented reality
CN206226634U (en) * 2016-10-13 2017-06-06 上海分众软件技术有限公司 It is wireless to throw screen system in real time
CN106683196A (en) * 2016-12-30 2017-05-17 上海悦会信息科技有限公司 Display method and system of augmented reality, and smart terminal

Also Published As

Publication number Publication date
CN108038916A (en) 2018-05-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant