CN112601067B - Augmented reality display device and display method thereof - Google Patents



Publication number
CN112601067B
CN112601067B (application CN202011458500.1A)
Authority
CN
China
Prior art keywords
display
image
augmented reality
display device
processing unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011458500.1A
Other languages
Chinese (zh)
Other versions
CN112601067A (en)
Inventor
闫桂新
孙建康
张�浩
陈丽莉
Current Assignee
BOE Technology Group Co Ltd
Beijing BOE Optoelectronics Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Optoelectronics Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Optoelectronics Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202011458500.1A priority Critical patent/CN112601067B/en
Publication of CN112601067A publication Critical patent/CN112601067A/en
Application granted granted Critical
Publication of CN112601067B publication Critical patent/CN112601067B/en


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/261: Image signal generators with monoscopic-to-stereoscopic image conversion
    • H04N 13/293: Generating mixed stereoscopic images; generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • H04N 13/30: Image reproducers
    • H04N 13/366: Image reproducers using viewer tracking
    • H04N 13/398: Synchronisation thereof; Control thereof
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides an augmented reality display device and a display method thereof, belonging to the field of display technology, which can at least partially solve the problem of poor user experience with existing augmented reality display devices. The augmented reality display device of the invention includes: an image acquisition assembly for acquiring an environment image of the surroundings and a viewing position, the viewing position being the position of the user's eyes; a processing unit for forming a fused picture from the environment image, the viewing position, and a virtual image stored in the processing unit; and a display assembly that displays the fused picture under the control of the processing unit.

Description

Augmented reality display device and display method thereof
Technical Field
The invention belongs to the technical field of display, and particularly relates to an augmented reality display device and a display method thereof.
Background
With the development of technology and the popularization of 5G, augmented reality (AR) technology has advanced at an unprecedented pace. AR refers to technical means that provide the user with additional information in the real world (the "augmentation"), organically fusing virtual content with real-world scenes. By deeply fusing computed information with the real environment, AR offers the user richer information and a more immersive experience.
In the prior art, AR technology is most commonly embodied in AR glasses. The virtual-real fusion such glasses can present is confined to the wearer's eyes; the load the glasses place on the head reduces the experience; and since most nearsighted and farsighted people already wear corrective glasses, wearing AR glasses over them is a significant problem. Most importantly, the lenses themselves affect how well the virtual content fuses with the surrounding environment, greatly reducing the user experience.
Disclosure of Invention
The invention at least partially solves the problem of poor user experience with existing augmented reality display devices by providing an augmented reality display device capable of improving the user experience.
The technical solution adopted to solve this technical problem is an augmented reality display device comprising: an image acquisition assembly for acquiring an environment image of the surroundings and a viewing position, the viewing position being the position of the user's eyes; a processing unit for forming a fused picture from the environment image, the viewing position, and a virtual image stored in the processing unit; and a display assembly that displays the fused picture under the control of the processing unit.
It is further preferable that the display assembly includes at least two display screens, and the display directions of the different display screens are different, and the display directions are directions perpendicular to the display surfaces of the display screens.
It is further preferable that the display assembly includes five display screens, five display screens are connected in pairs to form a cavity, the display directions of two adjacent display screens are mutually perpendicular, and the display surface of the display screen deviates from the center of the cavity.
It is further preferable that each display screen is square, the cavity is a cube, and five faces of the cube each correspond to one display screen.
It is further preferred that the image acquisition assembly comprises at least two acquisition units, the acquisition directions of different acquisition units differing, where the acquisition direction is the direction from an acquisition unit toward the center of the image it acquires.
It is further preferred that the five display screens are divided into a top display screen and four side display screens, the top display screen being connected with the four side display screens; the image acquisition assembly comprises eight acquisition units, with one acquisition unit arranged at the center of the edge of each side display screen adjacent to the top display screen, and one acquisition unit arranged at each corner formed between the side display screens and the top display screen.
It is further preferred that the field angle of each of said acquisition units is larger than 45 °.
The technical solution adopted to solve this technical problem is also a display method of an augmented reality display, based on the above augmented reality display device, the method comprising: the image acquisition assembly acquires an environment image of the surroundings and a viewing position and transmits the data of both to the processing unit, the viewing position being the position of the user's eyes; the processing unit fuses the environment image and the virtual image based on the viewing position to form a fused display picture; and the display assembly displays the fused picture under the control of the processing unit.
It is further preferred that the capturing, by the image capturing component, the environmental image and the viewing position of the surrounding environment, and transmitting the data of the environmental image and the viewing position to the processing unit includes: determining a viewing position; and acquiring the environment image according to the watching position.
It is further preferable that the display assembly displaying the fused picture under the control of the processing unit includes: determining, according to the viewing position, the fused picture displayed by each display screen of the display assembly.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification, illustrate the invention and together with the description serve to explain, without limitation, the invention. In the drawings:
fig. 1 is a schematic structural diagram of an augmented reality display device according to an embodiment of the present invention;
fig. 2 is a schematic top view of an augmented reality display device according to an embodiment of the invention;
FIG. 3 is a schematic block diagram of an augmented reality display device according to an embodiment of the invention;
FIG. 4 is a schematic block diagram of an augmented reality display device according to an embodiment of the invention;
FIG. 5 is a flow chart of a display method of an augmented reality display according to an embodiment of the invention;
wherein the reference numerals are: 1. processing unit; 2. display screen; 21. top display screen; 22. side display screen; 3. acquisition unit; 4. support plane; 5. bottom plate; 6. environment background; 7. texture candidate region; 8. dead zone.
Detailed Description
The present invention will be described in further detail below with reference to the drawings and specific embodiments, so that those skilled in the art may better understand the technical solution of the invention.
The invention will be described in more detail below with reference to the accompanying drawings. Like elements are denoted by like reference numerals throughout the various figures. For clarity, the various features of the drawings are not drawn to scale. Furthermore, some well-known portions may not be shown in the drawings.
Numerous specific details of the invention, such as construction, materials, dimensions, processing techniques and technologies, may be set forth in the following description in order to provide a thorough understanding of the invention. However, as will be understood by those skilled in the art, the present invention may be practiced without these specific details.
Example 1:
as shown in fig. 1 to 5, the present embodiment provides an augmented reality display device including: the image acquisition component, the display component and the processing unit 1, the processing unit 1 is respectively and electrically connected with the image acquisition component and the display component,
the image acquisition component is used for acquiring an environment image of the surrounding environment and a watching position, and the watching position is the position of eyes of a user;
the processing unit 1 is used for forming a fusion picture according to the environment image, the viewing position and the virtual image stored in the processing unit 1;
the display component displays the fusion screen in response to the control of the processing unit 1.
The image acquisition assembly can acquire an environment image around the augmented reality display device; that is, it can capture images of the environment in which the device is located. Specifically, the acquisition angles of the image acquisition assembly can cover the periphery and the top of the device. Meanwhile, the image acquisition assembly can detect the position of the user and accurately locate the position of the user's eyes.
The processing unit 1 stores virtual images in advance. When the processing unit 1 receives the environment image and the viewing position from the image acquisition assembly, it fuses the environment image with the virtual image to form a fused picture and transmits that picture to the display assembly for display. The specific viewing angle of the fused picture is related to the viewing position, i.e., to the position of the user. It should be noted that the fused picture seen by the user may be a three-dimensional picture in which the virtual image is embedded; that is, the augmented reality display device forms a virtual-real fused picture.
The augmented reality display device of this embodiment forms a virtual-real fused picture by combining the environment image and the virtual image. Because the device can determine the user's viewing position, the relative position between the device and the user may change; the fused picture changes accordingly, so the user sees a suitable three-dimensional picture regardless of where they stand relative to the device. Compared with existing augmented reality display devices (worn as glasses), the device of this embodiment is more flexible and can meet the needs of a variety of users.
Example 2:
As shown in fig. 1 to 5, this embodiment provides an augmented reality display device whose basic structure is the same as that of Example 1: an image acquisition assembly, a display assembly, and a processing unit 1 electrically connected to both. The descriptions of these components given in Example 1 apply here and are not repeated.
Preferably, the display assembly comprises at least two display screens 2, wherein the display directions of different display screens 2 are different, and the display directions are directions perpendicular to the display surfaces of the display screens 2.
Wherein, when the display assembly has a plurality of display screens 2 with different display directions, the user can view the augmented reality display device from different positions and still see a suitable three-dimensional picture regardless of the relative position of the user and the device.
Preferably, the display assembly comprises five display screens 2, the five display screens 2 are connected in pairs to form a cavity, the display directions of two adjacent display screens 2 are mutually perpendicular, and the display surface of the display screen 2 deviates from the center of the cavity.
The five display screens 2 can be divided into five directions, namely front, rear, left, right and upper directions, of the augmented reality display device, so that the fusion picture can be displayed more comprehensively in all directions, and the requirements of users at different viewing positions can be met to a greater extent.
Specifically, each display screen 2 is square, the cavity is a cube, and five faces of the cube each correspond to one display screen 2.
In this case, the five display screens 2 form five faces of the cube.
It should be noted that the augmented reality display device further includes a bottom plate 5 with the same shape as a display screen 2, such that the five display screens 2 and the bottom plate 5 enclose a cube; the bottom plate 5 is the surface of the device that contacts the support plane 4 (e.g., a desktop). The bottom plate 5 may or may not have a display function, according to the actual circumstances.
Preferably, the image acquisition assembly comprises at least two acquisition units 3, the acquisition directions of different acquisition units 3 differing, where the acquisition direction is the direction from an acquisition unit 3 toward the center of the image it can acquire.
Wherein, in order to be able to collect the environmental image around the augmented reality display device comprehensively, the image collection assembly may comprise a plurality of collection units 3, and the collection direction of each collection unit 3 is different.
Preferably, as shown in fig. 1, the five display screens 2 are divided into a top display screen 21 and four side display screens 22, the top display screen 21 being connected with the four side display screens 22. The image acquisition assembly comprises eight acquisition units 3: one acquisition unit 3 is arranged at the center of the edge of each side display screen 22 adjacent to the top display screen 21, and one acquisition unit 3 is arranged at each corner formed between the side display screens 22 and the top display screen 21.
The acquisition unit 3 may be a camera, or other suitable acquisition structure.
Specifically, as shown in fig. 1, one acquisition unit 3 is arranged at the middle of the upper edge of each of the front, rear, left and right display screens 2 of the augmented reality display device. These four acquisition units 3 may be arranged in a hole-punch manner, using under-screen camera technology, or in another manner. The acquisition direction (optical-center direction) of each of these four units is perpendicular to the display surface of its corresponding display screen 2; they may be called in-screen acquisition units.
One acquisition unit 3 is also arranged at each of the four corners of the top display screen 21, with the acquisition direction of each unit perpendicular to a diagonal of the cube's top surface; these four units are called corner acquisition units. The corner acquisition units 3 may be arranged by hole-punching, in which case holes must be made in the two adjacent side display screens 22 but not in the top display screen 21; other suitable arrangements may also be used.
If an acquisition unit 3 is arranged by hole-punching, transparent glass or another material can fill the irregular opening in the display screen 2 so that the appearance of the device remains complete and attractive. If an acquisition unit 3 uses under-screen camera technology, then after the unit is started, the region of the display screen 2 covering the unit must be rendered transparent so that the unit can capture images of the device's surroundings.
Preferably, the field angle of each acquisition unit 3 is greater than 45 °.
Fig. 2 is a top view of the augmented reality display device. The field of view (FOV) of each acquisition unit 3 is the same; for the acquisition ranges of the eight acquisition units 3 to cover a full 360 degrees, the horizontal field of view of each unit must be greater than 45 degrees (eight units at exactly 45 degrees each would just close the circle).
Further, the size of the field of view of the acquisition units 3 affects the range over which the device can detect the viewing position. For example, as shown in fig. 1, if the vertical field of view of an acquisition unit 3 (angle b in fig. 1) is small, the unit cannot capture the region directly below it, leaving a dead zone 8 (shown in fig. 1) below the unit; the larger the vertical field of view, the smaller this dead zone. Similarly, as shown in fig. 2, a dead zone 8 exists between two horizontally adjacent acquisition units 3, and the larger the horizontal field of view (angle a in fig. 2), the smaller that dead zone. However, if the field of view is too large, the captured image is distorted. For these reasons, a field of view of 60 to 90 degrees is preferred for the acquisition units 3.
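The coverage argument above can be checked with a short calculation. This is an illustrative sketch only: the 45-degree threshold and the 60-to-90-degree preference come from the text, while the dead-zone model (two adjacent cameras on a common baseline with parallel optical axes) is a simplifying assumption.

```python
import math

def min_horizontal_fov(num_cameras: int, coverage_deg: float = 360.0) -> float:
    """Minimum horizontal FOV per camera for evenly spaced cameras to
    jointly cover coverage_deg with no gap (ignoring overlap margins)."""
    return coverage_deg / num_cameras

def blind_zone_depth(baseline_m: float, fov_deg: float) -> float:
    """Depth of the dead zone between two adjacent cameras whose optical
    axes are parallel: points nearer than where the inner FOV edges cross
    are seen by neither camera."""
    half = math.radians(fov_deg / 2.0)
    return (baseline_m / 2.0) / math.tan(half)

print(min_horizontal_fov(8))  # 45.0, so each unit needs an FOV above 45 degrees
# A wider FOV shrinks the dead zone, matching the reasoning in the text:
print(blind_zone_depth(0.2, 60.0) > blind_zone_depth(0.2, 90.0))  # True
```

The same relation explains the vertical dead zone 8 below each unit: enlarging angle b in fig. 1 shrinks it, at the cost of more lens distortion.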
Furthermore, as shown in fig. 1, the processing unit 1 may be located in a cube and fixed to the base plate 5.
The processing unit 1 is thus located inside the augmented reality display device. It is the data-processing center of the whole device, comprising both software and hardware, and is responsible for image processing for the acquisition units 3, display control of the display screens 2, image data fusion, rendering, device attitude monitoring, and similar functions, including:
a. controlling the opening and closing of the eight cameras and receiving the captured images;
b. controlling the display content of the five display screens 2;
c. image data fusion: fusing the environment image captured by the cameras with the virtual image and outputting a display image;
d. simultaneous localization and mapping (SLAM): constructing a 360-degree full-scene map centered on the device through joint detection by the eight cameras;
e. user eye position calculation: each camera performs face detection and calculates the gaze point of the human eyes on the current display screen 2;
f. rendering: calculating the superposition of the captured environment image and the virtual image on each of the five display screens 2, rendering the final effect by three-dimensional scene reconstruction, and outputting the fused pictures for the five screens;
g. device attitude detection: a 9-axis attitude sensor is fixed inside the processing unit 1; when the device is placed horizontally (i.e., when the bottom plate 5 is horizontal), the attitude sensor is also horizontal;
h. wifi/Bluetooth: the processing unit 1 contains a built-in wifi/Bluetooth module that can connect to a wireless network or to other control devices;
i. mobile phone interconnection: the device can connect to a smartphone via wifi or Bluetooth, and control software on the phone can control the device's various functions.
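Functions a through f can be tied together in a per-frame control loop. The following is a hypothetical sketch only; every function name and data structure here is a placeholder illustrating the data flow, not the patent's implementation:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    camera_id: int
    pixels: list  # placeholder for image data

def stitch(frames: List[Frame]) -> list:
    """Placeholder for function d: merge the 8 camera frames into a 360-degree scene."""
    return [f.pixels for f in frames]

def detect_eye_position(frames: List[Frame]):
    """Placeholder for function e: face detection + PnP yields H(x, y, z)."""
    return (0.0, 0.0, 1.0)  # assumed viewer 1 m in front of the device

def render_per_screen(scene, virtual_image, eye_pos, num_screens: int = 5):
    """Placeholder for function f: one fused picture per display screen."""
    return [{"screen": i, "eye": eye_pos} for i in range(num_screens)]

def process_frame(frames: List[Frame], virtual_image):
    scene = stitch(frames)             # d. full-scene map from joint detection
    eye = detect_eye_position(frames)  # e. user eye position calculation
    return render_per_screen(scene, virtual_image, eye)  # c + f. fusion and rendering

frames = [Frame(camera_id=i, pixels=[]) for i in range(1, 9)]  # a. eight cameras
pictures = process_frame(frames, virtual_image=None)           # b. five screens
print(len(pictures))  # 5: one fused picture per display screen
```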
Example 3:
as shown in fig. 1 to 5, this embodiment provides a display method of an augmented reality display, based on the augmented reality display device described above. The method includes:
s11, the image acquisition component acquires an environment image and a viewing position of the surrounding environment, and sends data of the environment image and the viewing position to the processing unit 1, wherein the viewing position is the position of eyes of a user.
Specifically, capturing the environmental image and the viewing position of the surrounding environment by the image capturing assembly, and transmitting the data of the environmental image and the viewing position to the processing unit 1 includes:
s111, determining the watching position.
In other words, three-dimensional coordinate detection is performed on the user's eyes. As shown in fig. 3, only one eye is drawn for simplicity of description; it represents the position of the user's eyes. Specifically, the in-screen camera of the display screen 2 facing the user is started, the face is detected and its position obtained, and a three-dimensional coordinate of the head relative to the camera is then obtained by solving a PnP (Perspective-n-Point) problem; this coordinate is recorded as H(x, y, z). Since very mature algorithms already exist for face detection and PnP head-pose solving, such as the open-source algorithms in OpenCV, the solving process is not described in detail here.
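The patent defers to mature face-detection and PnP solvers such as those in OpenCV; the underlying geometric idea of recovering H(x, y, z) from a 2-D detection can be shown with a bare pinhole-camera sketch. The focal length and real face width below are illustrative assumptions, not values from the patent:

```python
def head_position_3d(face_cx_px, face_cy_px, face_w_px,
                     img_w=1280, img_h=720,
                     focal_px=1000.0, real_face_w_m=0.16):
    """Estimate H(x, y, z) of a detected face relative to the camera.
    Pinhole model: depth z = f * W_real / w_pixels, then back-project
    the detection center through the camera's principal point."""
    z = focal_px * real_face_w_m / face_w_px
    x = (face_cx_px - img_w / 2) * z / focal_px
    y = (face_cy_px - img_h / 2) * z / focal_px
    return (x, y, z)

# A face 160 px wide, centered in the image, sits about 1 m away on the axis.
H = head_position_3d(face_cx_px=640, face_cy_px=360, face_w_px=160)
print(H)  # approximately (0.0, 0.0, 1.0)
```

A full PnP solve additionally recovers head orientation from several facial landmarks; the depth-from-size step above is only the simplest special case.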
S112, acquiring the environment image according to the viewing position.
First, calibration is performed. For each pair of adjacent acquisition units 3 (cameras), the rotation and offset matrices are calculated using a binocular (stereo) camera calibration algorithm. Calibrating each adjacent pair of the eight cameras in turn yields eight sets of calibration data, from which the relative positions of the images of any two cameras can be calculated.
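Given the eight pairwise calibrations, the transform between any two cameras follows by composing the pairwise rotations and offsets. A minimal sketch, reduced to 2-D rotations and assuming the idealized case where each adjacent pair is related by exactly 45 degrees (real calibration data would also carry a translation component):

```python
import math

def rot(theta):
    """2-D rotation matrix for angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Idealized pairwise calibration result between camera i and camera i+1.
pair_R = rot(math.radians(45))

def relative_rotation(i, j):
    """Compose the pairwise rotations to get camera i -> camera j (i < j)."""
    R = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(j - i):
        R = matmul(pair_R, R)
    return R

R_1_to_5 = relative_rotation(1, 5)  # four 45-degree steps = 180 degrees
print(round(R_1_to_5[0][0]))  # -1, i.e. cos(180 deg): cameras 1 and 5 face opposite ways
```

Chaining all eight pairs returns to the identity (a full 360 degrees), which is a useful consistency check on real calibration data.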
Second, image stitching is performed. By fusing the images of the eight calibrated cameras, a 360-degree full-scene image can be generated. Specifically, after the eight cameras are turned on, they are numbered as shown in fig. 2. Cameras 1 and 2 start the SLAM function and search for key points in their images; since the shooting contents of the two cameras overlap, some of the key points are the same. Based on these shared key points, an image stitching algorithm merges the images of the two cameras into one. In the same way, the images of cameras 2 and 3, 3 and 4, 4 and 5, 5 and 6, 6 and 7, 7 and 8, and 8 and 1 are stitched in turn, finally forming a 360-degree full-scene image.
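The pairwise stitching step can be illustrated with a deliberately simplified sketch. Real implementations match 2-D feature descriptors and warp images by a homography; this toy version uses 1-D sequences of key points purely to show the overlap-and-merge idea:

```python
def stitch_pair(left, right, min_overlap=3):
    """Merge two key-point sequences whose suffix/prefix overlap, preferring
    the longest overlap, as two adjacent cameras share part of their view."""
    max_k = min(len(left), len(right))
    for k in range(max_k, min_overlap - 1, -1):
        if left[-k:] == right[:k]:
            return left + right[k:]
    raise ValueError("no sufficient key-point overlap found")

cam1 = ["a", "b", "c", "d", "e"]
cam2 = ["c", "d", "e", "f", "g"]  # overlaps cam1 on c, d, e
panorama = stitch_pair(cam1, cam2)
print(panorama)  # ['a', 'b', 'c', 'd', 'e', 'f', 'g']
```

Repeating the merge around the ring of eight cameras yields the full-scene sequence, just as the text chains camera pairs 1-2 through 8-1.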
S12, the processing unit 1 fuses the environment image and the virtual image to form a fused display picture.
The processing unit 1 places the virtual image at the corresponding position in the environment image (three-dimensional space) at its real size, obtains a virtual-real fused display picture through rendering, and has the display assembly display it, so that the user sees the virtual object fused into the surrounding real environment. Because the fused picture is related to the viewing position, the processing unit 1 tracks the human eyes in real time and adjusts the image on each display screen 2 so that the imaging of the virtual image changes accordingly, achieving a unified virtual-real effect.
In particular, to present a virtual small object on the augmented reality display device, the plane of the device's horizontal position is calculated and the virtual object is displayed on that plane. Since the display screen 2 can achieve a transparent display effect, i.e., the screen displays the surrounding environment image and that image fuses with the real scene behind it, the imaging effect of the virtual object is similar to placing a real object at the current horizontal position. The device of this embodiment can therefore be applied to science and technology exhibitions, museum article display, children's object recognition and other fields, displaying virtual articles without real objects.
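At the pixel level, fusing the virtual object into the environment image is a compositing operation. A minimal grayscale alpha-blend sketch (the patent's pipeline involves full three-dimensional scene reconstruction; this only shows the per-pixel fusion idea of function c):

```python
def alpha_blend(env_px, virt_px, alpha):
    """Composite one virtual pixel over one environment pixel.
    alpha = 1.0 is fully virtual; alpha = 0.0 passes the environment through,
    which is what produces the transparent-display effect."""
    return alpha * virt_px + (1.0 - alpha) * env_px

env_row  = [100, 100, 100, 100]  # environment image row (pass-through background)
virt_row = [  0, 255, 255,   0]  # virtual object covers the middle pixels
mask     = [0.0, 1.0, 1.0, 0.0]  # the object's alpha mask

fused = [alpha_blend(e, v, a) for e, v, a in zip(env_row, virt_row, mask)]
print(fused)  # [100.0, 255.0, 255.0, 100.0]
```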
S13, the display component responds to the control of the processing unit 1 to display the fusion picture.
Specifically, the display assembly displaying the fused picture under the control of the processing unit 1 includes:
S131, determining the fused picture displayed by each display screen 2 according to the viewing position.
Specifically, as shown in fig. 3, the range of the display screen 2 covered by the user's line of sight from the viewing position (point H) is the portion between the two dotted arrows drawn from point H to the display screen 2. The part of the scene behind the augmented reality display device that the user can see is indicated by the dotted lines extending from the display screen 2 to the background. That is, the image displayed by the display screen 2 must blend with the environmental background 6, so that looking at the display screen 2 is like looking at the real environment; this achieves the transparent display effect of the augmented reality display device, i.e. realizes the full-scene image function. During full-scene display, the position of the display screen 2 within the full-scene image must be calculated, and the distorted scene image is anti-distortion processed using the calibrated data to obtain a two-dimensional camera image. This image is presented on the display screen 2 so that the user at point H sees the full-scene image.
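The anti-distortion step can be sketched with a one-parameter radial model on normalized image coordinates. This is a hedged simplification: the coefficient `k1` is a hypothetical calibration value, and production code would apply the full calibrated camera model (for example OpenCV's `cv2.undistort`). The inverse of the distortion has no closed form here, so it is recovered by fixed-point iteration.

```python
# Hedged sketch of anti-distortion: invert the forward radial model
#   x_d = x_u * (1 + k1 * r_u^2)
# by fixed-point iteration, starting from the distorted coordinates.

def undistort(xd, yd, k1, iters=10):
    """Iteratively invert a one-parameter radial distortion model."""
    xu, yu = xd, yd                      # initial guess: no distortion
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        xu = xd / (1.0 + k1 * r2)
        yu = yd / (1.0 + k1 * r2)
    return xu, yu

x, y = undistort(0.5, 0.0, k1=0.1)
# Re-applying the forward model should reproduce the distorted input.
r2 = x * x + y * y
roundtrip = x * (1.0 + 0.1 * r2)
```

The round-trip check is the usual sanity test for such an inversion: distorting the undistorted point must land back on the measured pixel.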
The transparency here does not mean that the display screen 2 is a physically transparent display; rather, the image displayed on the display screen 2 makes it look transparent to the user.
When the user's position changes, the range of the viewing angle changes with the position of the eyes, so the content on the display screen 2 must be adjusted synchronously to keep the eyes and the displayed picture consistent. Likewise, the remaining display screens 2 are imaged with the same algorithm, so that a transparent-looking image is seen in front of any display screen 2; that is, the transparent display effect of the augmented reality display device is achieved.
Further, unlike the four side display screens 2, the top display screen 21 cannot acquire a surrounding environment image, because it faces upward. To give the top display screen 21 a transparent display effect as well, an image of the support plane 4 on which the augmented reality display device is placed is obtained from the camera images with the help of the feature points of the SLAM scan, and a texture candidate region 7 is selected. As shown in fig. 4, the texture candidate region 7 is a planar region surrounding the base plate 5 and is part of the full scene photographed by the cameras. Based on the image of the texture candidate region 7, an image-completion (inpainting) algorithm, such as an open-source algorithm in OpenCV, fills an area the size of the top display screen 21. When the top display screen 21 displays the computed image, a user looking at it sees the image of the support plane 4, achieving the transparent effect of the top display screen 21.
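The completion step can be illustrated with a deliberately simple stand-in: because the text states that the texture candidate region 7 is planar, tiling its texture over the top-screen area is a reasonable approximation. A real system would use a proper inpainting routine (for example OpenCV's `cv2.inpaint`); the grey-level values below are hypothetical.

```python
# Simplified stand-in for the image-completion step: fill the top-screen
# area by tiling the planar texture sampled from candidate region 7.

def fill_from_texture(texture, out_h, out_w):
    """Tile a small texture patch to cover an out_h x out_w area."""
    th = len(texture)
    tw = len(texture[0])
    return [[texture[r % th][c % tw] for c in range(out_w)]
            for r in range(out_h)]

patch = [[10, 20],
         [30, 40]]                 # 2x2 sample of the support-plane texture
top_image = fill_from_texture(patch, 4, 4)   # sized to the top display screen
```

Tiling works only because the sampled region is flat and roughly uniform; for textured or structured support planes the inpainting approach named above is needed.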
In addition, a 9-axis attitude sensor is built into the processing unit 1 of the augmented reality display device, and its attitude is detected in real time. For example, if the recorded original state of the virtual image is State1 and the attitude transformation matrix reported by the attitude sensor is M, then the state of the virtual image after the position change is State2 = M × State1. The attitude of the virtual image thus changes with the attitude of the augmented reality display device, avoiding a mismatch between the virtual image and the environment image after the device moves.
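The update State2 = M × State1 is a plain matrix product. The sketch below shows it with pure-Python 3×3 matrices, using a rotation about the vertical axis as a stand-in for whatever attitude change the 9-axis sensor reports; the actual pose representation used by the device is not specified in the text.

```python
# Sketch of the pose update described above: State2 = M * State1, where M is
# the rotation reported by the attitude sensor.
import math

def matmul3(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rot_z(theta):
    """Rotation about the vertical axis (a pure yaw change)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

state1 = [[1.0, 0.0, 0.0],
          [0.0, 1.0, 0.0],
          [0.0, 0.0, 1.0]]        # original orientation of the virtual image
m = rot_z(math.pi / 2)            # device rotated 90 degrees about z
state2 = matmul3(m, state1)       # virtual image follows the device pose
```

Applying M on every sensor update keeps the virtual object locked to the environment as the device is turned.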
It should be noted that the processing unit 1 of the augmented reality display device integrates a Wi-Fi/Bluetooth module and can be connected to a mobile phone. Correspondingly, software for controlling the augmented reality display device can be developed on the mobile phone, including: switching the augmented reality display device on/off; turning the cameras on/off and starting/stopping the full-scene scanning function; turning the attitude sensor on/off; selecting, scaling and rotating virtual objects; enabling/disabling eye gaze calculation; monitoring the battery level; and so on.
Specifically, the display device may be any product or component with a display function, such as a liquid crystal display panel, an Organic Light Emitting Diode (OLED) display panel, electronic paper, a mobile phone, a tablet computer, a television, a display, a notebook computer, a digital photo frame, and a navigator.
It should be noted that in this document relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The embodiments described above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to best utilize the invention with various modifications as are suited to the particular use contemplated. The invention is limited only by the claims and their full scope and equivalents.

Claims (5)

1. A display method of an augmented reality display device, the method comprising:
the image acquisition component acquires an environment image and a viewing position of a surrounding environment and transmits data of the environment image and the viewing position to the processing unit, wherein the viewing position is the position of eyes of a user;
the processing unit fuses the environment image and the virtual image based on the viewing position to form a fused picture;
a display component displays the fused picture in response to the control of the processing unit; wherein:
the display assembly comprises five display screens, wherein the five display screens are divided into a top display screen and four side display screens;
the specific method for the display component to display the fusion picture in response to the control of the processing unit comprises the following steps:
determining the fused picture displayed by each display screen according to the viewing position; wherein:
the image displayed by the display screen is fused with the environment background;
the four side display screens adopt the same algorithm for imaging when displaying;
the top display screen obtains, through acquisition, an image of the supporting plane on which the augmented reality display device is placed; based on the image of the texture candidate region, the image displayed by the top display screen is fused with the environmental background using an image-completion algorithm in image processing; wherein:
the texture candidate region is a region surrounding the base plate.
2. The display method of the augmented reality display device of claim 1, wherein the image acquisition component acquiring an environment image and a viewing position of the surrounding environment and transmitting data of the environment image and the viewing position to the processing unit comprises:
determining a viewing position;
acquiring the environment image according to the viewing position.
3. The display method of the augmented reality display device of claim 2, wherein the display component displaying the fused picture in response to the processing unit comprises:
and determining the fusion picture displayed by each display screen of the display assembly according to the viewing position.
4. An augmented reality display device, characterized in that the display device is configured to perform the display method of any one of claims 1-3 and comprises an image acquisition component, a display component, and a processing unit electrically connected to the image acquisition component and the display component respectively, wherein
the image acquisition component is used for acquiring an environment image of the surrounding environment and a watching position, and the watching position is the position of eyes of a user;
the processing unit is used for forming a fusion picture according to the environment image, the viewing position and the virtual image stored by the processing unit;
the display component responds to the control of the processing unit to display the fusion picture;
the display assembly comprises five display screens, which are connected in pairs to form a cavity; the display directions of two adjacent display screens are mutually perpendicular, and the display surface of each display screen faces away from the center of the cavity;
each display screen is square, the cavity is square, and five faces of the square correspond to one display screen respectively;
the five display screens are divided into a top display screen and four side display screens, and the top display screen is connected with the four side display screens;
the image acquisition assembly comprises eight acquisition units; one acquisition unit is arranged at the center of the edge of each side display screen adjacent to the top display screen, and one acquisition unit is arranged at each vertex angle formed between the side display screens and the top display screen.
5. The augmented reality display device of claim 4, wherein the field angle of each acquisition unit is greater than 45 °.
CN202011458500.1A 2020-12-11 2020-12-11 Augmented reality display device and display method thereof Active CN112601067B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011458500.1A CN112601067B (en) 2020-12-11 2020-12-11 Augmented reality display device and display method thereof

Publications (2)

Publication Number Publication Date
CN112601067A CN112601067A (en) 2021-04-02
CN112601067B true CN112601067B (en) 2023-08-15

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663558A (en) * 2022-03-14 2022-06-24 南京青臣创意数字科技有限公司 Construction method and device for realizing double-sided naked eye 3D animation effect, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201174206Y (en) * 2008-03-21 2008-12-31 朱一帆 Three-dimensional stereo display device
CN205862299U (en) * 2016-06-15 2017-01-04 苏州创捷传媒展览股份有限公司 Virtual reality interactive experience device
CN106997618A (en) * 2017-04-14 2017-08-01 陈柳华 A kind of method that virtual reality is merged with real scene
CN111050145A (en) * 2018-10-11 2020-04-21 上海云绅智能科技有限公司 Multi-screen fusion imaging method, intelligent device and system
CN111881861A (en) * 2020-07-31 2020-11-03 北京市商汤科技开发有限公司 Display method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019105678A (en) * 2017-12-11 2019-06-27 京セラドキュメントソリューションズ株式会社 Display device and method to display images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant