CN104918036B - Augmented reality display device and method - Google Patents
- Publication number: CN104918036B
- Application number: CN201410090169.0A
- Authority: CN (China)
- Prior art keywords: eye, depth of field, display screen, camera, digital image
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classification: Processing Or Creating Images
Abstract
An augmented reality display device and method are disclosed. The augmented reality display device includes: a transparent display screen for displaying a digital image; a focus-point determining component for obtaining the position of the eye focus point; and a display-image generating component for generating, based on the obtained position of the eye focus point, the digital image to be displayed on the transparent display screen. By determining the position of the virtual image of the digital image according to the position of the eye focus point, the realism of the fusion between the displayed digital image and the real scene is enhanced.
Description
Technical field
The present invention relates to the field of augmented reality, and more particularly to an augmented reality display device and method.
Background art
Augmented reality, as a technology that combines the virtual with the real, has attracted growing attention. Augmented reality can be implemented on augmented reality glasses: when a user wears such glasses and views a real scene through them, digital images are displayed by the glasses so that the user experiences, on top of the real scene, the virtual effects produced by the digital images. However, the realism of the fusion between the currently displayed digital images and the real scene is poor.
Therefore, an augmented reality display device is needed that can seamlessly combine digital images with the real scene, so that the user truly feels immersed in the result.
Summary of the invention
To solve the above technical problem, the present invention provides an augmented reality display device and method that determine the position of the virtual image of the digital image according to the position of the eye focus point, thereby enhancing the realism of the fusion between the displayed digital image and the real scene.
According to one aspect of the invention, an augmented reality display device is provided, comprising: a transparent display screen for displaying a digital image; a focus-point determining component for obtaining the position of the eye focus point; and a display-image generating component for generating, based on the obtained position of the eye focus point, the digital image to be displayed on the transparent display screen.
Preferably, the augmented reality display device further includes a depth-of-field determining component for obtaining the desired position of the virtual image of the digital image, wherein the display-image generating component generates the digital image to be displayed on the transparent display screen based on the obtained position of the eye focus point and the desired position of the virtual image.
Preferably, the virtual image of the digital image is located at the eye focus point.
Preferably, the transparent display screen includes a left transparent display screen for displaying an image to be viewed by the left eye and a right transparent display screen for displaying an image to be viewed by the right eye; and the display-image generating component generates a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen, respectively.
Preferably, the focus-point determining component includes cameras for tracking the movement of at least one eyeball, the cameras including a left camera and a right camera for tracking left-eye movement and right-eye movement, respectively, the left camera and the right camera being arranged symmetrically about the center between the left eye and the right eye.
Preferably, the depth-of-field determining component includes a left depth camera and a right depth camera arranged symmetrically about the center between the left eye and the right eye, and the depth-of-field determining component determines the depth of field at the eye focus point from the distance between the left depth camera and the right depth camera and from the images captured by the left depth camera and the right depth camera, respectively.
Preferably, the augmented reality display device is a pair of augmented reality glasses, the left transparent display screen and the right transparent display screen are implemented as the left lens and the right lens, respectively, the left camera and the left depth camera are mounted near the left lens of the augmented reality glasses, and the right camera and the right depth camera are mounted near the right lens of the augmented reality glasses.
According to another aspect of the invention, an augmented reality display method is provided, comprising: obtaining the position of the eye focus point; generating, based on the obtained position of the eye focus point, the digital image to be displayed on the transparent display screen; and displaying the digital image on the transparent display screen.
Preferably, the augmented reality display method further includes obtaining the desired position of the virtual image of the digital image, wherein the digital image to be displayed on the transparent display screen is generated based on the obtained position of the eye focus point and the desired position of the virtual image.
Preferably, the transparent display screen includes a left transparent display screen and a right transparent display screen, wherein generating the digital image to be displayed on the transparent display screen based on the obtained position of the eye focus point includes generating a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen; and wherein displaying the digital image on the transparent display screen includes displaying the left-eye digital image on the left transparent display screen and displaying the right-eye digital image on the right transparent display screen.
Preferably, a left-eyeball image and a right-eyeball image are captured by a left camera and a right camera arranged symmetrically about the center between the left eye and the right eye, in order to obtain the position of the eye focus point.
Preferably, a left-eye image and a right-eye image are captured by a left depth camera and a right depth camera arranged symmetrically about the center between the left eye and the right eye, in order to obtain the desired position of the virtual image of the digital image.
With the augmented reality display device and display method according to the present invention, the digital image is generated according to the position of the eye focus point, so that the fusion of the virtual image of the digital image with the real scene is more realistic, enhancing the user experience.
Other features and advantages of the present invention will be set forth in the following description and will in part become apparent from it, or may be learned by practice of the invention. The objectives and other advantages of the invention may be realized and obtained by the structure particularly pointed out in the specification, claims, and accompanying drawings.
Brief description of the drawings
The accompanying drawings are provided for further understanding of the present invention and constitute part of the specification; together with the embodiments of the invention they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1 illustrates a schematic block diagram of an augmented reality display device according to a first embodiment of the present invention;
Fig. 2 illustrates a schematic block diagram of an augmented reality display device according to a second embodiment of the present invention;
Fig. 3 illustrates a schematic diagram of augmented reality glasses according to a second embodiment of the present invention;
Fig. 4A illustrates a schematic diagram of eye-focus-point determination according to a first embodiment of the present invention;
Figs. 4B and 4C illustrate schematic diagrams of the display effect according to a first embodiment of the present invention;
Fig. 5A illustrates a schematic diagram of eye-focus-point determination according to a second embodiment of the present invention;
Figs. 5B-5E illustrate schematic diagrams of the display effect according to a second embodiment of the present invention;
Fig. 6 illustrates a flowchart of an augmented reality display method according to a third embodiment of the present invention; and
Fig. 7 illustrates a flowchart of an augmented reality display method according to a fourth embodiment of the present invention.
Detailed description of embodiments
Embodiments of the present invention will be described in detail with reference to the accompanying drawings. Note that in the drawings, components having substantially the same or similar structures and functions are given the same reference numerals, and repeated description thereof is omitted.
First embodiment
As shown in Fig. 1, an augmented reality display device 100 according to a first embodiment of the present invention includes a transparent display screen 110, a focus-point determining component 120, and a display-image generating component 130.
The transparent display screen 110 receives a digital image from the display-image generating component 130 and displays the received digital image. The specific implementation of the transparent display screen 110 is well known to those skilled in the art and is not repeated here. It should be understood that the transparent display screen 110 in the augmented reality display device 100 according to an embodiment of the present invention may adopt any existing or future transparent display technology, as long as it achieves the technical effect of transparent display; therefore no restriction is placed here on its specific implementation.
The focus-point determining component 120 is configured to obtain the position of the user's eye focus point, the position including the two-dimensional planar position of the eye focus point in the field of view and the distance of the eye focus point from the transparent display screen 110.
The display-image generating component 130 generates the digital image to be displayed on the transparent display screen 110 based on the obtained position of the eye focus point (specifically, for example, the two-dimensional planar position of the eye focus point and its distance from the transparent display screen 110).
The focus-point determining component 120 may include at least one camera for tracking the movement of at least one eyeball.
As an example, the focus-point determining component 120 may include a left camera and a right camera, the left camera tracking the movement of the left eyeball and capturing images of the left eyeball, and the right camera tracking the movement of the right eyeball and capturing images of the right eyeball. Advantageously, the left camera and the right camera are arranged symmetrically about the center between the left eye and the right eye. For example, they may be arranged symmetrically above the left eye and the right eye, symmetrically below the left eye and the right eye, or symmetrically outside the left eye and outside the right eye.
In addition, the focus-point determining component 120 may also include a focus-point calculating unit that calculates the eye focus point of the user's left and right eyes based on the left-eyeball image captured by the left camera and the right-eyeball image captured by the right camera, on the distance between the left camera and the right camera, and on the distance of the left camera from the left eye and of the right camera from the right eye.
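The calculation performed by such a focus-point calculating unit can be illustrated with a small sketch. The patent does not give a formula, so the following assumes that a gaze ray (origin and direction) has already been recovered for each eye from the eyeball images; the focus point is then estimated as the midpoint of the shortest segment between the two rays, since in practice the rays rarely intersect exactly. All names and values are illustrative.

```python
import numpy as np

def focus_point(eye_l, dir_l, eye_r, dir_r):
    """Estimate the 3D eye focus point as the midpoint of the shortest
    segment between the two gaze rays."""
    d1 = dir_l / np.linalg.norm(dir_l)
    d2 = dir_r / np.linalg.norm(dir_r)
    w0 = eye_l - eye_r
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b            # zero only for parallel gaze rays
    t1 = (b * e - c * d) / denom     # parameter along the left ray
    t2 = (a * e - b * d) / denom     # parameter along the right ray
    p1 = eye_l + t1 * d1
    p2 = eye_r + t2 * d2
    return (p1 + p2) / 2.0

# Two eyes 64 mm apart, both looking at a point 1 m straight ahead.
fp = focus_point(np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 1.0]),
                 np.array([0.032, 0.0, 0.0]), np.array([-0.032, 0.0, 1.0]))
```

For converging rays that truly intersect, the midpoint coincides with the intersection point, so `fp` above is the fixated point itself.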
As an example, the position of the eye focus point includes the spatial position of the eye focus point in the real scene, and may specifically include three-dimensional coordinate information. For example, the actual position of the eye focus point may be measured with the top-left, bottom-left, top-right, or bottom-right vertex of the real scene viewed by the user as the coordinate origin, or with the projection of the center of the transparent display screen into the real scene as the coordinate origin. The three-dimensional coordinate information may be coordinates in a Cartesian coordinate system, or coordinates in a polar coordinate system.
As an example, the transparent display screen 110 may include a left transparent display screen and a right transparent display screen. In this case, the display-image generating component 130 generates, based on the obtained two-dimensional planar position of the eye focus point and its distance from the transparent display screen 110, a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen; the left transparent display screen receives the left-eye digital image from the display-image generating component 130 and displays it for viewing by the viewer's left eye, and the right transparent display screen receives the right-eye digital image from the display-image generating component 130 and displays it for viewing by the viewer's right eye.
As an example, the transparent display screen 110 displays the digital image generated by the display-image generating component 130 such that the virtual image of the displayed digital image is located at the eye focus point.
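How a stereo pair can place its virtual image at a chosen 3D point can be sketched with similar-triangles projection. This is not the patent's stated method, only a minimal geometric illustration under assumed values: each lens is modeled as a screen plane a fixed distance in front of the corresponding eye, and the 3D point is projected onto each screen so that the two displayed images fuse at that point.

```python
def screen_position(point, eye_x, screen_dist):
    """Project a 3D point (x, y, z), with z the distance from the eye
    plane, onto a screen screen_dist in front of an eye at (eye_x, 0, 0).
    Returns the (x, y) position on that screen (same global frame)."""
    x, y, z = point
    s = screen_dist / z                      # similar-triangles scale
    return (eye_x + (x - eye_x) * s, y * s)

def stereo_pair(point, ipd=0.064, screen_dist=0.02):
    """On-screen positions for the left and right lenses so that the
    virtual image appears at the given 3D point."""
    left = screen_position(point, -ipd / 2, screen_dist)
    right = screen_position(point, +ipd / 2, screen_dist)
    return left, right

# A point 1 m straight ahead, IPD 64 mm, lenses 2 cm from the eyes.
l, r = stereo_pair((0.0, 0.0, 1.0))
```

The horizontal offset between the two screen positions is the binocular disparity; as the target depth grows, each projected point approaches its own lens center, so the disparity shrinks, which is the cue that makes the fused image appear farther away.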
In the augmented reality display device 100 according to an embodiment of the present invention, the digital image (that is, the virtual image) to be displayed on the transparent display screen 110 is generated according to the position of the user's eye focus point, so that the fusion of the virtual image with the real scene is more realistic.
The operation of the augmented reality display device 100 according to the first embodiment of the present invention is described below using the example of a cup placed in a real scene with water being poured into it.
The augmented reality display device 100 according to the first embodiment of the present invention is implemented as augmented reality glasses. As shown in Fig. 3, the transparent display screen 110 includes a left lens L and a right lens R, and the focus-point determining component 120 includes a left camera L1 and a right camera R1.
In this example, the eyes look at the cup, and the augmented reality display device 100 according to the first embodiment of the present invention can generate a virtual image of water being poured into the cup.
The focus-point determining component 120 determines the position of the user's eye focus point (i.e., its two-dimensional planar position) and the distance of the eye focus point from the transparent display screen 110 according to the images of the left and right eyeballs captured by the left camera L1 and the right camera R1 and the positions of the left camera L1 and the right camera R1 relative to the left eye and the right eye.
Fig. 4A shows a schematic diagram of eye-focus-point determination in this embodiment.
When the eyes look at the cup, the focus-point determining component 120 determines that the eye focus point is at the position of the cup, and the augmented reality display device 100 according to the first embodiment of the present invention can generate a digital image corresponding to the virtual image of water being poured into the cup. Fig. 4B shows the case in which water is poured into the cup while the eyes gaze at it.
In addition, when the eyes move away from the cup, the focus-point determining component 120 determines a new eye focus point, and the augmented reality display device 100 according to the first embodiment of the present invention can generate a digital image corresponding to a virtual image of water being poured outside the cup. For example, the water no longer falls into the cup but flows onto the desktop outside it, and a digital image corresponding to a virtual image of a pool of water forming on the desktop outside the cup can further be generated. Fig. 4C shows the situation in which water flows onto the desktop when the eyes gaze outside the cup.
Second embodiment
First, an augmented reality display device 200 according to a second embodiment of the present invention will be described with reference to Fig. 2.
As shown in Fig. 2, compared with the augmented reality display device 100 shown in Fig. 1, the augmented reality display device 200 according to the second embodiment of the present invention includes, in addition to the transparent display screen 110 and the focus-point determining component 120, a depth-of-field determining component 225 and a display-image generating component 230.
The transparent display screen 110 and the focus-point determining component 120 in the augmented reality display device 200 according to the second embodiment of the present invention are identical to those in the augmented reality display device 100 according to the first embodiment of the present invention, and their description is not repeated here.
The focus-point determining component 120 is configured to obtain the position of the user's eye focus point, including the two-dimensional planar position of the eye focus point in the field of view and the distance of the eye focus point from the transparent display screen 110.
The depth-of-field determining component 225 is configured to obtain the desired position of the virtual image of the digital image, the desired position including a desired depth of field.
The display-image generating component 230 generates the digital image to be displayed on the transparent display screen 110 based on the obtained two-dimensional planar position of the eye focus point, its distance from the transparent display screen 110, and the obtained desired depth of field.
The depth-of-field determining component 225 may include at least one depth camera.
As an example, the depth-of-field determining component 225 may include a left depth camera for capturing a left-eye image (an image of the scene from the left-eye viewpoint) and a right depth camera for capturing a right-eye image. Advantageously, the left depth camera and the right depth camera are arranged symmetrically about the center between the left eye and the right eye; for example, symmetrically above the left eye and the right eye, symmetrically below them, or symmetrically outside the left eye and outside the right eye.
In addition, the depth-of-field determining component 225 may also include a depth-of-field calculating unit that calculates the depth of field of at least a first object in the real scene based on the left-eye image captured by the left depth camera, the right-eye image captured by the right depth camera, and the distance between the left depth camera and the right depth camera.
As an example, the first object is the object in the real scene corresponding to the eye focus point.
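The depth computation from the two scene cameras can be sketched with the standard stereo triangulation relation. The patent does not spell out a formula, so the following assumes rectified cameras and an already-matched image feature, with the focal length in pixels and the baseline (the distance between the two depth cameras) known; all names are illustrative.

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a matched feature in rectified left/right images:
    Z = f * B / d, where d = x_left - x_right is the horizontal
    disparity in pixels, f the focal length in pixels, and B the
    baseline in meters."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("a finite-depth feature must have positive disparity")
    return focal_px * baseline_m / disparity

# Feature at column 420 in the left image and 400 in the right image,
# with an (assumed) 800 px focal length and 6 cm baseline.
z = stereo_depth(420.0, 400.0, 800.0, 0.06)
```

A larger disparity means the matched object is closer; sampling the disparity at the pixel corresponding to the eye focus point yields the depth of field of the "first object" described above.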
As an example, the transparent display screen 110 may include a left transparent display screen and a right transparent display screen. In this case, the display-image generating component 230 generates, based on the obtained position of the eye focus point, its distance from the transparent display screen 110, and the obtained depth of field of the at least first object in the real scene, a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen; the left transparent display screen receives the left-eye digital image from the display-image generating component 230 and displays it for viewing by the viewer's left eye, and the right transparent display screen receives the right-eye digital image from the display-image generating component 230 and displays it for viewing by the viewer's right eye.
As an example, the transparent display screen 110 displays the digital image generated by the display-image generating component 230 such that the virtual image of the displayed digital image is located at the eye focus point or at the at least first object.
In the augmented reality display device 200 according to an embodiment of the present invention, the digital image (that is, the virtual image) to be displayed on the transparent display screen 110 is generated according to the position of the user's eye focus point and the depth of field of the at least first object, so that the fusion of the virtual image with the real scene is more realistic.
First example
The operation of the augmented reality display device 200 according to the second embodiment of the present invention is described below using the example of a cup placed in a real scene with water being poured into it.
The augmented reality display device 200 according to the second embodiment of the present invention is implemented as augmented reality glasses. As shown in Fig. 3, the transparent display screen 110 includes a left lens L and a right lens R, the focus-point determining component 120 includes a left camera L1 and a right camera R1, and the depth-of-field determining component 225 includes a left depth camera L2 and a right depth camera R2.
In this example, the eyes look at the cup, and the augmented reality display device 200 according to the second embodiment of the present invention can generate a virtual image of water being poured into the cup.
The focus-point determining component 120 determines the two-dimensional planar position of the user's eye focus point and the distance of the eye focus point from the transparent display screen 110 according to the images of the left and right eyeballs captured by the left camera L1 and the right camera R1 and the positions of the left camera L1 and the right camera R1 relative to the left eye and the right eye. For example, the eyes may gaze at different parts of the cup, in which case the determined eye focus point differs accordingly.
The depth-of-field determining component 225 determines the depth of field at the position of the eye focus point according to the left-eye image and the right-eye image captured by the left depth camera L2 and the right depth camera R2, the positions of the left depth camera L2 and the right depth camera R2, and the position of the eye focus point. Determining the depth of field at the eye focus point by means of the depth-of-field determining component 225 further ensures the accuracy of the estimated depth of the cup.
Fig. 5A shows a schematic diagram of eye-focus-point determination in this example.
By using the augmented reality display device 200 according to the second embodiment of the present invention, the relative positional relationship between the virtual image of the displayed digital image and the cup can always be maintained as the cup moves, i.e., the water keeps pouring into the cup. As shown in Figs. 5B and 5C, as the cup moves from left to right, water is always poured into it.
In addition, by using the augmented reality display device 200 according to the second embodiment of the present invention, slight changes in the eye focus point while the cup itself is stationary, for example from the left side of the cup to the right side, can also be tracked, so that while water keeps pouring into the cup, the exact pouring position within the cup is finely adjusted. As shown in Figs. 5D and 5E, as the gaze shifts slightly across the cup, for example from left to right, the position at which water is poured into the cup changes slightly as well, correspondingly from left to right.
Although in this example water is always poured into the cup, it should be understood that the invention is not limited thereto. Depending on the actual situation, the pouring water flow may also be generated according to the position of the eye focus point alone; for example, the water may not fall into the cup but flow onto the desktop outside it, and a digital image corresponding to a virtual image of a pool of water forming on the desktop outside the cup can further be generated.
Second example
The operation of the augmented reality display device 200 according to the second embodiment of the present invention is described below using the example of a right hand in the real scene making a throwing motion to pitch a ball at a wall, with the wall bouncing the ball back.
In this example, the augmented reality display device 200 according to the second embodiment of the present invention can generate a virtual image corresponding to the trajectory of the ball.
The focus-point determining component 120 determines the two-dimensional planar position of the user's eye focus point and the distance of the eye focus point from the transparent display screen 110 according to the images of the left and right eyeballs captured by the left camera L1 and the right camera R1 and the positions of the left camera L1 and the right camera R1 relative to the left eye and the right eye.
The depth-of-field determining component 225 determines the depth of field at the position of the eye focus point according to the left-eye image and the right-eye image captured by the left depth camera L2 and the right depth camera R2 and the positions of the left depth camera L2 and the right depth camera R2.
For example, when the eyes gaze at the right hand, the focus-point determining component 120 determines that the eye focus point is at the position of the right hand, and the depth-of-field determining component 225 determines the depth of field at the eye focus point (i.e., at the right hand). Accordingly, the display-image generating component 230 and the transparent display screen 110 generate a digital image corresponding to a virtual image of a small ball appearing in the right hand. Then, before the hand pitches, the depth-of-field determining component 225 tracks the position and depth of the right hand as it moves, and the display-image generating component 230 and the transparent display screen 110 generate a digital image corresponding to a virtual image of the ball remaining in the right hand, i.e., producing the visual effect that the ball is always held in the hand.
Then, when the eyes gaze at the wall, the focus-point determining component 120 determines that the eye focus point is at the position of the wall, and the depth-of-field determining component 225 determines the depth of field at the eye focus point (i.e., at the wall) as the depth of field of the motion target.
In addition, when the throwing motion occurs, the depth-of-field determining component 225 can also determine the direction of the pitch according to the left-eye image and the right-eye image captured by the left depth camera L2 and the right depth camera R2 and the positions of the left depth camera L2 and the right depth camera R2.
The display-image generating component 230 determines the trajectory of the ball according to the determined depth of field of the pitching hand, the depth of field of the wall, and the direction of the pitch, and generates the corresponding digital image.
In this example, the cooperation between the focus-point determining component 120 and the depth-of-field determining component 225 is more complex, and still more elaborate cooperation schemes may exist depending on the application. It should be understood that the present invention is not limited to the embodiments specifically described here; any application in which the focus-point determining component 120 and the depth-of-field determining component 225 cooperate to determine the position of the virtual image falls within the scope of the present invention.
Third embodiment
Next, an augmented reality display method 600 according to a third embodiment of the present invention, applied to the augmented reality display device 100 according to the first embodiment of the present invention, will be described with reference to Fig. 6.
The augmented reality display method 600 according to an embodiment of the present invention starts at step S601.
First, in step S610, the position of the eye focus point is obtained; this position includes the two-dimensional planar position of the eye focus point in the field of view and the distance of the eye focus point from the transparent display screen 110.
As previously mentioned, the position of the eye focus point can be obtained by the focus-point determining component 120.
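The distance component of the focus-point position can be related to the vergence of the two eyes. The patent does not give this relation explicitly; the sketch below assumes a symmetric fixation point and a known interpupillary distance (IPD), in which case simple trigonometry gives the distance.

```python
import math

def fixation_distance(ipd_m, vergence_deg):
    """Distance from the eye baseline to a symmetric fixation point.
    Each eye rotates inward by half the vergence angle, so
    D = (IPD / 2) / tan(vergence / 2)."""
    half = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half)

# Vergence angle for a point 1 m straight ahead with a 64 mm IPD,
# then recover the distance from that angle.
v = 2.0 * math.degrees(math.atan(0.032 / 1.0))
d = fixation_distance(0.064, v)
```

Small vergence angles correspond to distant fixation, which is why the accuracy of this estimate degrades quickly beyond a few meters; the stereo depth cameras of the second embodiment address exactly this limitation.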
The focus-point determining component 120 may include a left camera and a right camera, the left camera tracking the movement of the left eyeball and capturing left-eyeball images, the right camera tracking the movement of the right eyeball and capturing right-eyeball images, and the left camera and the right camera being arranged symmetrically about the center between the left eye and the right eye. In other words, left-eyeball images and right-eyeball images are captured by the left camera and the right camera arranged symmetrically about the center between the left eye and the right eye, in order to obtain the position of the eye focus point.
For example, the left camera and right camera can symmetrically be arranged in left eye relative to left eye and right eye
With above right eye, perhaps can relative to left eye and right eye be symmetrically arranged in below left eye and right eye or can be with
Relative to being symmetrically arranged on the outside of left eye and on the outside of right eye for left eye and right eye.
More specifically, can be based on the right side for the left eye eyeball image and the right camera shooting that the left camera is shot
Eye eyeball image is based on the distance between the left camera and the right camera and based on the left camera and a left side
The distance of eye and the right camera calculate the left eye of user and the eye focus point of right eye at a distance from right eye.
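As an illustrative aside, the vergence geometry described above can be sketched numerically. The planar model, the function name, and all parameter values below are assumptions made for the example, not the patent's actual computation:

```python
import math

def focus_distance(ipd_mm, left_gaze_deg, right_gaze_deg):
    # Estimate the distance (mm) from the eye baseline to the binocular
    # focus point. left/right_gaze_deg are each eye's inward rotation
    # from straight ahead, as an eye-tracking camera might report it.
    # Illustrative simplification: both gaze rays and the focus point
    # lie in a single horizontal plane.
    t = math.tan(math.radians(left_gaze_deg)) + math.tan(math.radians(right_gaze_deg))
    if t <= 0:
        raise ValueError("gaze rays do not converge in front of the viewer")
    return ipd_mm / t

# Eyes 64 mm apart, each rotated 2 degrees inward: the gaze rays
# meet a bit less than a metre away.
d = focus_distance(64.0, 2.0, 2.0)
```

Note that as the inward rotation shrinks, the estimated distance grows without bound, which is why eye trackers lose depth precision for far fixations.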
Then, in step S620, the digital image to be displayed on the transparent display screen is generated based on the obtained position of the eye focus point.
Next, in step S630, the digital image is displayed on the transparent display screen 110.
The transparent display screen 110 may include a left transparent display screen and a right transparent display screen. In this case, based on the obtained position of the eye focus point, a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen are generated; the left transparent display screen displays the left-eye digital image for viewing by the viewer's left eye, and the right transparent display screen displays the right-eye digital image for viewing by the viewer's right eye.
In this case, the virtual image of the displayed digital image is located at the eye focus point.
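The per-eye image placement just described can be sketched with simple pinhole geometry. The model, names, and numbers below are illustrative assumptions, not the patent's rendering method:

```python
def screen_offsets(ipd_mm, screen_mm, depth_mm):
    # Horizontal positions (mm, relative to each screen's centre) at
    # which a point on the viewer's midline at distance depth_mm must be
    # drawn on the left and right transparent screens (each screen_mm
    # from its eye) so that the two lines of sight fuse into a virtual
    # image at depth_mm. Pinhole-eye sketch only.
    shift = (ipd_mm / 2.0) * (1.0 - screen_mm / depth_mm)
    return (-shift, shift)  # (left-screen x, right-screen x)

# Virtual image 1 m away, 64 mm interpupillary distance, screens 30 mm
# from the eyes: each eye's image is shifted about 31 mm inward.
left_x, right_x = screen_offsets(64.0, 30.0, 1000.0)
```

In this toy model the offsets shrink to zero when the target depth equals the screen distance, and approach half the interpupillary distance as the target recedes to infinity.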
In the augmented reality display method 600 according to an embodiment of the present invention, the digital image (that is, the virtual image) to be shown on the transparent display screen 110 is generated according to the position of the user's eye focus point, so that the fusion of the virtual image with the real scene is more realistic.
Fourth embodiment
Next, an augmented reality display method 700 according to a fourth embodiment of the present invention will be described with reference to Figure 7; it adds a step S715 to the augmented reality display method 600 according to the third embodiment of the present invention.
Step S710 in Fig. 7 is identical to step S610 in Fig. 6 and is not repeated here.
In step S715, the expected position of the virtual image of the digital image is obtained; the expected position includes an expected depth of field.
As described above, the expected position of the virtual image of the digital image can be obtained by the depth of field determination component.
The depth of field determination component may include a left depth camera and a right depth camera: the left depth camera captures a left-eye image, the right depth camera captures a right-eye image, and the left depth camera and the right depth camera are arranged symmetrically with respect to the left eye and the right eye. In other words, the left-eye image and the right-eye image are captured by a left depth camera and a right depth camera arranged symmetrically with respect to the left and right eyes, so as to obtain the expected position of the virtual image of the digital image.
For example, the left depth camera and the right depth camera may be arranged symmetrically above the left eye and the right eye, symmetrically below the left eye and the right eye, or symmetrically outside the left eye and outside the right eye.
More specifically, based on the left-eye image captured by the left depth camera and the right-eye image captured by the right depth camera, and on the distance between the left depth camera and the right depth camera, the depth of field at at least a first object in the real scene is calculated, for example the position of the cup, of the hand, or of the wall in the example above.
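The depth computation described above is, in outline, standard two-view triangulation, Z = f·B/d. A minimal sketch, where the function name and all parameter values are illustrative assumptions:

```python
def object_depth(focal_px, baseline_mm, x_left_px, x_right_px):
    # Depth of a scene point from a rectified stereo pair using the
    # standard relation Z = f * B / d, where d = x_left - x_right is the
    # horizontal disparity in pixels, f the focal length in pixels, and
    # B the baseline between the two depth cameras.
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("zero or negative disparity: point at infinity or bad match")
    return focal_px * baseline_mm / disparity

# A cup imaged at x = 420 px by the left depth camera and x = 380 px by
# the right one, with f = 800 px and a 60 mm baseline, lies 1200 mm away.
z_mm = object_depth(800.0, 60.0, 420.0, 380.0)
```

The same relation shows why a wider baseline improves depth resolution: a given depth change produces a larger disparity change.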
In step S720, the digital image to be displayed on the transparent display screen is generated based on the obtained position of the eye focus point and on the expected position of the virtual image.
As one example, the expected position of the virtual image may include the depth of field at the eye focus point; in this case, the virtual image of the digital image is located at the eye focus point.
As another example, the expected position of the virtual image may be at at least a first object in the real scene, the position of the at least first object differing from that of the eye focus point; in this case, the virtual image of the digital image is located at the at least first object in the real scene.
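The two examples above amount to a small selection rule for the rendering depth. A hypothetical sketch (the naming and the framing as a single function are mine, not the patent's):

```python
def virtual_image_depth(focus_depth_mm, object_depth_mm=None):
    # Depth at which to place the virtual image: the tracked first
    # object's depth when one is being followed (the second example
    # above), otherwise the eye focus point itself (the first example).
    return focus_depth_mm if object_depth_mm is None else object_depth_mm
```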
Then, the operation of step S730 is identical to that of step S630 in Fig. 6 and is not repeated here.
The augmented reality display device and display method according to the embodiments of the present invention can generate the digital image according to the position of the eye focus point of the human eye, so that the fusion of the virtual image of the digital image with the real scene is more realistic, enhancing the user experience.
It should be noted that, in this specification, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes the element.
Finally, it should be noted that the above series of processes includes not only processes executed in time sequence in the order described here, but also processes executed in parallel or individually rather than in chronological order.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by software plus the necessary hardware platform, or, of course, entirely by hardware. Based on this understanding, all or part of the contribution of the technical solution of the present invention over the background art can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present invention or in certain parts of the embodiments.
In embodiments of the present invention, units/modules can be implemented in software so as to be executed by various types of processors. For example, an identified executable code module may include one or more physical or logical blocks of computer instructions, which may, for instance, be built as an object, a process, or a function. Nevertheless, the executable code of the identified module need not be physically located together, but may include different instructions stored in different locations which, when logically combined, constitute the unit/module and achieve its stated purpose.
Where a unit/module could be implemented in software, then, considering the level of existing hardware technology and setting cost aside, those skilled in the art could instead build corresponding hardware circuits to achieve the same functions; such hardware circuits include conventional very-large-scale integration (VLSI) circuits or gate arrays, existing semiconductors such as logic chips and transistors, or other discrete elements. Modules can also be implemented with programmable hardware devices, such as field programmable gate arrays, programmable logic arrays, and programmable logic devices.
The present invention has been described in detail above. Specific examples have been used herein to explain the principles and implementations of the present invention, and the above description of the embodiments is only intended to help understand the method of the present invention and its core ideas. At the same time, those of ordinary skill in the art may, according to the idea of the present invention, make changes in specific implementations and application scopes. In conclusion, the contents of this specification should not be construed as limiting the present invention.
Claims (16)
1. An augmented reality display device, comprising:
a transparent display screen for displaying a digital image thereon;
a focus point determination component for obtaining a position of an eye focus point;
a display image generation component for generating, based on the obtained position of the eye focus point, the digital image to be displayed on the transparent display screen; and
a depth of field determination component for obtaining an expected position of a virtual image of the digital image based on a depth of field at at least a first object in a real scene, wherein the depth of field determination component tracks the position and the depth of field of the first object so that the display image generation component generates the digital image corresponding to the first object;
wherein the display image generation component generates the digital image to be displayed on the transparent display screen based on the obtained position of the eye focus point and the expected position of the virtual image.
2. The augmented reality display device of claim 1, wherein the expected position of the virtual image includes the depth of field at the eye focus point.
3. The augmented reality display device of claim 1 or 2, wherein, when the digital image generated by the display image generation component is displayed on the transparent display screen, the virtual image of the digital image is located at the eye focus point.
4. The augmented reality display device of claim 1, wherein, when the digital image generated by the display image generation component is displayed on the transparent display screen, the virtual image of the digital image is located at the expected position of the virtual image.
5. The augmented reality display device of claim 1, wherein
the transparent display screen includes a left transparent display screen for displaying an image for left-eye viewing and a right transparent display screen for displaying an image for right-eye viewing; and
the display image generation component respectively generates a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen.
6. The augmented reality display device of claim 1, wherein the focus point determination component includes at least one camera for tracking the movement of at least one eyeball.
7. The augmented reality display device of claim 6, wherein the at least one camera includes a left camera and a right camera for tracking left-eye movement and right-eye movement respectively, the left camera and the right camera being arranged symmetrically with respect to the left eye and the right eye.
8. The augmented reality display device of claim 1, wherein
the focus point determination component includes a left camera and a right camera for tracking left-eye movement and right-eye movement respectively, the left camera and the right camera being arranged symmetrically with respect to the left eye and the right eye; and
the depth of field determination component includes a left depth camera and a right depth camera arranged symmetrically with respect to the left eye and the right eye,
the depth of field determination component determining the depth of field at the eye focus point according to the distance between the left depth camera and the right depth camera and the images respectively captured by the left depth camera and the right depth camera.
9. The augmented reality display device of claim 8, wherein
the augmented reality display device is a pair of augmented reality glasses; the transparent display screen includes a left transparent display screen and a right transparent display screen implemented as a left lens and a right lens, respectively; the left camera and the left depth camera are mounted near the left lens of the augmented reality glasses; and the right camera and the right depth camera are mounted near the right lens of the augmented reality glasses.
10. An augmented reality display method, comprising:
obtaining a position of an eye focus point;
generating, based on the obtained position of the eye focus point, a digital image to be displayed on a transparent display screen;
displaying the digital image on the transparent display screen; and
obtaining an expected position of a virtual image of the digital image based on a depth of field at at least a first object in a real scene, wherein the position and the depth of field of the first object are tracked so as to generate the digital image corresponding to the first object;
wherein the digital image to be displayed on the transparent display screen is generated based on the obtained position of the eye focus point and the expected position of the virtual image.
11. The augmented reality display method of claim 10, wherein the expected position of the virtual image includes the depth of field at the eye focus point, and the virtual image of the digital image is located at the eye focus point.
12. The augmented reality display method of claim 10, wherein the virtual image of the generated digital image is located at the eye focus point.
13. The augmented reality display method of claim 10, wherein the virtual image of the generated digital image is located at the expected position of the virtual image.
14. The augmented reality display method of claim 10, wherein the transparent display screen includes a left transparent display screen and a right transparent display screen;
wherein generating, based on the obtained position of the eye focus point, the digital image to be displayed on the transparent display screen includes generating a left-eye digital image to be displayed on the left transparent display screen and a right-eye digital image to be displayed on the right transparent display screen; and
wherein displaying the digital image on the transparent display screen includes displaying the left-eye digital image on the left transparent display screen and displaying the right-eye digital image on the right transparent display screen.
15. The augmented reality display method of claim 10, wherein a left-eye eyeball image and a right-eye eyeball image are captured by a left camera and a right camera arranged symmetrically with respect to the left eye and the right eye, so as to obtain the position of the eye focus point.
16. The augmented reality display method of claim 10, wherein a left-eye image and a right-eye image are captured by a left depth camera and a right depth camera arranged symmetrically with respect to the left eye and the right eye, so as to obtain the expected position of the virtual image of the digital image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410090169.0A CN104918036B (en) | 2014-03-12 | 2014-03-12 | Augmented reality display device and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104918036A CN104918036A (en) | 2015-09-16 |
CN104918036B true CN104918036B (en) | 2019-03-29 |
Family
ID=54086688
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410090169.0A Active CN104918036B (en) | 2014-03-12 | 2014-03-12 | Augmented reality display device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104918036B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10453431B2 (en) | 2016-04-28 | 2019-10-22 | Ostendo Technologies, Inc. | Integrated near-far light field display systems |
CN106127171A (en) * | 2016-06-28 | 2016-11-16 | 广东欧珀移动通信有限公司 | Display packing, device and the terminal of a kind of augmented reality content |
CN106131541A (en) * | 2016-08-26 | 2016-11-16 | 广州巧瞳科技有限公司 | Intelligent display device based on augmented reality and method |
CN106444042A (en) * | 2016-11-29 | 2017-02-22 | 北京知境科技有限公司 | Dual-purpose display equipment for augmented reality and virtual reality, and wearable equipment |
CN107024991A (en) * | 2017-04-13 | 2017-08-08 | 长沙职业技术学院 | A kind of glasses system based on Internet of Things |
CN108012147B (en) * | 2017-12-22 | 2019-08-02 | 歌尔股份有限公司 | The virtual image of AR imaging system is away from test method and device |
CN110572632A (en) * | 2019-08-15 | 2019-12-13 | 中国人民解放军军事科学院国防科技创新研究院 | Augmented reality display system, helmet and method based on sight tracking |
CN115280369A (en) * | 2020-05-26 | 2022-11-01 | Oppo广东移动通信有限公司 | Control method of image creation display device and image creation display device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102566049A (en) * | 2010-11-08 | 2012-07-11 | 微软公司 | Automatic variable virtual focus for augmented reality displays |
CN102591016A (en) * | 2010-12-17 | 2012-07-18 | 微软公司 | Optimized focal area for augmented reality displays |
CN102981616A (en) * | 2012-11-06 | 2013-03-20 | 中兴通讯股份有限公司 | Identification method and identification system and computer capable of enhancing reality objects |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8767305B2 (en) * | 2011-08-02 | 2014-07-01 | Google Inc. | Method and apparatus for a near-to-eye display |
Also Published As
Publication number | Publication date |
---|---|
CN104918036A (en) | 2015-09-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104918036B (en) | Augmented reality display device and method | |
EP3414742B1 (en) | Optimized object scanning using sensor fusion | |
KR102068801B1 (en) | Method and apparatus for adjusting virtual reality images | |
AU2013224653B2 (en) | Virtual reality display system | |
US10397539B2 (en) | Compensating 3D stereoscopic imagery | |
CN108885342B (en) | Virtual image generation system and method of operating the same | |
EP3106963B1 (en) | Mediated reality | |
JP7263451B2 (en) | Layered Enhanced Entertainment Experience | |
EP2926554A1 (en) | System and method for generating 3-d plenoptic video images | |
US9338425B2 (en) | Device and method for generating stereoscopic image | |
IL308285A (en) | System and method for augmented and virtual reality | |
US11663689B2 (en) | Foveated rendering using eye motion | |
JP2021518701A (en) | Multifocal plane-based method (MFP-DIBR) for producing a stereoscopic viewpoint in a DIBR system | |
CN113228688B (en) | System and method for creating wallpaper images on a computing device | |
US20230319256A1 (en) | Image Display Control Method, Image Display Control Apparatus, and Head-Mounted Display Device | |
CN106980377B (en) | A kind of interactive system and its operating method of three-dimensional space | |
CN111596763B (en) | Control method and device of virtual reality equipment | |
CN105592306A (en) | Three-dimensional stereo display processing method and device | |
JP2019133207A (en) | Video generation apparatus, video generation method, and video generation program | |
CN102447929B (en) | Display control chip and three-dimensional display equipment | |
US20200049994A1 (en) | Tilted focal plane for near-eye display system | |
Hirzle et al. | Towards a symbiotic human-machine depth sensor: Exploring 3D gaze for object reconstruction | |
KR101904489B1 (en) | Apparatus, method and computer program for generating contents | |
Hirzle et al. | Exploring 3D Gaze for Object Reconstruction | |
CN106125940A (en) | virtual reality interactive interface management method and device |
Legal Events
Code | Title
---|---
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant