CN107209565A - Fixed-size augmented reality object - Google Patents
Fixed-size augmented reality object
- Publication number
- CN107209565A CN201680006372.2A CN201680006372A
- Authority
- CN
- China
- Prior art keywords
- augmented reality
- size
- real world
- eye
- show
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Architecture (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Abstract
An example wearable display system includes: a controller; a left display for displaying a left-eye augmented reality image at a left-eye display coordinate with a left-eye display size; and a right display for displaying a right-eye augmented reality image at a right-eye display coordinate with a right-eye display size, the left-eye and right-eye augmented reality images collectively forming an augmented reality object perceivable by a wearer of the display system at an apparent real-world depth. The controller sets the left-eye display coordinate relative to the right-eye display coordinate as a function of the apparent real-world depth of the augmented reality object. Within a non-scalable range of apparent real-world depths of the augmented reality object, the function maintains an aspect of the left-eye and right-eye display sizes; outside the non-scalable range, the function scales the left-eye and right-eye display sizes with changing apparent real-world depth.
Description
Background
A stereoscopic display can simultaneously present images to the left and right eyes of a viewer. By presenting different views of the same object at different locations in the right-eye and left-eye perspectives, a three-dimensional perception of the object can be achieved.
Summary
This Summary is provided to introduce, in simplified form, a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
An example wearable, head-mounted display system includes: a left near-eye, see-through display configured to display a left-eye augmented reality image at a left-eye display coordinate with a left-eye display size; a right near-eye, see-through display configured to display a right-eye augmented reality image at a right-eye display coordinate with a right-eye display size, the left-eye and right-eye augmented reality images collectively forming an augmented reality object perceivable at an apparent real-world depth by a wearer of the head-mounted display system; and a controller. The controller sets the left-eye display coordinate relative to the right-eye display coordinate as a function of the apparent real-world depth of the augmented reality object. Throughout a non-scalable range of apparent real-world depths of the augmented reality object, the function maintains an aspect of the left-eye and right-eye display sizes; outside the non-scalable range of apparent real-world depths, the function scales the left-eye and right-eye display sizes with changing apparent real-world depth of the augmented reality object.
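The behavior of this scaling function can be sketched in code. The following Python snippet is an illustrative model only; the function name, the clamp-to-boundary rule, and the specific range values are assumptions, not the patent's implementation. Inside the non-scalable depth range the on-screen display size is held constant (so the object keeps a fixed fraction of the field of view), and outside it the display size is scaled with depth, here modeled as perspective scaling from the nearest range boundary.

```python
def display_size(base_size, depth, near, far):
    """On-screen size of an AR image as a function of apparent depth.

    Within the non-scalable range [near, far], the display size is
    maintained, so the object's apparent real-world size grows with
    depth. Outside the range, the display size is scaled inversely
    with depth (perspective from the nearest boundary), so the
    apparent real-world size stays fixed beyond the boundary.
    """
    if near <= depth <= far:
        return base_size  # maintained: constant fraction of the FOV
    boundary = near if depth < near else far
    return base_size * boundary / depth  # scaled with changing depth
```

With a base size of 100 pixels and a non-scalable range of 1 to 5 meters, the image stays 100 pixels across that range, shrinks to 50 pixels at 10 m, and grows to 200 pixels at 0.5 m.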
Brief Description of the Drawings
Fig. 1 shows an example environment including a user wearing a near-eye, see-through display device.
Fig. 2 schematically shows an example near-eye, see-through display device.
Fig. 3 is a diagram schematically illustrating example apparent real-world sizes and depths of an augmented reality object scaled according to a first scaling function.
Fig. 4 is a diagram schematically illustrating example apparent real-world sizes and depths of an augmented reality object scaled according to a second scaling function.
Fig. 5 is a flow chart illustrating a method for displaying an augmented reality object.
Figs. 6A-6E are diagrams illustrating example scaling functions.
Fig. 7 schematically shows a first example view of an augmented reality object.
Fig. 8 schematically shows a second example view of an augmented reality object.
Fig. 9 schematically shows a third example view of an augmented reality object.
Fig. 10 schematically shows a fourth example view of an augmented reality object.
Fig. 11 shows an example computing system.
Fig. 12 shows an example head-mounted display device.
Detailed Description
A near-eye, see-through display device may be configured to display augmented reality images to provide the illusion that augmented reality objects (sometimes referred to as holograms) are present in the real-world environment surrounding the near-eye display device. To mimic how a wearer of the display device perceives real-world objects, a displayed augmented reality object may be scaled in size as its perceived depth changes. However, to preserve the visibility of the augmented reality object, it may be desirable to maintain one or more aspects of the object's size even as its depth changes. Such size preservation can reduce the realism of the object, because the object will not scale exactly as a real-world object would. However, such size preservation can make the object easier to see, for example if the object would be too small or too large if scaled like a real-world object, and/or can provide an increased ability to read content displayed on the object or otherwise interact with it.
According to embodiments disclosed herein, augmented reality content (user interface elements, holographic icons, etc.) may be displayed on a near-eye, see-through display device according to respective scaling functions that define how the size of the augmented reality content is scaled relative to its perceived depth. In some examples, different types of augmented reality content may be sized according to different scaling functions. For example, a user interface control element (such as a cursor) may be maintained at the same perceived size within a certain depth range, while a hologram displayed as part of an immersive game environment may be scaled linearly with changing depth. In this way, a user interface control element can be maintained at a size visible to the user of the display device, even when the element is displayed at a relatively far apparent depth.
As explained above, such scaling functions can also increase a user's ability to visualize content displayed on an augmented reality object. For example, a holographic newspaper lying on a table across the room from the user may itself be visible, but the headline on the newspaper may be legible only if the scaling techniques described above are employed.
As another example, a user may have difficulty noticing the 3D effect of a (simulated) stereoscopic 3D movie playing on a holographic television across the room. With the scaling described herein, the television can become large enough in the user's field of view that he or she can see and appreciate the movie's stereoscopic 3D effect.
As yet another example, when the user walks up close to a fixed-size holographic television object showing a (simulated) stereoscopic 3D movie, the scaling described herein can allow the television's stereoscopic 3D effect to be disabled and replaced with 2D video, to prevent eye strain and maximize viewer comfort. Alternatively, to keep the television from blocking most of the user's field of view when the user is close, the holographic object may simply fade out, leaving the video content.
Fig. 1 shows an example environment 100 in which a user 102 wears a near-eye, see-through display device (here implemented as head-mounted display (HMD) 104). The HMD provides user 102 a see-through view of environment 100. The HMD also displays augmented reality images to the user. In one example, the HMD is a stereoscopic display device in which two separate augmented reality images are each displayed on corresponding left-eye and right-eye displays of the HMD. When viewed by a wearer of the HMD (e.g., user 102), the two augmented reality images collectively form an augmented reality object perceivable by the wearer as part of environment 100. Fig. 1 depicts example augmented reality objects 106a and 106b. It is to be understood, however, that the depicted augmented reality objects are not visible to others in environment 100, and can only be seen by user 102 through HMD 104.
HMD 104 may display augmented reality images such that perceived augmented reality objects are body-locked and/or world-locked. A body-locked augmented reality object moves as the six-degree-of-freedom pose (i.e., 6DOF: x, y, z, pitch, yaw, roll) of HMD 104 changes. As such, a body-locked augmented reality object appears to occupy the same portion of the field of view of user 102, and appears to be at the same distance from user 102, even as the user moves, turns, etc.
On the other hand, a world-locked augmented reality object appears to remain in a fixed location relative to the surrounding environment. Even as the user moves and the user's perspective changes, a world-locked augmented reality object will appear to be in the same location/orientation relative to the surrounding environment. As an example, an augmented reality chess piece may appear to be on the same square of a real-world chessboard regardless of the vantage point from which the user views the board. To support world-locked augmented reality objects, the HMD may track its 6DOF pose as well as a geometric mapping/modeling of surface aspects of the surrounding environment.
According to the present disclosure, the apparent real-world size of an augmented reality object, or of portions of the augmented reality object, may change in accordance with the apparent real-world depth of the object. In other words, as the augmented reality object is displayed at a farther perceived distance, its size may be increased, and as the augmented reality object is displayed at a nearer perceived distance, its size may be reduced. The scaling function may be tuned such that the augmented reality object, or portions of the augmented reality object, occupies the same portion of the user's field of view (FOV) regardless of the perceived distance at which the augmented reality object is displayed. That is, the apparent real-world size of the augmented reality object or a portion thereof may be increased or decreased so as to maintain the same angular size relative to the user.
In the example depicted in Fig. 1, user 102 creates an augmented reality drawing via gesture input. As shown, user 102 has created a first drawing, depicted as augmented reality object 106a, along a first wall 108 that is close to user 102 and HMD 104. One or more aspects of augmented reality object 106a may be set such that augmented reality object 106a is visible to user 102. For example, while the overall size of augmented reality object 106a may be determined according to the user's gesture input, the line thickness of augmented reality object 106a may be set based on the distance between the user and the first wall 108 on which augmented reality object 106a is placed, to ensure that the augmented reality object is visible and to reduce the user's eye strain.
If the apparent depth of the augmented reality object is changed, for example if the augmented reality object is placed such that its apparent depth increases, one or more aspects of the augmented reality object can be maintained to preserve the object's visibility. As shown in Fig. 1, the drawing the user created is moved to a greater apparent depth. The moved drawing, depicted as augmented reality object 106b, is placed on a second wall 110 that is farther from user 102 and HMD 104 than the first wall 108. Accordingly, the apparent real-world depth of the augmented reality object has increased, and thus the apparent real-world size of the augmented reality object is reduced to provide the perception of three-dimensional space. However, the line thickness of the drawing is maintained, preserving the drawing's visibility. As used herein, a maintained line thickness refers to a maintained user-perceived line thickness. In some examples, maintaining the user-perceived line thickness may include adjusting one or more aspects of the actually displayed line.
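The drawing example above can be sketched as follows. This is a hypothetical model (the perspective formula and all names are illustrative, not from the patent): when the drawing moves to a deeper wall, its overall display size follows perspective, while the on-screen stroke width is held constant so the user-perceived line thickness is unchanged.

```python
def move_drawing(display_size, stroke_width, old_depth, new_depth):
    """Rescale a drawing moved from old_depth to new_depth.

    The drawing as a whole shrinks or grows with perspective (display
    size inversely proportional to depth), but the on-screen stroke
    width is maintained so the perceived line thickness, and hence the
    drawing's visibility, stays constant.
    """
    new_size = display_size * old_depth / new_depth  # perspective scaling
    new_stroke = stroke_width                        # maintained aspect
    return new_size, new_stroke
```

For instance, a 300-pixel drawing moved from 2 m to 6 m shrinks to 100 pixels, but its 4-pixel strokes stay 4 pixels wide.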
As demonstrated by Fig. 1, some types of augmented reality objects may be scaled such that one or more aspects (e.g., line thickness) are constant over a range of different apparent depths. Thus, when an object is initially displayed at an apparent depth within that range, or when such an object is moved to an apparent depth within that range, that aspect of the object may be set to a constant predetermined level throughout the range.
Fig. 2 is a schematic diagram 200 showing aspects of a wearable stereoscopic display system 202 including a controller 203. The depicted display system resembles ordinary eyewear, and is a non-limiting example of HMD 104 of Fig. 1. The display system includes a right display 206 and a left display 204. In some embodiments, the right and left displays are wholly or partially transparent from the perspective of the wearer, to give the wearer a clear view of his or her surroundings. This feature enables computerized display imagery to be admixed with imagery from the surroundings, for an illusion of augmented reality.
In some embodiments, display imagery is transmitted in real time to display system 202 from a remote computing system (not shown) operatively coupled to display system 202. The display imagery may be transmitted in any suitable form, i.e., any suitable type of transmission signal and data structure. The signal encoding the display imagery may be carried over a wired or wireless communication link of any kind to controller 203 of the display system. In other embodiments, at least some display-image composition and processing may be enacted in the controller.
Continuing in Fig. 2, each of the right and left displays includes a respective optical system, and controller 203 is operatively coupled to the right and left optical systems. In the illustrated embodiment, the controller is concealed within the display-system frame, along with the right and left optical systems. The controller may include suitable input/output (IO) componentry enabling it to receive display imagery from the remote computing system. The controller may also include position-sensing componentry, e.g., a global positioning system (GPS) receiver, a gyroscopic sensor, or an accelerometer, to assess head orientation and/or movement, etc. When display system 202 is in operation, controller 203 sends appropriate control signals to the right optical system, which cause the right optical system to form a right display image in right display 206. Likewise, the controller sends appropriate control signals to the left optical system, which cause the left optical system to form a left display image in left display 204. The wearer of the display system views the right and left display images with the right and left eyes, respectively. When the right and left display images are composed and presented in an appropriate manner (see below), the wearer experiences the illusion of an augmented reality object at a specified location, with specified 3D content and other display properties. It will be understood that an "augmented reality object" as used herein may be an object of any desired complexity and need not be limited to a singular object. Rather, an augmented reality object may comprise a complete virtual scene having both foreground and background portions. An augmented reality object may also correspond to a portion or locus of a larger augmented reality object.
As shown in Fig. 2, left display 204 and right display 206 (also referred to herein as the left-eye display and the right-eye display) each display a respective augmented reality image (i.e., an image of a tree). Left display 204 is displaying left augmented reality image 208, and right display 206 is displaying right augmented reality image 210. Each of left display 204 and right display 206 may include a suitable display configured to form a display image based on control signals from controller 203, such as an LCD display. Each display includes a plurality of individually addressable pixels arranged on a rectangular grid or other geometry. Each of left display 204 and right display 206 may further include optics for delivering the displayed image to the eyes. Such optics may include waveguides, beam splitters, partially reflective mirrors, etc.
Collectively, left augmented reality image 208 and right augmented reality image 210 create augmented reality object 212 when viewed by a wearer of display system 202. While left augmented reality image 208 and right augmented reality image 210 are depicted as identical in Fig. 2, it will be understood that the left and right augmented reality images may each be identical or may each differ (e.g., each may include the same object, but imaged from slightly different perspectives). Augmented reality object 212 has an apparent real-world size and an apparent real-world depth determined according to the respective sizes of left augmented reality image 208 and right augmented reality image 210 and their positions on the respective displays.
The apparent location of augmented reality object 212 (including its apparent real-world depth (i.e., z-coordinate), apparent real-world lateral position (i.e., x-coordinate), and apparent real-world vertical position (i.e., y-coordinate)) may be described via the display coordinates of each of left and right augmented reality images 208, 210. The apparent size may be described in terms of the display size and apparent depth of the object. As used herein, the display coordinates of an augmented reality image include the x, y locations of each pixel comprising the augmented reality image. The display size of an augmented reality image is indicated by, for example, the number of pixels comprising the augmented reality image, a linear measurement in one or more dimensions, or the proportion of the display occupied by the augmented reality image. Further, as used herein, an augmented reality image refers to the actual image displayed on a display, while an augmented reality object refers to the augmented reality content perceived by the wearer when the wearer of the display system views both the right and left displays. It will be understood that an augmented reality object may include any suitable augmented reality content, including but not limited to graphical user interfaces, user interface control elements, virtual user markings, holograms, animations, video simulations, etc.
To adjust the apparent real-world depth of an augmented reality object, the right display coordinate and/or the left display coordinate may be set relative to each other. For example, to decrease the apparent real-world depth of the augmented reality object, the left and right display coordinates may be set closer to each other. As an example, the tree images may be moved toward the nose on the left and right displays. To increase the apparent real-world depth of the augmented reality object, the left and right display coordinates may be set farther from each other. As an example, the tree images may be moved away from the nose on the left and right displays.
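The relationship between per-eye display coordinates and apparent depth can be illustrated with a simple pinhole-stereo model. This is a sketch under assumed values: the patent does not specify this geometry, and the interpupillary distance and focal length below are merely illustrative.

```python
def eye_display_x(world_x, depth, ipd=0.064, f=0.05):
    """Horizontal display coordinates (left, right) for a point at
    (world_x, depth), with the eyes at x = -ipd/2 and x = +ipd/2.

    As depth decreases, each image coordinate moves toward the nose
    and the disparity grows; as depth increases, each image moves
    away from the nose toward its eye's own axis, and the lines of
    sight approach parallel.
    """
    half = ipd / 2.0
    left = f * (world_x + half) / depth   # left eye at x = -ipd/2
    right = f * (world_x - half) / depth  # right eye at x = +ipd/2
    return left, right

def disparity(depth, ipd=0.064, f=0.05):
    """Left-right coordinate difference; equals f * ipd / depth."""
    left, right = eye_display_x(0.0, depth, ipd, f)
    return left - right
```

For a centered point, the disparity at 1 m is twice the disparity at 2 m, matching the intuition that nearer objects require the two images to be set closer toward the nose.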
To adjust the apparent real-world size of an augmented reality object, the right display size and/or the left display size may be adjusted. For example, the right and/or left display size may be increased to increase the apparent real-world size of the augmented reality object. However, as will be explained in more detail below, the apparent real-world size of an augmented reality object may be the size of the augmented reality object relative to other real-world objects at the same apparent depth. Thus, in some examples, the apparent real-world size of the augmented reality object may be scaled as a function of apparent real-world depth.
The scaling of augmented reality object size as a function of apparent real-world depth (and hence the scaling of the corresponding augmented reality image display sizes) may be performed according to a desired scaling function, explained in more detail below. Briefly, each scaling function may set the left and right display coordinates relative to each other to place the augmented reality object at a desired apparent real-world depth, and may scale one or more aspects of the display size of the augmented reality images based on that apparent real-world depth. Each function may perform the scaling differently, e.g., linearly, non-linearly, only within a certain depth range, or according to another suitable function.
In one example scaling function, outside a non-scalable range of apparent real-world depths, the augmented reality image display size may be scaled linearly with changing apparent real-world depth, while within the non-scalable range of apparent real-world depths, the augmented reality image display size may be maintained. In doing so, the apparent real-world size of the augmented reality object changes with changing apparent real-world depth, such that the augmented reality object is maintained at a constant proportion of the field of view of the wearer of the display system.
Fig. 3 is a diagram 300 schematically illustrating example apparent real-world sizes and depths of an augmented reality object scaled according to a first scaling function. An augmented reality image 302 is displayed on a near-eye, see-through display 304, such as a display included in HMD 104 of Fig. 1 and/or display system 202 of Fig. 2. When viewed by the eye of a user 306, augmented reality image 302 appears as augmented reality object 308. While only one augmented reality image is depicted in Fig. 3, it will be understood that display 304 may include two displays, each displaying a respective augmented reality image. Fig. 3 also includes a timeline 310.
At a first time point T1, augmented reality image 302 is displayed with a first display size DS1 and has display coordinates that place the augmented reality object at a first apparent depth AD1. Owing to this display size and apparent depth, the augmented reality object has a first apparent size AS1.
At a second time point T2, the apparent depth of the augmented reality object has increased, as shown by apparent depth AD2. The first scaling function applied in the example of Fig. 3 specifies that the display size of augmented reality image 302 be maintained when the apparent depth changes; thus, display size DS2 is equal to display size DS1 of time T1. However, because the apparent depth has increased while the display size remains the same, the apparent size of augmented reality object 308 increases, as shown by apparent size AS2. As will be appreciated from Fig. 3, the relative proportion of the user's field of view occupied by the augmented reality image and augmented reality object remains constant from time T1 to time T2.
Fig. 4 is a diagram 400 schematically illustrating example apparent real-world sizes and depths of an augmented reality object scaled according to a second scaling function. Similarly to Fig. 3, an augmented reality image 402 is displayed on a near-eye, see-through display 404, such as a display included in HMD 104 of Fig. 1 and/or display system 202 of Fig. 2. When viewed by the eye of a user 406, augmented reality image 402 appears as augmented reality object 408. While only one augmented reality image is depicted in Fig. 4, it will be understood that display 404 may include two displays, each displaying a respective augmented reality image. Fig. 4 also includes a timeline 410.
At a first time point T1, augmented reality image 402 is displayed with a third display size DS3 and has display coordinates that place the augmented reality object at a third apparent depth AD3. Owing to this display size and apparent depth, the augmented reality object has a third apparent size AS3. In the example depicted in Fig. 4, the third display size DS3 is equal to the first display size DS1 of Fig. 3. Likewise, the third apparent depth AD3 and third apparent size AS3 are respectively equal to the first apparent depth AD1 and first apparent size AS1 of Fig. 3.
At a second time point T2, the apparent depth of the augmented reality object has increased, as shown by apparent depth AD4. The second scaling function applied in the example of Fig. 4 specifies that the display size of augmented reality image 402 be scaled linearly with apparent depth. Thus, display size DS4 is reduced relative to display size DS3 of time T1. As a result, the apparent size of augmented reality object 408 at time T2 remains the same, as shown by AS4; that is, apparent size AS3 at time T1 is equal to apparent size AS4 at time T2. As will be appreciated from Fig. 4, the relative proportion of the user's field of view occupied by the augmented reality image and augmented reality object is reduced at time T2 relative to time T1.
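The contrast between the two figures can be checked numerically with a pinhole-style model. This is an illustrative sketch; the `apparent_size` formula, the viewing-geometry constant, and all numeric values are invented for the example, not taken from the patent.

```python
F = 0.05  # assumed constant relating on-screen size and depth to apparent size

def apparent_size(display_size, depth, f=F):
    """Apparent real-world size attributed to an image of the given
    on-screen size at the given apparent depth (pinhole model)."""
    return display_size * depth / f

# First scaling function (Fig. 3): display size maintained (DS2 == DS1),
# so apparent size grows with depth (AS2 > AS1); FOV fraction is constant.
ds1 = ds2 = 0.01
as1, as2 = apparent_size(ds1, 2.0), apparent_size(ds2, 4.0)
assert as2 > as1

# Second scaling function (Fig. 4): display size scaled linearly with depth
# (DS4 < DS3), so apparent size is constant (AS4 == AS3); FOV fraction shrinks.
ds3 = 0.01
ds4 = ds3 * 2.0 / 4.0
as3, as4 = apparent_size(ds3, 2.0), apparent_size(ds4, 4.0)
assert as3 == as4 and ds4 < ds3
```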
Turning now to Fig. 5, a method 500 for displaying an augmented reality object is shown. Method 500 may be implemented in a wearable, head-mounted stereoscopic display system, such as HMD 104 of Fig. 1 described above, display system 202 of Fig. 2, or HMD 1200 of Fig. 12 discussed below.
At 502, method 500 includes obtaining an augmented reality object to be displayed on the display system. The augmented reality object may include any suitable augmented reality content and may be displayed as part of a graphical user interface, a game, a guidance or assistance system, or any suitable augmented or immersive environment. The augmented reality object may be obtained, in response to user input, execution of a predetermined sequence of a game or other content, or other suitable action, from a remote service, from memory of the display system, or from another suitable source. As explained above, the augmented reality object may include right-eye and left-eye augmented reality images, each configured to be displayed on a respective right-eye or left-eye display of the display system. Accordingly, obtaining the augmented reality object may include obtaining corresponding left-eye and right-eye augmented reality images.
At 504, the method includes determining an augmented reality object type and an associated scaling function. Augmented reality objects may be classified into one or more types. Example types of augmented reality objects include graphical user interfaces, user interface control elements (e.g., cursors, arrows), virtual user markings (e.g., drawings), navigation and/or assistance icons, holograms, and other suitable types. Each type of augmented reality object may have an associated scaling function that defines how the display size of the augmented reality images forming the augmented reality object is scaled according to the apparent real world depth of the object.
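The mapping from object type to scaling function described above can be sketched as follows. This is a hedged, minimal illustration — the type names, function shapes, and fallback choice are assumptions for demonstration, not values from the patent:

```python
# Each augmented reality object type maps to a scaling function that
# takes an apparent real world depth (meters) and returns a relative
# display-size factor.
SCALING_FUNCTIONS = {
    "hologram": lambda depth_m: 1.0 / depth_m,  # mimic real perspective
    "cursor": lambda depth_m: 1.0,              # constant display size
}

def display_scale(object_type, depth_m):
    """Look up the scaling function for an object type and evaluate it.

    Unknown types fall back to perspective scaling (an assumption made
    here for illustration).
    """
    fn = SCALING_FUNCTIONS.get(object_type, SCALING_FUNCTIONS["hologram"])
    return fn(depth_m)
```

A cursor thus keeps the same on-screen size at any depth, while a hologram halves its on-screen size when its apparent depth doubles.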
At 506, the apparent real world depth of the augmented reality object is determined. The augmented reality object is to be displayed at a suitable apparent real world depth. The apparent real world depth of the augmented reality object may be set according to one or more suitable parameters, including but not limited to a user command (e.g., whether the user has issued a gesture, voice, or other command instructing that the augmented reality object be placed at a given position), an association with one or more real world objects, and a preset parameter of the augmented reality object (e.g., the augmented reality object may have a preset depth chosen to reduce eye strain for the user).
At 508, method 500 includes displaying the augmented reality object at the apparent real world depth and with an apparent real world size according to the scaling function. To display the augmented reality object, method 500 includes displaying the left-eye augmented reality image at a left-eye display coordinate on the left near-eye see-through display, with a left-eye display size according to the scaling function, as indicated at 510. In addition, method 500 includes displaying the right-eye augmented reality image at a right-eye display coordinate on the right near-eye see-through display, with a right-eye display size according to the scaling function, as indicated at 512.
As previously explained, the apparent real world depth of the augmented reality object may be indicated by the corresponding right-eye and left-eye display coordinates. The appropriate apparent real world size of the augmented reality object may then be set according to the scaling function as a function of apparent real world depth. For example, the augmented reality object may have a default apparent real world size for a given apparent real world depth. The default size may be based on the type of the augmented reality object, the context and/or environment in which the augmented reality object is placed, user input, and/or other suitable factors. The scaling function may then modify the apparent real world size based on the determined real world depth. To adjust the apparent real world size, the right-eye and left-eye display sizes of the right-eye and left-eye augmented reality images may be adjusted, as explained above.
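The relationship among display size, apparent depth, and apparent real world size can be sketched with a simple pinhole-projection model. This is an assumption-laden illustration, not the patent's method; the focal length in pixels is an invented example value:

```python
def display_size_px(world_size_m, depth_m, focal_px=1500.0):
    # For a fixed apparent real world size, on-screen (display) size
    # falls off inversely with apparent depth.
    return focal_px * world_size_m / depth_m

def world_size_m(display_px, depth_m, focal_px=1500.0):
    # Inverse mapping: the apparent real world size implied by a given
    # display size at a given apparent depth.
    return display_px * depth_m / focal_px
```

Under this model, a scaling function that holds `display_size_px` constant as depth grows necessarily implies a growing apparent real world size, which is exactly the trade-off the method exploits.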
Example scaling functions applicable during execution of method 500 are illustrated in Figs. 6A-6E. Each of plots 601, 603, 605, 607, and 609 depicts augmented reality image display size as a function of the apparent real world depth of the corresponding augmented reality object. These example functions may be applied to one or more dimensions of the augmented reality image (e.g., height or width, or height and width). These example functions may also be applied to another aspect of the augmented reality image, such as line thickness.
A first linear function, illustrated by line 602, linearly (e.g., 1:1) adjusts the display size with changing apparent depth across all apparent depths in the user's visible range. The first linear scaling function may be used to scale augmented reality objects intended to mimic elements in the user's environment (e.g., objects in a game environment). Although a linear function (such as that illustrated by line 602) can accurately represent how an object's perceived size changes as the object's depth changes, it may cause the object to become too small to be perceived accurately, or so large that it occludes the user's field of view.
Another example of a linear scaling function is illustrated by line 604. In the second linear scaling function, the display size of the augmented reality image remains constant, regardless of apparent real world depth. Although this method of setting the size of the augmented reality object is very simple to implement, it may suffer from the same problems as the first linear scaling function, such as the augmented reality object being too small or too large at some depths. Realism is also reduced, because an augmented reality object scaled in this way does not mimic the scaling of real world objects.
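The two linear behaviors described above might be sketched as follows. This is a hedged illustration under the pinhole assumption that mimicking a real object's change in angular size means on-screen size varies inversely with depth; the reference depth and pixel values are invented:

```python
def perspective_display_size(base_px, depth_m, ref_depth_m=1.0):
    # Line 602: display size tracks apparent depth, so the object
    # mimics how a real object's angular size changes with distance.
    return base_px * ref_depth_m / depth_m

def constant_display_size(base_px, depth_m):
    # Line 604: display size ignores apparent depth, so the object
    # keeps the same angular size (field-of-view fraction) everywhere.
    return base_px
```

The first function preserves realism but can shrink an object below legibility at far depths; the second preserves legibility but breaks the real-world scaling cue.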
To avoid the size issues described above while retaining the advantages of linear scaling functions, various piecewise scaling functions may be used. An example of a first piecewise function is illustrated by line 606. Here, the display size is held constant within a first non-scaling range of apparent depths, and is adjusted linearly with changing depth at depths outside the first non-scaling range. Thus, according to the first piecewise scaling function, the left-eye and right-eye display sizes are scaled according to apparent real world depth (e.g., decreasing in size with increasing depth) until the apparent real world depth reaches a first threshold depth T1. The display size then remains constant within the non-scaling depth range, until a second threshold depth T2 is reached. At depths beyond the first non-scaling range, the left-eye and right-eye display sizes are again scaled according to apparent real world depth.
The first piecewise scaling function may be used to scale augmented reality objects that need not relate to real objects or real world environments. Such objects may include user interface control elements such as cursors, graphical interfaces, and virtual user markings such as drawings. By maintaining the display size of the displayed augmented reality images, the apparent real world size of the augmented reality object may be smaller at smaller depths and larger at larger depths, thus occupying the same constant proportion of the user's field of view within the first non-scaling depth range. In doing so, the user can easily view and/or interact with the augmented reality object, even at relatively far depths. Further, by scaling the display size according to depth outside the first non-scaling range, the first piecewise scaling function prevents the augmented reality object from becoming too large and occluding the user's field of view.
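A minimal sketch of this first piecewise scaling function follows. The threshold depths, base size, and the inverse-depth form of the scaled segments are illustrative assumptions chosen so the function is continuous at T1 and T2:

```python
def piecewise_display_size(depth_m, t1=1.0, t2=3.0, base_px=120.0):
    # Line 606 (sketch): display size tracks depth outside the
    # non-scaling band [t1, t2] and is held constant inside it.
    if depth_m < t1:
        return base_px / depth_m            # shrinks as depth approaches t1
    if depth_m <= t2:
        return base_px / t1                 # constant inside the band
    return (base_px / t1) * (t2 / depth_m)  # resumes shrinking beyond t2
```

Inside the band the object occupies a constant fraction of the field of view (its apparent real world size grows with depth); outside the band it scales like a real object, so it never balloons to occlude the view.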
A second piecewise scaling function is illustrated by line 608. The second piecewise scaling function is similar to the first piecewise scaling function, and includes a second non-scaling depth range between the first threshold depth T1 and the second threshold depth T2, within which the display size of the augmented reality image is maintained at a constant size. The second non-scaling depth range may differ from the first non-scaling range; for example, the second non-scaling range may span a larger range of depths than the first non-scaling range.
A third piecewise scaling function is illustrated by line 610. The third piecewise scaling function linearly scales the display size of the augmented reality image according to depth within a scaling depth range, but maintains the display size at one or more constant sizes outside the scaling depth range. For example, the display size may be maintained at a first, relatively large display size at close-range depths, scaled linearly within the scaling depth range, and then maintained at a second, relatively small display size at far-range depths.
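The third piecewise function amounts to a clamped linear interpolation over depth. The following sketch uses invented near/far depths and pixel sizes:

```python
def clamped_display_size(depth_m, near=0.5, far=4.0,
                         max_px=200.0, min_px=40.0):
    # Line 610 (sketch): linear interpolation inside [near, far],
    # clamped to constant display sizes outside the scaling range.
    if depth_m <= near:
        return max_px
    if depth_m >= far:
        return min_px
    t = (depth_m - near) / (far - near)
    return max_px + t * (min_px - max_px)
```

Clamping at both ends bounds the object's angular size: it never grows past `max_px` when very close, and never shrinks below `min_px` when very far.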
Each of the example scaling functions described above may be associated with a corresponding different type of augmented reality object, and may be applied automatically when the associated augmented reality object is displayed. In other examples, a given scaling function may be applied to an augmented reality object in response to a user request or other input.
When more than one augmented reality object is displayed, each displayed augmented reality object may be scaled according to its corresponding scaling function. As a result, some augmented reality objects may be scaled similarly when displayed together, while other augmented reality objects may be scaled differently. As a specific example, a display object that is part of a game (e.g., a holographic asset, such as the holographic tree illustrated in Fig. 2) may be scaled linearly with changing depth at all apparent depths, to mimic how the object would be perceived in the real world. In contrast, a control object (such as a cursor for controlling aspects of the game) may be scaled according to the first piecewise scaling function, to maintain the visibility of the cursor.
Thus, in the above example, the left-eye display coordinates may be set relative to the right-eye display coordinates according to the apparent real world depths of the first and second augmented reality objects. Within a non-scaling range of apparent real world depths, an aspect of the left-eye and right-eye display sizes (e.g., total image size) may be maintained for only the first augmented reality object. Outside the non-scaling range of apparent real world depths, the left-eye and right-eye display sizes may be scaled with changing apparent real world depth for both the first and second augmented reality objects. Within the non-scaling range of apparent real world depths, the left-eye and right-eye display sizes of only the second augmented reality object may still be scaled with changing apparent real world depth.
The scaling functions described above in connection with Figs. 6A-6E are exemplary in nature, and other scaling functions may be used. Any number of scaling functions with constant, linear, or non-linear segments may be used. Different scaling segments of the same function may have different scaling attributes. For example, the slope of a scaling segment before a constant segment may be greater than the slope of a scaling segment after the constant segment.
Other variants of the functions illustrated in Figs. 6A-6E are contemplated. For example, the slope of the first linear function may be smaller or larger than illustrated. In another example, the first piecewise scaling function may still scale in size within the non-scaling depth range, but at a much smaller rate than outside the non-scaling depth range. In doing so, the function may scale at only a fraction of the rate needed to maintain the same angular size, thereby blending two considerations: providing the user a cue that the augmented reality object is moving, while largely maintaining the angular size of the augmented reality object so that the user can more easily view and interact with it. Further, in some examples, the scaling functions may be user-configurable.
Some scaling functions may be limited to minimum and maximum apparent real world sizes, which will cause the angular size of the augmented reality object to appear to change if the user moves more than a corresponding physical distance relative to the object. Scaling operations may be triggered by essentially any change in object positioning, and are not limited to positioning attributable to collisions with other real world or augmented reality objects.
These scaling operations may be applied continuously, periodically, or at a single point in time. For example, a floating user interface element may continuously update its apparent real world size based on its placement against the real world surface the user is gazing at, in order to maintain its angular size (e.g., the proportion of the field of view it occupies), while a line drawn by the user may have its size set to maintain a target angular size based on the distance to the target physical surface on which it is drawn, but thereafter remain unchanged in world-space size.
Further, as an alternative or supplement to adjusting image display size, some scaling functions may adjust other aspects of the displayed augmented reality image. For example, the chroma, color, transparency, lighting effects, and/or feature density of the augmented reality image may be adjusted based on apparent real world depth.
The example scaling functions above have been described in terms of changing the total apparent real world size of the augmented reality object based on apparent real world depth. However, as an alternative or supplement to adjusting total apparent real world size, one or more particular aspects of the augmented reality object may be adjusted. One example aspect that may be adjusted is the line thickness of the augmented reality object, described in more detail below. Another example aspect that may be adjusted is object orientation. For example, an augmented reality object such as a book may be easily viewed from the front. However, when the user views the same object from a side (e.g., 90 degree) angle, the book becomes virtually impossible to read. Thus, the augmented reality object may be automatically rotated to face the user. This effect may be referred to as billboarding. As with scaling effects, billboarding may be keyed to apparent real world depth; for example, billboarding may be performed only within a range of apparent real world depths.
Fig. 7 shows an example view 700 from the perspective of a user looking through a near-eye, see-through display (e.g., HMD 104, display system 202). In view 700, the user can see real world walls 702a, 702b, 702c, 702d and floor 704. In addition to these real world aspects of the surrounding environment, the user can also see a first instance of a virtual user marking augmented reality object, depicted here as a horizontal line 706' on wall 702b, and a second instance of the same augmented reality object, a horizontal line 706" on wall 702d.
In this example, horizontal line 706" is 5 feet away from the user and occupies 0.95 degrees of vertical angular extent. Horizontal line 706" may appear to be one inch high in world-space coordinates. On the other hand, at 10 feet from the user, the identical horizontal line 706' may still occupy 0.95 degrees of vertical angular extent, but appear to be 2 inches high in world-space coordinates. In other words, the line occupies the same proportion of the HMD field of view at different distances, and the line will have the same thickness regardless of the apparent real world depth at which the line is drawn. Maintaining the thickness at different distances may make it easier for the user to perceive augmented reality lines at farther depths.
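The geometry of this example can be checked directly: a one-inch line at 5 feet and a two-inch line at 10 feet subtend the same vertical angle of roughly 0.95 degrees. A small sketch of that calculation:

```python
import math

def vertical_angle_deg(height_inches, distance_feet):
    # Angular extent subtended by an object of the given height at the
    # given viewing distance (exact chord geometry, distance in inches).
    distance_inches = distance_feet * 12.0
    return math.degrees(2.0 * math.atan(height_inches / (2.0 * distance_inches)))
```

Both `vertical_angle_deg(1.0, 5.0)` and `vertical_angle_deg(2.0, 10.0)` evaluate to about 0.955 degrees, matching the 0.95-degree figure quoted above and confirming that doubling the world-space height with the distance preserves angular size.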
In some examples, the length of the horizontal line may be scaled according to depth. As illustrated, the perceived size of horizontal line 706' is shorter than the perceived size of horizontal line 706". However, in other examples, the line length, similar to the line thickness, may be kept constant.
As another example, a user interface control element (depicted here as a cursor) may be displayed according to a piecewise scaling function (such as the first piecewise scaling function described above). Fig. 8 shows a view 800 with a first instance of an augmented reality cursor 802' at a relatively far distance and a second instance of the identical augmented reality cursor 802" at a relatively close distance. In both instances, the augmented reality cursor is maintained at the same proportion of the user's field of view. As explained above, to achieve this, the piecewise scaling function maintains the same display size for the left-eye and right-eye displays of the augmented reality images (including the augmented reality cursor), at least within the non-scaling depth range.
As a further example, the total size of an augmented reality object that includes many constituent elements may be scaled such that the object has a larger corresponding apparent real world size when at a relatively far distance, and a relatively small corresponding apparent real world size when at a relatively close distance. As an example, Fig. 9 shows a view 900 of an augmented reality object with a first instance of a picture 902' at a relatively far distance and a second instance of the same picture 902" at a relatively close distance. The augmented reality object is scaled such that it occupies the same proportion of the HMD field of view at different distances. As a result, picture 902' has a larger apparent real world size than picture 902".
In some examples, an augmented reality object may include a parent object comprising multiple child objects (e.g., sub-objects). For example, the object illustrated in Fig. 9 includes a square box containing two circles. In some examples, scaling functions may be applied differently to different children of a parent augmented reality object. In this way, aspects of a particular child object may be scaled and/or maintained based on depth, while aspects of other child objects are not. In one example, the circles and box may be scaled based on depth, while the thickness of the lines used to render these objects is maintained at the same display size, as illustrated in Fig. 10 and described below. In such an example, the total size of the augmented reality object may remain the same relative to the surrounding environment, while one or more of its constituent elements are scaled. For example, the total size of an icon may appear smaller when displayed at a farther perceived distance, while the thickness of the lines forming the icon appears the same at both near and far perceived distances.
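This parent/child split can be sketched as follows, under invented names and values: the parent geometry scales with depth (constant world size, shrinking display size), while the child stroke width keeps a fixed display size (constant angular thickness):

```python
def render_params(depth_m, base_size_px=300.0, stroke_px=4.0, ref_depth_m=1.0):
    # Parent geometry (box and circles) shrinks on screen with depth;
    # the child line thickness is held at a constant display size.
    return {
        "object_px": base_size_px * ref_depth_m / depth_m,
        "stroke_px": stroke_px,
    }
```

Doubling the depth halves the on-screen footprint of the box while leaving its outline thickness untouched, matching the icon example above.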
As an example, Fig. 10 shows a view 1000 of an augmented reality object with a first instance of a picture 1002' at a relatively far distance and a second instance of the same picture 1002" at a relatively close distance. The augmented reality object is scaled such that the total real world size of the object is consistent at different distances. Thus, the farther instance of picture 1002' occupies less of the HMD field of view than the nearer instance of picture 1002". However, the constituent lines forming these pictures are scaled such that they occupy the same proportion of the HMD field of view at different distances.
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer application program or service, an application programming interface (API), a library, and/or other computer program product.
Fig. 11 schematically shows a non-limiting embodiment of a computing system 1100 that can enact one or more of the methods and processes described above. HMD 104 of Fig. 1, display system 202 of Fig. 2, and/or HMD 1200 of Fig. 12 described below are non-limiting examples of computing system 1100. Computing system 1100 is shown in simplified form. Computing system 1100 may take the form of one or more personal computers, server computers, tablet computers, home entertainment computers, network computing devices, gaming devices, mobile computing devices, mobile communication devices (e.g., smart phones), and/or other computing devices.
Computing system 1100 includes a logic machine 1102 and a storage machine 1104. Computing system 1100 may optionally include a display subsystem 1106, an input subsystem 1108, a communication subsystem 1110, and/or other components not shown in Fig. 11.
Logic machine 1102 includes one or more physical devices configured to execute instructions. For example, the logic machine may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic machine may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic machine may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic machine may be single-core or multi-core, and the instructions executed thereon may be configured for serial, parallel, and/or distributed processing. Individual components of the logic machine optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic machine may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage machine 1104 includes one or more physical devices configured to hold instructions executable by the logic machine to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage machine 1104 may be transformed (e.g., to hold different data).
Storage machine 1104 may include removable and/or built-in devices. Storage machine 1104 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-ray disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage machine 1104 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage machine 1104 includes one or more physical devices. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
Aspects of logic machine 1102 and storage machine 1104 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include, for example, field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs).
The terms "module," "program," and "engine" may be used to describe an aspect of computing system 1100 implemented to perform a particular function. In some cases, a module, program, or engine may be instantiated via logic machine 1102 executing instructions held by storage machine 1104. It will be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms "module," "program," and "engine" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
It will be appreciated that a "service," as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server computing devices.
When included, display subsystem 1106 may be used to present a visual representation of data held by storage machine 1104. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage machine, and thus transform the state of the storage machine, the state of display subsystem 1106 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1106 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic machine 1102 and/or storage machine 1104 in a shared enclosure, or such display devices may be peripheral display devices.
When included, input subsystem 1108 may comprise or interface with one or more user input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 1110 may be configured to communicatively couple computing system 1100 with one or more other computing devices. Communication subsystem 1110 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 1100 to send and/or receive messages to and/or from other devices via a network such as the Internet.
Fig. 12 shows a non-limiting example of a head-mounted, near-eye, see-through display system (also referred to as HMD 1200) in the form of wearable glasses with a see-through display 1202. HMD 1200 is a non-limiting example of HMD 104 of Fig. 1, display system 202 of Fig. 2, and/or computing system 1100 of Fig. 11. The HMD may take any other suitable form in which a transparent, semi-transparent, and/or non-transparent display is supported in front of one or both eyes of a viewer. Further, embodiments described herein may be used with any other suitable computing device, including but not limited to mobile computing devices, laptop computers, desktop computers, tablet computers, other wearable computers, etc. For example, an augmented reality image may be displayed on a mobile phone's display along with real world imagery captured by the mobile phone's camera.
HMD 1200 includes a see-through display 1202 and a controller 1204. See-through display 1202 may enable images such as augmented reality images (also referred to as holographic objects) to be delivered to the eyes of a wearer of the HMD. See-through display 1202 may be configured to visually augment the appearance of a real-world, physical environment to a wearer viewing the physical environment through the transparent display. In one example, the display may be configured to display one or more UI objects of a graphical user interface. In some embodiments, the UI objects presented on the graphical user interface may be virtual objects overlaid in front of the real-world environment. Likewise, in some embodiments, the UI objects presented on the graphical user interface may incorporate elements of real-world objects of the real-world environment seen through see-through display 1202. In other examples, the display may be configured to display one or more other graphical objects, such as virtual objects associated with games, videos, or other visual content.
Any suitable mechanism may be used to display images via see-through display 1202. For example, see-through display 1202 may include image-producing elements located within lenses 1206 (such as, for example, a see-through Organic Light-Emitting Diode (OLED) display). As another example, see-through display 1202 may include a display device located within a frame of HMD 1200 (such as, for example, a liquid crystal on silicon (LCOS) device or OLED microdisplay). In this example, the lenses 1206 may serve as, or otherwise include, a light guide for delivering light from the display device to the eyes of a wearer. Such a light guide may enable a wearer to perceive a 3D holographic image located within the physical environment that the wearer is viewing, while also allowing the wearer to directly view physical objects in the physical environment, thus creating a mixed reality environment. Additionally or alternatively, see-through display 1202 may present left-eye and right-eye augmented reality images via respective left-eye and right-eye displays, as discussed above with respect to Fig. 2.
HMD 1200 may also include various sensors and related systems to provide information to controller 1204. Such sensors may include, but are not limited to, one or more inward-facing image sensors 1208a and 1208b, one or more outward-facing image sensors 1210, an inertial measurement unit (IMU) 1212, and one or more microphones 1220. The one or more inward-facing image sensors 1208a, 1208b may be configured to acquire image data in the form of gaze tracking data from a wearer's eyes (e.g., sensor 1208a may acquire image data for one of the wearer's eyes and sensor 1208b may acquire image data for the other of the wearer's eyes). The HMD may be configured to determine gaze directions of each of a wearer's eyes in any suitable manner based on the information received from the image sensors 1208a, 1208b. For example, one or more light sources 1214a, 1214b, such as infrared light sources, may be configured to cause a glint of light to reflect from the cornea of each eye of the wearer. The one or more image sensors 1208a, 1208b may then be configured to capture an image of the wearer's eyes. Images of the glints and of the pupils, as determined from image data gathered from the image sensors 1208a, 1208b, may be used by controller 1204 to determine an optical axis of each eye. Using this information, controller 1204 may be configured to determine a direction in which the wearer is gazing. Controller 1204 may additionally be configured to determine an identity of a physical and/or virtual object at which the wearer is gazing, by projecting the user's gaze vector onto a 3D model of the surrounding environment.
The one or more outward-facing image sensors 1210 may be configured to measure physical environment attributes of the physical environment in which HMD 1200 is located (e.g., light intensity). Data from the outward-facing image sensors 1210 may be used to detect movements within a field of view of the display 1202, such as gesture-based inputs or other movements performed by the wearer or by a person or physical object within the field of view. In one example, data from the outward-facing image sensors 1210 may be used to detect a selection input performed by the wearer of the HMD, such as a gesture (e.g., a pinching of fingers, a closing of a fist), that indicates selection of a UI object displayed on the display device. Data from the outward-facing sensors may also be used to determine direction/location and orientation data (e.g., from imaging environmental features) that enables position/motion tracking of HMD 1200 in the real-world environment. Data from the outward-facing camera also may be used to construct still images and/or video images of the surrounding environment from the perspective of HMD 1200.
The IMU 1212 may be configured to provide position and/or orientation data for the HMD 1200 to the controller 1204. In one embodiment, the IMU 1212 may be configured as a three-axis or three-degree-of-freedom (3DOF) position sensor system. This example position sensor system may include, for example, three gyroscopes to indicate or measure changes in orientation of the HMD 1200 within 3D space about three orthogonal axes (e.g., roll, pitch, and yaw). The orientation derived from the IMU sensor signals may be used to display, via the see-through display, one or more AR images with a realistic and stable position and orientation.
In another example, the IMU 1212 may be configured as a six-axis or six-degree-of-freedom (6DOF) position sensor system. Such a configuration may include three accelerometers and three gyroscopes to indicate or measure changes in location of the HMD 1200 along three orthogonal spatial axes (e.g., x, y, and z) and changes in device orientation about three orthogonal rotation axes (e.g., yaw, pitch, and roll). In some embodiments, position and orientation data from the outward-facing image sensor 1210 and the IMU 1212 may be used in combination to determine a position and orientation of the HMD 1200.
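As a rough sketch of what gyroscope-based orientation tracking involves, the change in orientation can be approximated by integrating the three angular rates over small time steps. This is deliberately simplified (Euler angles, a hypothetical 100 Hz sample rate); production trackers typically use quaternions and fuse accelerometer and camera data to bound drift:

```python
import numpy as np

def integrate_gyro(orientation, gyro_rates, dt):
    """Advance a (roll, pitch, yaw) estimate by Euler-integrating
    three-axis gyroscope rates (rad/s) over a small time step.

    Illustrative only: a real tracker uses quaternions and fuses
    accelerometer/camera data to bound gyroscope drift.
    """
    return orientation + np.asarray(gyro_rates) * dt

# A steady 90 deg/s yaw rotation sampled at 100 Hz for one second
pose = np.zeros(3)  # roll, pitch, yaw in radians
for _ in range(100):
    pose = integrate_gyro(pose, [0.0, 0.0, np.radians(90.0)], 0.01)
# pose[2] is approximately pi/2: the head has turned a quarter turn
```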
The HMD 1200 may also support other suitable positioning technologies, such as GPS or other global navigation systems. Further, while specific examples of position sensor systems have been described, it will be appreciated that any other suitable position sensor system may be used. For example, head pose and/or movement data may be determined based upon sensor information from any combination of sensors worn on and/or external to the wearer, including but not limited to any number of gyroscopes, accelerometers, inertial measurement units, GPS devices, barometers, magnetometers, cameras (e.g., visible-light cameras, infrared-light cameras, time-of-flight depth cameras, structured-light depth cameras, etc.), communication devices (e.g., WIFI antennas/interfaces), etc.
Continuing with Figure 12, the controller 1204 may be configured to record, over time, multiple eye-gaze samples based on information detected by the one or more inward-facing image sensors 1208a, 1208b. For each eye-gaze sample, eye-tracking information (and, in some embodiments, head-tracking information from the image sensor 1210 and/or the IMU 1212) may be used to estimate an origin point and a direction vector for that sample, producing an estimated location at which the eye gaze intersects the see-through display. Examples of the eye-tracking information and head-tracking information used to determine an eye-gaze sample may include eye-gaze direction, head orientation, eye-gaze velocity, eye-gaze acceleration, change in eye-gaze angle, and/or any other suitable tracking information. In some embodiments, eye-gaze tracking may be recorded independently for both eyes of the wearer of the HMD 1200.
The controller 1204 may be configured to use information from the outward-facing image sensor 1210 to generate or update a three-dimensional model of the surrounding environment. Additionally or alternatively, information from the outward-facing image sensor 1210 may be communicated to a remote computer responsible for generating and/or updating the model of the surrounding environment. In either case, the relative position and/or orientation of the HMD with respect to the surrounding environment may be evaluated so that augmented-reality images can be accurately displayed at desired real-world locations with desired orientations.
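Displaying an image at a desired real-world location amounts to transforming world-anchored points into HMD-relative coordinates using the evaluated pose. The following is a deliberately simplified top-down, yaw-only 2D sketch (a full system applies the complete 6DOF pose; all values are illustrative):

```python
import numpy as np

def world_to_hmd(point_world, hmd_pos, hmd_yaw):
    """Transform a world-anchored 2D point into HMD-relative coordinates
    so that a world-locked hologram stays fixed as the wearer moves.

    Simplified to a top-down, yaw-only view; a real renderer would use
    the full 6DOF pose and the display's projection parameters.
    """
    c, s = np.cos(-hmd_yaw), np.sin(-hmd_yaw)
    rotation = np.array([[c, -s],
                         [s,  c]])
    return rotation @ (np.asarray(point_world) - np.asarray(hmd_pos))

# Hologram anchored 2 m ahead of the wearer's starting position
anchor = [0.0, 2.0]
view0 = world_to_hmd(anchor, [0.0, 0.0], 0.0)  # straight ahead, 2 m away
view1 = world_to_hmd(anchor, [0.0, 1.0], 0.0)  # wearer stepped 1 m closer
```

Because the anchor is fixed in world coordinates, its HMD-relative position changes as the wearer moves, which is exactly what makes the hologram appear world-locked.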
As described above, the HMD 1200 may also include one or more microphones, such as microphone 1220, that capture audio data. In some examples, the one or more microphones 1220 may comprise a microphone array including two or more microphones. For example, the microphone array may include four microphones: two positioned above the right lens of the HMD, and the other two positioned above the left lens of the HMD. Further, audio output may be presented to the wearer via one or more speakers, such as speaker 1222.
The controller 1204 may include a logic machine and a storage machine in communication with the display and the various sensors of the HMD, as discussed above in more detail with respect to Figure 11.
An example wearable, head-mounted display system comprises: a left near-eye, see-through display configured to display a left-eye augmented-reality image with a left-eye display size at left-eye display coordinates; a right near-eye, see-through display configured to display a right-eye augmented-reality image with a right-eye display size at right-eye display coordinates, the left-eye augmented-reality image and the right-eye augmented-reality image collectively forming an augmented-reality object perceivable at an apparent real-world depth by a wearer of the head-mounted display system; and a controller. The controller sets the relation of the left-eye display coordinates relative to the right-eye display coordinates as a function of the apparent real-world depth of the augmented-reality object, the function maintaining an aspect of the left-eye display size and the right-eye display size within a non-scalable range of apparent real-world depths of the augmented-reality object, and the function, outside the range of apparent real-world depths, scaling the left-eye display size and the right-eye display size with changing apparent real-world depth of the augmented-reality object. Additionally or alternatively, such an example includes wherein the augmented-reality object comprises a virtual user marking. Additionally or alternatively, such an example includes wherein maintaining the aspect of the left-eye display size and the right-eye display size comprises maintaining a line thickness of the virtual user marking within the non-scalable range. Additionally or alternatively, such an example includes scaling a line length of the virtual user marking according to the apparent real-world depth within the non-scalable range. Additionally or alternatively, such an example includes wherein the function decreases a distance between the left-eye display coordinates and the right-eye display coordinates with decreasing apparent real-world depth. Additionally or alternatively, such an example includes wherein maintaining the aspect of the left-eye display size and the right-eye display size within the non-scalable range of apparent real-world depths comprises changing an apparent real-world size of a corresponding aspect of the augmented-reality object within the non-scalable range of apparent real-world depths such that the augmented-reality object occupies a constant proportion of the wearer's field of view. Additionally or alternatively, such an example includes wherein the augmented-reality object comprises a user-interface control element. Additionally or alternatively, such an example includes wherein the function decreases the left-eye display size and the right-eye display size at apparent real-world depths greater than the non-scalable range, and increases the left-eye display size and the right-eye display size at apparent real-world depths less than the non-scalable range. Additionally or alternatively, such an example includes wherein the augmented-reality object is a first augmented-reality object, and wherein the controller sets the relation of left-eye coordinates of a second augmented-reality object relative to right-eye coordinates of the second augmented-reality object as a second function of an apparent real-world depth of the second augmented-reality object. Additionally or alternatively, such an example includes wherein the second function maintains an aspect of the left-eye display size and the right-eye display size of the second augmented-reality object within a second, different non-scalable range of apparent real-world depths of the second augmented-reality object. Additionally or alternatively, such an example includes wherein the augmented-reality object is a child object of a parent augmented-reality object, and wherein the function scales the left-eye display size and the right-eye display size of the parent augmented-reality object with changing apparent real-world depth of the parent augmented-reality object within the non-scalable range of apparent real-world depths of the parent augmented-reality object. Any or all of the above-described examples may be combined in any suitable manner in various implementations.
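The behaviour of such a function can be illustrated with a minimal sketch: the display size is held constant inside a non-scalable depth range and scales with depth (as a world-fixed object would) outside it. The depth bounds, units, and pixel size below are illustrative assumptions, not values from the disclosure:

```python
def display_size(depth, near=1.0, far=3.0, fixed_size=100.0):
    """Map an augmented-reality object's apparent real-world depth (m)
    to an on-screen display size (px).

    Inside the non-scalable range [near, far] the display size is held
    constant; outside it, the size scales with depth the way a
    world-fixed object would. All numeric values are illustrative.
    """
    if depth < near:
        return fixed_size * near / depth   # nearer than the range: enlarge
    if depth > far:
        return fixed_size * far / depth    # farther than the range: shrink
    return fixed_size                      # within the range: constant

# Constant inside the range, scaled outside it
sizes = [display_size(d) for d in (0.5, 1.0, 2.0, 3.0, 6.0)]
# sizes == [200.0, 100.0, 100.0, 100.0, 50.0]
```

Note the piecewise form is continuous at both range boundaries, so the object does not pop in size as its apparent depth crosses into or out of the non-scalable range.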
Another example provides a method for a wearable, head-mounted display system, the method comprising: displaying, on a left near-eye, see-through display, a left-eye augmented-reality image with a left-eye display size according to a scaling function at left-eye display coordinates; and displaying, on a right near-eye, see-through display, a right-eye augmented-reality image with a right-eye display size according to the scaling function at right-eye display coordinates, the left-eye augmented-reality image and the right-eye augmented-reality image together forming an augmented-reality object perceivable at an apparent real-world depth by a wearer of the head-mounted display system. The scaling function sets the relation of the left-eye display coordinates relative to the right-eye display coordinates according to the apparent real-world depth of the augmented-reality object; the scaling function maintains an aspect of the left-eye display size and the right-eye display size within a non-scalable range of apparent real-world depths of the augmented-reality object; and the scaling function, outside the non-scalable range of real-world depths, scales the left-eye display size and the right-eye display size with changing apparent real-world depth of the augmented-reality object. Additionally or alternatively, such an example includes wherein scaling the left-eye display size and the right-eye display size with changing apparent real-world depth of the augmented-reality object outside the non-scalable range comprises increasing the left-eye display size and the right-eye display size with decreasing apparent real-world depth outside the non-scalable range of real-world depths, and decreasing the left-eye display size and the right-eye display size with increasing apparent real-world depth. Additionally or alternatively, such an example includes wherein maintaining the aspect of the left-eye display size and the right-eye display size within the non-scalable range comprises maintaining the augmented-reality object at a constant proportion of the wearer's field of view within the non-scalable range. Additionally or alternatively, such an example includes wherein maintaining the augmented-reality object at a constant proportion of the wearer's field of view comprises changing a real-world size of the augmented-reality object, relative to real-world objects at the same depth as the augmented-reality object, as the apparent real-world depth of the augmented-reality object changes. Additionally or alternatively, such an example includes wherein the augmented-reality object comprises a virtual user marking, and wherein maintaining the aspect of the left-eye display size and the right-eye display size within the non-scalable range of apparent real-world depths comprises maintaining a line thickness of the virtual user marking. Any or all of the above-described examples may be combined in any suitable manner in various implementations.
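The trade-off described above, keeping the object at a constant proportion of the field of view by letting its real-world size change with depth, can be sketched as follows. The field-of-view and fraction values are illustrative assumptions:

```python
import math

def world_width_for_fov_fraction(depth, fov_fraction=0.10, fov_deg=30.0):
    """Real-world width an augmented-reality object must take on at a
    given apparent depth (m) to keep occupying a fixed fraction of a
    wearer's horizontal field of view. The field-of-view angle and
    fraction are illustrative assumptions.
    """
    half_angle = math.radians(fov_deg * fov_fraction) / 2.0
    return 2.0 * depth * math.tan(half_angle)

# Doubling the apparent depth doubles the required real-world width,
# so the object's share of the field of view stays constant.
w1 = world_width_for_fov_fraction(1.0)
w2 = world_width_for_fov_fraction(2.0)
```

This is the inverse of how physical objects behave: a real object keeps its world size and shrinks in the field of view with distance, whereas here the world size grows linearly with depth so the angular size never changes.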
Another example provides a wearable, head-mounted display system comprising: a left near-eye, see-through display configured to display a first left-eye augmented-reality image and a second left-eye augmented-reality image, the first and second left-eye augmented-reality images displayed with different left-eye display sizes at different left-eye display coordinates; a right near-eye, see-through display configured to display a first right-eye augmented-reality image and a second right-eye augmented-reality image, the first and second right-eye augmented-reality images displayed with different right-eye display sizes at different right-eye display coordinates, the first left-eye and first right-eye augmented-reality images collectively forming a first augmented-reality object, and the second left-eye and second right-eye augmented-reality images collectively forming a second augmented-reality object, the first and second augmented-reality objects perceivable at respective apparent real-world depths by a wearer of the head-mounted display system; and a controller. The controller sets the relation of the left-eye display coordinates relative to the right-eye display coordinates as a function of the apparent real-world depths of both the first and second augmented-reality objects, the function maintaining an aspect of the left-eye display size and the right-eye display size within a non-scalable range of apparent real-world depths only for the first augmented-reality object, the function scaling the left-eye display size and the right-eye display size with changing apparent real-world depths of both the first and second augmented-reality objects outside the non-scalable range of apparent real-world depths, and the function scaling the left-eye display size and the right-eye display size with changing apparent real-world depth within the non-scalable range only for the apparent real-world depth of the second augmented-reality object. Additionally or alternatively, such an example includes wherein the first augmented-reality object comprises a user-interface control element, and wherein the second augmented-reality object comprises a holographic game element. Additionally or alternatively, such an example includes wherein the first augmented-reality object is a child of the second augmented-reality object. Additionally or alternatively, such an example includes wherein the function comprises a first piecewise function applied to the first augmented-reality object and a second, linear function applied to the second augmented-reality object. Any or all of the above-described examples may be combined in any suitable manner in various implementations.
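The relation between an object's apparent depth and its left-eye/right-eye display coordinates can be sketched with a simple pinhole model: for a point straight ahead, the coordinate separation approaches the interpupillary distance at large depths and shrinks as the apparent depth decreases toward the display plane. The interpupillary distance and display-plane distance below are illustrative assumptions:

```python
def eye_display_coords(depth, ipd=0.064, screen_dist=2.0):
    """Horizontal left-eye/right-eye display coordinates (m, on a
    virtual display plane at screen_dist) for a point straight ahead
    at `depth`.

    The coordinate separation is ipd * (1 - screen_dist / depth), so
    it decreases with decreasing apparent depth. The ipd and
    screen_dist values are illustrative assumptions.
    """
    shift = (ipd / 2.0) * (1.0 - screen_dist / depth)
    return -shift, +shift

left_far, right_far = eye_display_coords(100.0)  # nearly full separation
left_near, right_near = eye_display_coords(2.5)  # much smaller separation
```

A controller applying different scaling functions to different objects would evaluate this coordinate relation per object, at each object's own apparent depth.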
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, the various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems, and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
Claims (15)
1. A wearable, head-mounted display system, comprising:
a left near-eye, see-through display configured to display a left-eye augmented-reality image with a left-eye display size at left-eye display coordinates;
a right near-eye, see-through display configured to display a right-eye augmented-reality image with a right-eye display size at right-eye display coordinates, the left-eye augmented-reality image and the right-eye augmented-reality image collectively forming an augmented-reality object perceivable at an apparent real-world depth by a wearer of the head-mounted display system; and
a controller, the controller setting the relation of the left-eye display coordinates relative to the right-eye display coordinates as a function of the apparent real-world depth of the augmented-reality object, the function maintaining an aspect of the left-eye display size and the right-eye display size within a non-scalable range of apparent real-world depths of the augmented-reality object, and the function, outside the range of apparent real-world depths, scaling the left-eye display size and the right-eye display size with changing apparent real-world depth of the augmented-reality object.
2. The display system of claim 1, wherein the augmented-reality object comprises a virtual user marking.
3. The display system of claim 2, wherein maintaining the aspect of the left-eye display size and the right-eye display size comprises maintaining a line thickness of the virtual user marking within the non-scalable range.
4. The display system of claim 3, further comprising scaling a line length of the virtual user marking according to the apparent real-world depth within the non-scalable range.
5. The display system of claim 1, wherein the function decreases a distance between the left-eye display coordinates and the right-eye display coordinates with decreasing apparent real-world depth.
6. The display system of claim 1, wherein maintaining the aspect of the left-eye display size and the right-eye display size within the non-scalable range of apparent real-world depths comprises changing an apparent real-world size of a corresponding aspect of the augmented-reality object within the non-scalable range of apparent real-world depths such that the augmented-reality object occupies a constant proportion of the wearer's field of view.
7. The display system of claim 1, wherein the augmented-reality object comprises a user-interface control element.
8. The display system of claim 1, wherein the function decreases the left-eye display size and the right-eye display size at apparent real-world depths greater than the non-scalable range, and increases the left-eye display size and the right-eye display size at apparent real-world depths less than the non-scalable range.
9. The display system of claim 1, wherein the augmented-reality object is a first augmented-reality object, and wherein the controller sets the relation of left-eye coordinates of a second augmented-reality object relative to right-eye coordinates of the second augmented-reality object as a second function of an apparent real-world depth of the second augmented-reality object.
10. The display system of claim 9, wherein the second function maintains an aspect of the left-eye display size and the right-eye display size of the second augmented-reality object within a second, different non-scalable range of apparent real-world depths of the second augmented-reality object.
11. The display system of claim 1, wherein the augmented-reality object is a child of a parent augmented-reality object, and wherein the function scales the left-eye display size and the right-eye display size of the parent augmented-reality object with changing apparent real-world depth of the parent augmented-reality object within the non-scalable range of apparent real-world depths of the parent augmented-reality object.
12. A method for a wearable, head-mounted display system, comprising:
displaying, on a left near-eye, see-through display, a left-eye augmented-reality image with a left-eye display size according to a scaling function at left-eye display coordinates;
displaying, on a right near-eye, see-through display, a right-eye augmented-reality image with a right-eye display size according to the scaling function at right-eye display coordinates, the left-eye augmented-reality image and the right-eye augmented-reality image collectively forming an augmented-reality object perceivable at an apparent real-world depth by a wearer of the head-mounted display system;
the scaling function setting the relation of the left-eye display coordinates relative to the right-eye display coordinates according to the apparent real-world depth of the augmented-reality object;
the scaling function maintaining an aspect of the left-eye display size and the right-eye display size within a non-scalable range of apparent real-world depths of the augmented-reality object; and
the scaling function, outside the non-scalable range of real-world depths, scaling the left-eye display size and the right-eye display size with changing apparent real-world depth of the augmented-reality object.
13. The method of claim 12, wherein scaling the left-eye display size and the right-eye display size with changing apparent real-world depth of the augmented-reality object outside the non-scalable range comprises increasing the left-eye display size and the right-eye display size with decreasing apparent real-world depth outside the non-scalable range of real-world depths, and decreasing the left-eye display size and the right-eye display size with increasing apparent real-world depth.
14. The method of claim 12, wherein maintaining the aspect of the left-eye display size and the right-eye display size within the non-scalable range comprises maintaining the augmented-reality object at a constant proportion of the wearer's field of view within the non-scalable range by changing a real-world size of the augmented-reality object, relative to real-world objects at the same depth as the augmented-reality object, as the apparent real-world depth of the augmented-reality object changes.
15. The method of claim 12, wherein the augmented-reality object comprises a virtual user marking, and wherein maintaining the aspect of the left-eye display size and the right-eye display size within the non-scalable range of apparent real-world depths comprises maintaining a line thickness of the virtual user marking.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562105672P | 2015-01-20 | 2015-01-20 | |
US62/105,672 | 2015-01-20 | ||
US14/717,771 | 2015-05-20 | ||
US14/717,771 US9934614B2 (en) | 2012-05-31 | 2015-05-20 | Fixed size augmented reality objects |
PCT/US2016/012778 WO2016118344A1 (en) | 2015-01-20 | 2016-01-11 | Fixed size augmented reality objects |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107209565A true CN107209565A (en) | 2017-09-26 |
CN107209565B CN107209565B (en) | 2020-05-05 |
Family
ID=55349938
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201680006372.2A Active CN107209565B (en) | 2015-01-20 | 2016-01-11 | Method and system for displaying fixed-size augmented reality objects |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107209565B (en) |
WO (1) | WO2016118344A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107592520A (en) * | 2017-09-29 | 2018-01-16 | 京东方科技集团股份有限公司 | The imaging device and imaging method of AR equipment |
CN110928404A (en) * | 2018-09-19 | 2020-03-27 | 未来市股份有限公司 | Tracking system and related tracking method thereof |
CN111083391A (en) * | 2018-10-19 | 2020-04-28 | 舜宇光学(浙江)研究院有限公司 | Virtual-real fusion system and method thereof |
CN111506188A (en) * | 2019-01-30 | 2020-08-07 | 托比股份公司 | Method and HMD for dynamically adjusting HUD |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
DE112014000709B4 (en) | 2013-02-07 | 2021-12-30 | Apple Inc. | METHOD AND DEVICE FOR OPERATING A VOICE TRIGGER FOR A DIGITAL ASSISTANT |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC Acoustic Models |
US10514801B2 (en) | 2017-06-15 | 2019-12-24 | Microsoft Technology Licensing, Llc | Hover-based user-interactions with virtual objects within immersive environments |
US10991138B2 (en) | 2017-12-22 | 2021-04-27 | The Boeing Company | Systems and methods for in-flight virtual reality displays for passenger and crew assistance |
US10523912B2 (en) * | 2018-02-01 | 2019-12-31 | Microsoft Technology Licensing, Llc | Displaying modified stereo visual content |
WO2023044050A1 (en) * | 2021-09-17 | 2023-03-23 | Apple Inc. | Digital assistant for providing visualization of snippet information |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102609942A (en) * | 2011-01-31 | 2012-07-25 | 微软公司 | Mobile camera localization using depth maps |
US20130050432A1 (en) * | 2011-08-30 | 2013-02-28 | Kathryn Stone Perez | Enhancing an object of interest in a see-through, mixed reality display device |
CN102981616A (en) * | 2012-11-06 | 2013-03-20 | 中兴通讯股份有限公司 | Identification method and identification system and computer capable of enhancing reality objects |
CN103136726A (en) * | 2011-11-30 | 2013-06-05 | 三星电子株式会社 | Method and apparatus for recovering depth information of image |
US20130314793A1 (en) * | 2012-05-22 | 2013-11-28 | Steven John Robbins | Waveguide optics focus elements |
US20130326364A1 (en) * | 2012-05-31 | 2013-12-05 | Stephen G. Latta | Position relative hologram interactions |
US20130328927A1 (en) * | 2011-11-03 | 2013-12-12 | Brian J. Mount | Augmented reality playspaces with adaptive game rules |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6198484B1 (en) * | 1996-06-27 | 2001-03-06 | Kabushiki Kaisha Toshiba | Stereoscopic display system |
US9342610B2 (en) * | 2011-08-25 | 2016-05-17 | Microsoft Technology Licensing, Llc | Portals: registered objects as virtualized, personalized displays |
- 2016-01-11 WO PCT/US2016/012778 patent/WO2016118344A1/en active Application Filing
- 2016-01-11 CN CN201680006372.2A patent/CN107209565B/en active Active
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107592520A (en) * | 2017-09-29 | 2018-01-16 | 京东方科技集团股份有限公司 | The imaging device and imaging method of AR equipment |
US10580214B2 (en) | 2017-09-29 | 2020-03-03 | Boe Technology Group Co., Ltd. | Imaging device and imaging method for augmented reality apparatus |
CN107592520B (en) * | 2017-09-29 | 2020-07-10 | 京东方科技集团股份有限公司 | Imaging device and imaging method of AR equipment |
CN110928404A (en) * | 2018-09-19 | 2020-03-27 | 未来市股份有限公司 | Tracking system and related tracking method thereof |
CN110928404B (en) * | 2018-09-19 | 2024-04-19 | 未来市股份有限公司 | Tracking system and related tracking method thereof |
CN111083391A (en) * | 2018-10-19 | 2020-04-28 | 舜宇光学(浙江)研究院有限公司 | Virtual-real fusion system and method thereof |
CN111506188A (en) * | 2019-01-30 | 2020-08-07 | 托比股份公司 | Method and HMD for dynamically adjusting HUD |
Also Published As
Publication number | Publication date |
---|---|
WO2016118344A1 (en) | 2016-07-28 |
CN107209565B (en) | 2020-05-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9934614B2 (en) | Fixed size augmented reality objects | |
CN107209565A (en) | The augmented reality object of fixed size | |
JP6730286B2 (en) | Augmented Reality Object Follower | |
US10670868B2 (en) | Identification of augmented reality image display position | |
US10304247B2 (en) | Third party holographic portal | |
CN106537261B (en) | Holographic keyboard & display | |
US9824499B2 (en) | Mixed-reality image capture | |
US10078917B1 (en) | Augmented reality simulation | |
US10373392B2 (en) | Transitioning views of a virtual model | |
EP3532177B1 (en) | Virtual object movement | |
EP3137982B1 (en) | Transitions between body-locked and world-locked augmented reality | |
US10127725B2 (en) | Augmented-reality imaging | |
US20190065026A1 (en) | Virtual reality input | |
US10134174B2 (en) | Texture mapping with render-baked animation | |
EP3106963B1 (en) | Mediated reality | |
US12022357B1 (en) | Content presentation and layering across multiple devices | |
US20180046352A1 (en) | Virtual cursor movement | |
CN107810634A (en) | Display for three-dimensional augmented reality | |
US20160320833A1 (en) | Location-based system for sharing augmented reality content | |
CN111670465A (en) | Displaying modified stereoscopic content | |
US20180330546A1 (en) | Wind rendering for virtual reality computing device | |
CN111602391B (en) | Method and apparatus for customizing a synthetic reality experience from a physical environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||