CN103761763B - Method for constructing an augmented reality environment using pre-computed lighting - Google Patents

Method for constructing an augmented reality environment using pre-computed lighting

Info

Publication number
CN103761763B
CN103761763B (application CN201310757195.XA)
Authority
CN
China
Prior art keywords
augmented reality
virtual structure
fragment
precalculated
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310757195.XA
Other languages
Chinese (zh)
Other versions
CN103761763A (en)
Inventor
J·斯蒂德
A·克劳斯
M·斯卡维泽
张炜
A·汤姆林
T·安布鲁斯
B·芒特
S·拉塔
R·哈斯汀斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Priority to CN201310757195.XA priority Critical patent/CN103761763B/en
Publication of CN103761763A publication Critical patent/CN103761763A/en
Application granted granted Critical
Publication of CN103761763B publication Critical patent/CN103761763B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

Embodiments relating to efficiently constructing an augmented reality environment with global lighting effects are disclosed. For example, one disclosed embodiment provides a method of displaying an augmented reality image via a display device. The method includes receiving image data capturing an image of a local environment of the display device, and identifying a physical feature of the local environment from the image data. The method further includes constructing an augmented reality image of a virtual structure for display over the physical feature, spatially registered with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure fragments arranged in adjacent locations to form a virtual structural feature, each modular virtual structure fragment including a pre-computed global lighting effect, and outputting the augmented reality image to the display device.

Description

Method for constructing an augmented reality environment using pre-computed lighting
Technical field
The present disclosure relates to methods and apparatus for constructing an augmented reality environment using pre-computed lighting.
Background
Adding realistic lighting and shadows to a virtual environment, such as a virtual video game environment, may be computationally expensive. Likewise, render times for such lighting effects may be unacceptably long during game play. For example, creating texture maps ("light maps") that encode realistic lighting (e.g. global illumination) and shadows for a virtual environment may take many hours, or even days, of computation. Such lighting effects are therefore commonly pre-computed for a virtual environment during development, rather than computed in real time during game play.
Dynamic lighting and shadows can be computed more quickly. However, the visual quality of dynamic lighting may be well below that of pre-computed lighting effects. Further, dynamic lighting may consume substantial resources at run time.
Summary of the invention
Embodiments relating to efficiently constructing an augmented reality environment with global lighting effects are disclosed. For example, one disclosed embodiment provides a method of displaying an augmented reality image via a display device. The method includes receiving image data capturing an image of a local environment of the display device, and identifying a physical feature of the local environment from the image data. The method further includes constructing an augmented reality image of a virtual structure for display over the physical feature, spatially registered with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure fragments arranged in adjacent locations to form a virtual structural feature, each modular virtual structure fragment including a pre-computed lighting effect, and outputting the augmented reality image to the display device.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Brief description of the drawings
Fig. 1 shows an exemplary embodiment of a see-through display device in an example use environment.
Fig. 2 shows an embodiment of an augmented reality image in the example use environment of Fig. 1.
Fig. 3 shows an exemplary embodiment of a set of modular virtual structure fragments.
Fig. 4 shows a schematic depiction of a pre-computed lighting effect applied to a portion of a modular virtual structure fragment.
Fig. 5A illustrates the addition of a dynamic point light effect to the augmented reality image of Fig. 2, and Fig. 5B illustrates an example of the effect of the dynamic point light on the portion of the modular virtual structure fragment having the pre-computed lighting effect.
Fig. 6 shows a flow chart depicting an embodiment of a method for constructing a virtual environment fitted to a detected physical environment.
Fig. 7 shows a block diagram of an exemplary embodiment of a see-through display device.
Fig. 8 shows a block diagram of an example embodiment of a computing system.
Detailed description of the invention
As mentioned above, realistic lighting effects for a virtual environment are typically pre-computed after the virtual environment is constructed, and then stored, for example, as light maps for the virtual environment. Such virtual environments are generally built with fixed geometry that is not adapted to a user's surroundings.
In contrast, an augmented reality display system may be configured to adapt virtual imagery to a user's surroundings. For example, an augmented reality video game may fit virtual structures in the game to corresponding physical structures in the user's physical environment. Thus, the geometry of augmented reality image objects may change based upon the user's physical environment.
Because the fitting of an augmented reality environment to a physical environment occurs during real-time use, if high-quality lighting effects are applied to the environment after the environment is created, the lighting computation also must occur at that time. However, if such lighting effects are computed for an augmented reality environment after the augmented reality imagery has been fitted to the physical environment, then, depending upon the particular computing system used to compute the lighting effects, the user may have to wait hours or even days to enjoy the augmented reality experience, due to the computational expense of applying realistic lighting effects. This may result in an unacceptably slow user experience. Further, the appearance of the physical environment may change during such a long period of time. This may cause mismatches between the real world and the virtual world, which may significantly disrupt the augmented reality experience.
As one possible solution, dynamic lighting could be used for the augmented reality environment instead of pre-computed lighting effects. However, as mentioned above, dynamic lighting is likely to be of lower quality than pre-computed lighting, and therefore may not provide as good a user experience. Further, dynamic lighting is computationally expensive at run time, which may reduce the computational budget available for other visual and non-visual aspects of a gaming experience.
Accordingly, the disclosed embodiments relate to efficiently constructing an augmented reality environment with high-quality pre-computed lighting effects that is fitted to the geometry of a local physical environment. Briefly, the disclosed embodiments utilize modular virtual structure fragments that can be arranged in locations adjacent to one another to form virtual structures for an augmented reality image, wherein the modular virtual structure fragments include high-quality pre-computed lighting effects. Because the lighting effects are pre-computed for each modular virtual structure fragment, the lighting effects are incorporated into the virtual structures built from the modular virtual structure fragments. Further, in some embodiments, local lighting features may be detected and used to adjust the appearance of the modular virtual structure fragments. Examples of such local lighting features include, but are not limited to, color characteristics of the local physical environment and positions of light sources.
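By way of a non-limiting illustration, a minimal C++ sketch of how a modular virtual structure fragment carrying baked lighting data might be represented is shown below. The type names, fields, and LightingData layout are hypothetical illustrations and are not taken from the patent.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical baked-lighting payload carried by each fragment.
// A fragment may store a light map, and/or spherical harmonic
// (pre-computed radiance transfer) coefficients.
struct LightingData {
    std::vector<float> lightMapTexels;   // baked global illumination, RGB triples
    std::vector<float> shCoefficients;   // optional SH/PRT coefficients per vertex
    int lightMapWidth = 0;
    int lightMapHeight = 0;
};

// Hypothetical modular virtual structure fragment, e.g. a wall stud
// section, a door frame half, or a window frame half.
struct VirtualStructureFragment {
    std::string name;                          // e.g. "wall_stud", "door_frame_left"
    std::vector<float> vertices;               // mesh positions, xyz triples
    std::vector<std::uint32_t> indices;        // triangle indices
    LightingData bakedLighting;                // pre-computed lighting travels with the fragment
    std::vector<std::string> allowedNeighbors; // simple connection constraints
};
```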
Fig. 1 shows an exemplary embodiment of a use environment 100 for an augmented reality display system, in the form of a living room. A user 102 is shown viewing the living room through a see-through display device 104. Fig. 1 also shows a field of view 103 of the user, which represents the portion of the environment viewable through the see-through display device 104, and thus the portion of the environment that may be augmented with imagery displayed via the see-through display device 104. In some embodiments, the user's field of view 103 may be substantially coextensive with the user's actual field of vision, while in other embodiments the user's field of view 103 may occupy a smaller portion of the user's actual field of vision.
As described in greater detail below, the see-through display device 104 may comprise one or more outward-facing image sensors (e.g. two-dimensional cameras and/or depth cameras) configured to acquire image data representing the use environment 100 (e.g. color/grayscale images, depth images/point cloud data/mesh data, etc.) as the user moves through the environment. This image data may be used to obtain information regarding the layout of the environment and its structural features, such as the ceiling 106 and walls 108, as well as other features.
The see-through display device 104 is further configured to create an augmented reality image by overlaying displayed virtual objects on the physical objects viewable through the device. For example, referring to Fig. 2, an example augmented reality image is illustrated in which a virtual room framing structure 200, including virtual wall studs 202, headers 204, and the like, is displayed as an overlay on the walls as seen by the user. Infrastructure imagery, such as pipes 206, conduit/cables, and any other suitable virtual structures, may likewise be displayed. Similar structures (not shown) may likewise be displayed for the ceiling. Further, imagery may be displayed that corresponds to furniture or other non-structural objects in the room, or that is displayed as occupying empty space in the room. It will be appreciated that, to more fully illustrate the augmented reality environment, the augmented reality imagery depicted in Fig. 2 is not limited to the user's field of view shown in Fig. 1.
The virtual wall framing structure of Fig. 2 is geometrically fitted to the underlying physical structure (e.g. wall 108). Because each user's local physical environment is likely to differ, the entire virtual structure for each player's local physical environment is constructed after acquiring image data of the local physical environment (e.g. stereo depth image data, structured light image data, time-of-flight image data, or other depth image data), rather than being designed in advance. As such, if global lighting effects were built and applied to the virtual structure after the virtual structure is constructed, players might be forced to wait an undesirably long time before play could begin, and might be constrained from changing environments during play (e.g. moving to a different room), as building and lighting the new environment could take an undesirably long time.
Therefore, as mentioned above, the virtual structure 200 is assembled from a set of modular virtual structure fragments having pre-computed lighting effects, wherein instances of the modular virtual structure fragments can be arranged in locations adjacent to one another and processed (e.g. rotated, scaled, etc.) to form the appearance of a unified virtual structure. Fig. 3 shows an exemplary embodiment of a set 300 of modular virtual structure fragments, including a wall stud fragment 302, a wall stud fragment with a connected pipe 304, a wall stud fragment with a horizontal pipe 306, a pair of door frame fragments 308, 310, and a pair of window frame fragments 312, 314. Referring to Fig. 2, it can be seen that the virtual structure 200 can be constructed entirely from instances of virtual wall fragments selected from the set 300, via scaling, rotation, trimming, deformation, and/or other suitable processing based upon the placement of each particular instance.
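As a rough sketch of this assembly step, the following C++ placement loop tiles instances of a stud fragment along a detected wall, scaling the last instance to fill the remainder. The Transform type, the fragment width, and the helper name are illustrative assumptions rather than details from the patent.

```cpp
#include <vector>

// Hypothetical rigid transform for a fragment instance placed in the room.
struct Transform {
    float position[3];
    float rotationYawDegrees; // rotation about the vertical axis to face along the wall
    float scale[3];
};

// Tile stud fragments along a wall of length wallLength (meters), starting at
// wallStart and running along the unit direction (dx, dz). Each stud fragment
// is assumed to be fragmentWidth wide; the final instance is scaled to fit.
std::vector<Transform> TileWallStuds(const float wallStart[3], float dx, float dz,
                                     float wallLength, float yawDegrees,
                                     float fragmentWidth) {
    std::vector<Transform> instances;
    float placed = 0.0f;
    while (placed < wallLength) {
        float remaining = wallLength - placed;
        float width = remaining < fragmentWidth ? remaining : fragmentWidth;
        Transform t{};
        t.position[0] = wallStart[0] + dx * placed;
        t.position[1] = wallStart[1];
        t.position[2] = wallStart[2] + dz * placed;
        t.rotationYawDegrees = yawDegrees;
        t.scale[0] = width / fragmentWidth; // shrink the last instance to fill the gap
        t.scale[1] = 1.0f;
        t.scale[2] = 1.0f;
        instances.push_back(t);
        placed += width;
    }
    return instances;
}
```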
While a relatively simple set of modular virtual structure fragments is depicted in Fig. 3, it will be understood that a set of modular virtual structure fragments may include any suitable number of fragments of any desired complexity. Further, it will be understood that sets of modular virtual structure fragments with pre-computed lighting may be used to construct suitable structures other than virtual structures fitted to walls, to fit any other desired physical features, including but not limited to ceilings, and non-structural features such as furniture, curtains, plants, outdoor objects, countertops, faces, bar counters, and the like. For example, pre-lit modular virtual structure fragments may include pre-lit virtual sofas, tables, televisions, mirrors, and other objects commonly found in physical environments, which may have different shapes and/or appearances in different physical environments. It will be appreciated that, in some instances, such a virtual structure fragment for such an object may comprise a single "fragment," such that a single virtual structure element is resized, rotated, and otherwise processed to fit a desired physical structure without being combined with adjacent fragments. Additionally, it will be appreciated that empty space in a physical environment may be considered a physical feature of the environment, and that modular virtual structure fragments may be arranged to construct virtual objects in unoccupied portions of space in a room or other use environment.
Some modular virtual structure fragments may include connection constraints that restrict the set of other fragments to which the fragment may be connected. For example, in Fig. 3, each window fragment and door fragment may be connectable on one side (e.g. the window/door side) only to another window or door fragment, and not to a wall stud fragment 302, as the fragments might otherwise mate improperly. Further, the connected pipe 304 and horizontal pipe 306 fragments may be constrained to connect to fragments having complementary pipe sections. It will be understood that these connection constraints are described for the purpose of illustration, and that any other suitable connection constraints may be used.
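Such connection constraints could be enforced with a check along the lines of the following sketch, which reuses the hypothetical allowedNeighbors field introduced above; this is only one possible encoding of such constraints.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Returns true if fragment 'a' may be placed adjacent to fragment 'b',
// given each fragment's list of allowed neighbor names. An empty list is
// treated here as "no restriction" (an illustrative convention).
bool CanConnect(const std::vector<std::string>& aAllowed, const std::string& aName,
                const std::vector<std::string>& bAllowed, const std::string& bName) {
    auto allows = [](const std::vector<std::string>& allowed, const std::string& name) {
        return allowed.empty() ||
               std::find(allowed.begin(), allowed.end(), name) != allowed.end();
    };
    return allows(aAllowed, bName) && allows(bAllowed, aName);
}
```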
Any suitable pre-computed lighting effects may be applied to the modular virtual structure fragments. For example, in some embodiments, a set of modular virtual structure fragments may be intended for use in any local lighting environment, without reference to the positions of physical lights in the environment. In such embodiments, a directional lighting effect may be utilized. One such example is shown in Fig. 4, in which directional light is incident on a portion 400 of a virtual wall stud fragment. The applied virtual light may have any suitable direction. For example, for pieces that tile horizontally, the light may be perpendicular to the horizontal axis, and for pieces that tile vertically, the light may be perpendicular to the vertical axis. Further, for modular pieces that tile both horizontally and vertically, the light may be perpendicular to both axes. This may help to ensure that the pre-computed shadows and lighting have common lighting characteristics for each fragment, and thus may help to prevent parallax and/or other discontinuities where adjacent fragments meet. In the embodiment of Fig. 4, the directional light is shown applied at an angle of approximately 45 degrees relative to vertical, but it will be understood that this is for the purpose of illustration, and that any other suitable angle may be used.
In other examples, a set of modular virtual structure fragments may be configured for use with a particular lighting feature, such as a single overhead point light, a lamp adjacent to a wall, etc. In such embodiments, any suitable type of virtual light may be used to pre-compute the lighting effects. In any case, after the lighting effects are pre-computed, the computed light maps may be saved at a high level of detail along with the associated modular virtual structure fragments, so that images of virtual structures assembled from the fragments have realistic lighting effects.
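For concreteness, a stripped-down bake step under a single directional light might look like the sketch below, which simply evaluates Lambertian shading per light-map texel. A real global-illumination bake would also gather indirect bounces and shadows, which are omitted here, and all names are illustrative assumptions.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical offline bake of a fragment's light map under one directional
// light. 'normals' holds one unit normal (xyz) per light-map texel; the result
// is a scalar irradiance per texel. Indirect lighting and shadowing, which a
// full global-illumination bake would include, are intentionally omitted.
std::vector<float> BakeDirectionalLightMap(const std::vector<float>& normals,
                                           float lx, float ly, float lz,
                                           float intensity) {
    // Normalize the light direction (pointing from the surface toward the light).
    float len = std::sqrt(lx * lx + ly * ly + lz * lz);
    lx /= len; ly /= len; lz /= len;

    std::vector<float> texels(normals.size() / 3);
    for (std::size_t i = 0; i < texels.size(); ++i) {
        float ndotl = normals[3 * i] * lx + normals[3 * i + 1] * ly + normals[3 * i + 2] * lz;
        texels[i] = intensity * std::max(ndotl, 0.0f); // Lambert's cosine law
    }
    return texels;
}
```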
Any suitable type of lighting information may be stored for the modular virtual structure fragments. For example, pre-computed lighting effects may be stored as light maps, cube maps, spherical harmonics (e.g. pre-computed radiance transfer functions), and/or in any other suitable form. The use of pre-computed radiance transfer functions may allow realistic lighting and shadows to be generated on virtual objects, for example, based upon a detected position of a physical light in the use environment, by applying a virtual point light at the location of the physical light in the physical environment, as illustrated by the virtual point light 500 in Fig. 5A. Fig. 5B shows an example in which the appearance of the wall stud portion shown in Fig. 4 is modified based upon the virtual point light of Fig. 5A. Further, procedural or dynamic lighting may likewise be applied in real time (e.g. light caused by dynamic virtual objects displayed in the augmented reality image).
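As a hedged illustration of how stored spherical-harmonic transfer data could be re-lit at run time, the sketch below projects a single point light into low-order SH and dots it against per-vertex transfer coefficients. The coefficient layout and the loose normalization are simplifying assumptions, not the patent's specification.

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// First-order (4-coefficient) spherical harmonic basis evaluated at a unit direction.
std::array<float, 4> ShBasis(float x, float y, float z) {
    return { 0.282095f,            // Y_0,0
             0.488603f * y,        // Y_1,-1
             0.488603f * z,        // Y_1,0
             0.488603f * x };      // Y_1,1
}

// Re-light vertices carrying pre-computed 4-coefficient transfer vectors with a
// point light placed at the detected physical light position. 'positions' holds
// xyz per vertex, 'transfer' holds 4 floats per vertex. Returns one scalar
// radiance value per vertex. Normalization constants are deliberately loose.
std::vector<float> RelightWithPointLight(const std::vector<float>& positions,
                                         const std::vector<float>& transfer,
                                         float lightX, float lightY, float lightZ,
                                         float intensity) {
    std::vector<float> result(positions.size() / 3, 0.0f);
    for (std::size_t v = 0; v < result.size(); ++v) {
        float dx = lightX - positions[3 * v];
        float dy = lightY - positions[3 * v + 1];
        float dz = lightZ - positions[3 * v + 2];
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz) + 1e-6f;
        auto sh = ShBasis(dx / dist, dy / dist, dz / dist);
        float sum = 0.0f;
        for (int i = 0; i < 4; ++i)
            sum += sh[i] * transfer[4 * v + i];       // dot(light SH, transfer SH)
        result[v] = intensity * sum / (dist * dist);  // simple distance falloff
    }
    return result;
}
```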
Local physical lighting features may likewise be used to adjust the appearance of the modular virtual structure fragments in other manners. For example, the pre-computed lighting effects for the modular virtual structure fragments may be computed based upon the application of white light. Then, when virtual imagery is created for a particular physical environment, color characteristics of the physical lighting in the physical environment may be determined by analyzing the image data acquired by the see-through display device, and the determined color characteristics (e.g. hue, saturation, reflectivity) may be applied to the virtual lighting effects so that the virtual lighting more closely matches the local physical lighting. In this manner, displayed instances of pre-lit virtual wall/ceiling fragments, pre-lit virtual furniture, and any other suitable pre-lit virtual objects may more closely match the appearance of the physical environment.
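One way such a color adjustment could work is sketched below: an average color of the camera image is taken as a crude estimate of the ambient light color and used to tint the white-light bake. The averaging heuristic and the multiplicative tint are assumptions made only for illustration.

```cpp
#include <cstddef>
#include <vector>

// Estimate an ambient light tint as the mean RGB of the camera image
// (values in [0,1], stored as RGB triples), then modulate the white-light
// light map by that tint. A production system would use a more robust
// estimator (e.g. sampling known-white surfaces), but the principle is the same.
void TintLightMapByCameraColor(const std::vector<float>& cameraRgb,
                               std::vector<float>& lightMapRgb) {
    float tint[3] = {0.0f, 0.0f, 0.0f};
    std::size_t pixels = cameraRgb.size() / 3;
    for (std::size_t p = 0; p < pixels; ++p)
        for (int c = 0; c < 3; ++c)
            tint[c] += cameraRgb[3 * p + c];
    for (int c = 0; c < 3; ++c)
        tint[c] = pixels ? tint[c] / static_cast<float>(pixels) : 1.0f;

    for (std::size_t t = 0; t < lightMapRgb.size() / 3; ++t)
        for (int c = 0; c < 3; ++c)
            lightMapRgb[3 * t + c] *= tint[c];  // modulate baked white light by estimated color
}
```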
Fig. 6 shows an example embodiment of a method 600 for constructing an augmented reality environment by fitting modular virtual structure fragments to detected physical structures. Method 600 comprises, at 602, receiving image data capturing an image of a local environment of a see-through display device. The image data may comprise any suitable data, including but not limited to depth images 604 and/or two-dimensional images, and may be received from image sensors on the see-through display device or external to the see-through display device. The depth image data may be received from any suitable depth imaging system, including but not limited to stereo imaging systems, time-of-flight imaging systems, and structured light imaging systems.
Method 600 next comprises, at 606, identifying physical features of the local environment from the image data. The physical features may be identified in any suitable manner. For example, in some embodiments, a mesh representation of the physical environment is determined from the depth image data, and a mesh analysis 608 is performed to identify, at 610, major surfaces in the physical environment. Examples include, but are not limited to, walls 612, ceilings 614, and features such as doors, windows, skylights, pillars, and other protrusions from or openings in the walls and ceilings of a room. Further, open spaces in the geometry may be identified, for example, to allow a desired virtual structure to be fitted to the identified open space.
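A toy version of such a mesh analysis is sketched below: triangles of the reconstructed mesh are bucketed into floor, ceiling, or wall candidates by the orientation of their normals. The thresholds, and the omission of any plane-fitting or clustering step, are simplifications assumed only for illustration.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

enum class SurfaceKind { Floor, Ceiling, Wall, Other };

// Classify each triangle of a reconstructed environment mesh by its unit normal
// (xyz per triangle, with +y as "up"). Near-vertical normals suggest floor or
// ceiling; near-horizontal normals suggest walls. Real systems would then fit
// planes and cluster triangles into coherent surfaces.
std::vector<SurfaceKind> ClassifyTriangles(const std::vector<float>& triangleNormals) {
    std::vector<SurfaceKind> kinds(triangleNormals.size() / 3, SurfaceKind::Other);
    for (std::size_t i = 0; i < kinds.size(); ++i) {
        float ny = triangleNormals[3 * i + 1];
        if (ny > 0.9f)
            kinds[i] = SurfaceKind::Floor;        // normal points up
        else if (ny < -0.9f)
            kinds[i] = SurfaceKind::Ceiling;      // normal points down
        else if (std::fabs(ny) < 0.2f)
            kinds[i] = SurfaceKind::Wall;         // normal roughly horizontal
    }
    return kinds;
}
```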
Method 600 may likewise comprise, at 616, identifying one or more local lighting features of the physical environment. Examples of local lighting features include, but are not limited to, color characteristics 618 and positions of local light sources 620.
Method 600 further comprises, at 622, constructing an augmented reality image that includes a virtual structural feature for display over the detected physical feature, spatially registered with the physical feature. As mentioned above and as indicated at 624, the virtual structure may be constructed by arranging a plurality of modular virtual structure fragments, each of which includes a pre-computed lighting effect. The virtual structure fragments may be arranged in any suitable manner, including but not limited to rotating, scaling, deforming, and trimming the pieces to fit the physical geometry of interest. Likewise, the modular virtual structure fragments may include any suitable pre-computed information regarding the pre-computed lighting effects, examples of which include but are not limited to light maps 626 and/or radiance transfer functions 628. Further, as described above, when selecting and arranging the modular virtual structure fragments, connection constraints restricting the set of other modular virtual structure fragments to which a selected modular virtual structure fragment may be connected may be applied, at 630, to help ensure that complementary features of adjacent fragments are properly connected.
Additionally, as mentioned above, local lighting features may be used in constructing the augmented reality image. For example, as indicated at 632, in some embodiments the appearance of the modular virtual structure fragments may be adjusted based upon the local lighting features. The appearance may be adjusted in any suitable manner. For example, as indicated at 634, a color of the local lighting environment may be applied to the pre-computed lighting effects. Likewise, as indicated at 636, a virtual light source, such as a virtual point light, may be applied at the position of a physical light source in the environment.
In other embodiments, rather than modifying the appearance of the modular virtual structure fragments, multiple different sets of modular virtual structure fragments having different lighting characteristics may be available. For example, one set of modular virtual structure fragments may include pre-computed lighting effects corresponding to an overhead point light, while another set may include pre-computed lighting effects corresponding to directional light from a side window. In such an example, as indicated at 638, the local lighting features may be used to select a set of modular virtual structure fragments with corresponding lighting characteristics, such that the resulting virtual structure has lighting characteristics similar to those of the physical light in the environment.
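A simple selection heuristic along these lines is sketched below; it scores each candidate fragment set by how closely its baked light direction matches a detected dominant light direction, which is an assumed criterion used only to illustrate the idea.

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical description of one pre-baked fragment set: the dominant light
// direction (unit vector) assumed when its lighting was pre-computed.
struct FragmentSetInfo {
    std::string name;          // e.g. "overhead_point", "side_window_directional"
    float bakedLightDir[3];
};

// Pick the fragment set whose baked light direction best matches the detected
// dominant light direction in the physical environment (both unit vectors).
std::size_t SelectFragmentSet(const std::vector<FragmentSetInfo>& sets,
                              const float detectedDir[3]) {
    std::size_t best = 0;
    float bestDot = -2.0f;
    for (std::size_t i = 0; i < sets.size(); ++i) {
        float d = sets[i].bakedLightDir[0] * detectedDir[0] +
                  sets[i].bakedLightDir[1] * detectedDir[1] +
                  sets[i].bakedLightDir[2] * detectedDir[2];
        if (d > bestDot) { bestDot = d; best = i; }
    }
    return best;
}
```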
Once the augmented reality image has been constructed, method 600 comprises outputting the augmented reality image to the see-through display device, as indicated at 640. Sensor data from the see-through display device (e.g. from internal or external image sensors) may be used to detect the user's eye position and gaze direction, and likewise to detect physical objects in the user's field of view, so that the virtual structure is displayed over the corresponding physical feature, spatially registered with the physical feature, to present the user with an augmented reality view of the physical environment.
As mentioned above, the methods described above may be performed via any suitable display device. Examples include, but are not limited to, see-through display devices, such as the head-mounted see-through display device 104 of Fig. 1, and other display devices having one or more image sensors, such as smart phones and notebook computers. Fig. 7 shows a block diagram of an example configuration of the see-through display device 104.
The see-through display device 104 may include one or more lenses 702 that form a part of a near-eye see-through display subsystem 704. The see-through display device 104 may further include one or more outward-facing image sensors 706 configured to acquire images of the background scene being viewed by the user, and may include one or more microphones 708 configured to detect sounds, such as voice commands from the user. The outward-facing image sensors 706 may include one or more depth sensors (including but not limited to stereo depth imaging arrangements) and/or one or more two-dimensional image sensors.
The see-through display device 104 further comprises a gaze detection subsystem 710 configured to detect a gaze direction of each eye of the user, as described above. The gaze detection subsystem 710 may be configured to determine the gaze direction of each of the user's eyes in any suitable manner. For example, in the depicted embodiment, the gaze detection subsystem 710 comprises one or more glint sources 712, such as infrared light sources, configured to cause a glint of light to reflect from the cornea of each eye of the user, and one or more image sensors 714 configured to capture images of one or more eyes of the user. Glints and pupil images determined from the image data gathered via the image sensors 714 may be used to determine an optical axis of each eye. It will be understood that the gaze detection subsystem 710 may have any suitable number and arrangement of light sources and image sensors.
The see-through display device 104 may further comprise additional sensors. For example, the see-through display device 104 may comprise a global positioning (GPS) subsystem 716 to allow a location of the see-through display device 104 to be determined.
The see-through display device 104 may further include one or more motion sensors 718 to detect movements of the user's head while the user is wearing the see-through display device 104. Motion data may be used, for example, for image stabilization, to help correct for blur in images from the outward-facing image sensors 706. Likewise, the motion sensors 718, as well as the microphones 708 and the gaze detection subsystem 710, may also be employed as user input devices, such that the user may interact with the see-through display subsystem 704 via gestures of the eyes, neck and/or head, as well as via verbal commands. It will be understood that the sensors illustrated in Fig. 7 are shown for the purpose of illustration and are not intended to be limiting in any manner, as any other suitable sensors and/or combination of sensors may be utilized.
The see-through display device 104 further comprises a computing device 720 having a logic subsystem 722 and a storage subsystem 724 in communication with the sensors, the gaze detection subsystem 710, and the see-through display subsystem 704. The storage subsystem 724 comprises instructions stored thereon that are executable by the logic subsystem 722, for example, to receive image data from the outward-facing image sensors 706 capturing images of the local environment of the see-through display device, and to identify physical features of the local environment from the image data. The instructions may also be executable to construct an augmented reality image of a virtual structure by arranging a plurality of modular virtual structure fragments in adjacent locations, each modular virtual structure fragment including a pre-computed global lighting effect, and to display the augmented reality image over the physical feature, spatially registered with the physical feature from the viewpoint of the user. The instructions may also be executable to detect local lighting features, to adjust the augmented reality image based upon the local lighting features, and to display the augmented reality image over the physical feature via the see-through display subsystem 704, spatially registered with the physical feature.
Further information regarding example hardware for the logic subsystem 722, the storage subsystem 724, and other components mentioned above is described below with reference to Fig. 8.
It will be appreciated that the depicted see-through display device 104 is provided by way of example, and thus is not meant to be limiting. It is to be understood that the display device may include additional and/or alternative sensors, cameras, microphones, input devices, output devices, etc. than those shown, without departing from the scope of this disclosure. The physical configuration of a display device and its various sensors and subcomponents may take a variety of different forms without departing from the scope of this disclosure.
Furthermore, it is to be appreciated that a computing system configured to display augmented reality imagery via a see-through display device may take any suitable form other than a head-mounted display device, including but not limited to a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, gaming device, mobile computing device, mobile communication device (e.g. smart phone), other wearable computer, etc. It will further be understood that the methods and processes described above may be implemented as a computer application program or service, an application programming interface (API), a library, and/or other computer program product.
Fig. 8 shows a non-limiting embodiment of a computing system 800 that can perform one or more of the methods and processes described above. The computing system 800 is shown in simplified form, and may represent any suitable device and/or combination of devices described above, including but not limited to those described hereinabove with reference to Figs. 1-8.
The computing system 800 includes a logic subsystem 802 and a storage subsystem 804. The computing system 800 may optionally include a display subsystem 806, an input device subsystem 808, a communication subsystem 810, and/or other components not shown in Fig. 8. The computing system 800 may also optionally include or interface with one or more user input devices, such as the eye tracking system described above, as well as, for example, a keyboard, mouse, game controller, camera (depth and/or two-dimensional), microphone, and/or touch screen. Such user input devices may form part of the input device subsystem 808 or may interface with the input device subsystem 808.
The logic subsystem 802 includes one or more physical devices configured to execute instructions. For example, the logic subsystem may be configured to execute machine-readable instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
The logic subsystem 802 may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem 802 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem 802 may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel, or distributed processing. The logic subsystem 802 may optionally include individual components that are distributed among two or more devices, which may be located remotely and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
The storage subsystem 804 includes one or more physical, non-transitory computer-readable storage devices configured to hold data and/or instructions executable by the logic subsystem to implement the methods and processes described herein. When such methods and processes are implemented, the state of the storage subsystem 804 may be transformed, for example, to hold different data.
The storage subsystem 804 may include removable media and/or built-in devices. The storage subsystem 804 may include optical memory devices (e.g. CD, DVD, HD-DVD, Blu-ray Disc, etc.), semiconductor memory devices (e.g. RAM, EPROM, EEPROM, etc.), and/or magnetic memory devices (e.g. hard disk drives, floppy disk drives, tape drives, MRAM, etc.), among others. The storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. In some embodiments, the logic subsystem 802 and the storage subsystem 804 may be integrated into one or more unitary devices, such as an application-specific integrated circuit (ASIC) or a system-on-a-chip.
It will be appreciated that the storage subsystem 804 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g. an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
The term "program" may be used to describe an aspect of the computing system 800 implemented to perform a particular function. In some cases, a program may be instantiated via the logic subsystem 802 executing instructions held by the storage subsystem 804. It will be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term "program" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
It will be appreciated that a "service," as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server computing devices.
When included, the display subsystem 806 may be used to present a visual representation of data held by the storage subsystem 804. This visual representation may take the form of a graphical user interface (GUI). As the herein-described methods and processes change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of the display subsystem 806 may likewise be transformed to visually represent changes in the underlying data. The display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with the logic subsystem 802 and/or the storage subsystem 804 in a shared enclosure, or such display devices may be peripheral display devices.
When included, the communication subsystem 810 may be configured to communicatively couple the computing system 800 with one or more other computing devices. The communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow the computing system 800 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (9)

1. A method (600) of displaying, on a display device, an augmented reality image including a lighting effect, the method (600) comprising:
receiving (602) image data, the image data capturing an image of a local environment of the display device;
identifying (606) a physical feature of the local environment from the image data;
constructing (622) an augmented reality image of a virtual structure for display over the physical feature, spatially registered with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure fragments arranged in adjacent locations to form a virtual structural feature, each modular virtual structure fragment comprising a pre-computed global lighting effect;
identifying a lighting feature of the local environment from the image data;
adjusting an appearance of the plurality of modular virtual structure fragments based upon the lighting feature of the local environment; and
outputting (640) the augmented reality image to the display device.
2. The method of claim 1, wherein identifying the physical feature of the local environment comprises performing a mesh analysis of the local environment.
3. The method of claim 1, wherein the physical feature comprises one or more of a wall and a ceiling.
4. The method of claim 1, wherein the physical feature comprises a non-structural object in the local environment.
5. The method of claim 1, wherein the physical feature comprises an empty space in the local environment.
6. The method of claim 1, wherein the pre-computed global lighting effect comprises a pre-computed directional lighting effect.
7. The method of claim 1, wherein the pre-computed global lighting effect comprises a pre-computed radiance transfer function.
8. The method of claim 1, wherein the lighting feature comprises a color characteristic of the local environment, and wherein adjusting the appearance of the plurality of modular virtual structure fragments comprises applying the color characteristic to the modular virtual structure fragments.
9. The method of claim 1, wherein the lighting feature comprises a position of a physical light in the local environment, and wherein adjusting the appearance of the plurality of modular virtual structure fragments comprises computing a lighting effect caused by a virtual point light applied at the position of the physical light.
CN201310757195.XA 2013-12-18 2013-12-18 Method for constructing an augmented reality environment using pre-computed lighting Active CN103761763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310757195.XA CN103761763B (en) 2013-12-18 2013-12-18 For the method using precalculated illumination to build augmented reality environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310757195.XA CN103761763B (en) 2013-12-18 2013-12-18 For the method using precalculated illumination to build augmented reality environment

Publications (2)

Publication Number Publication Date
CN103761763A CN103761763A (en) 2014-04-30
CN103761763B true CN103761763B (en) 2017-01-04

Family

ID=50528996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310757195.XA Active CN103761763B (en) 2013-12-18 2013-12-18 For the method using precalculated illumination to build augmented reality environment

Country Status (1)

Country Link
CN (1) CN103761763B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9652897B2 (en) * 2015-06-25 2017-05-16 Microsoft Technology Licensing, Llc Color fill in an augmented reality environment
EP3533504B1 (en) * 2016-11-14 2023-04-26 Huawei Technologies Co., Ltd. Image rendering method and vr device
US11521253B2 (en) * 2017-02-03 2022-12-06 Columbia Insurance Company Autonomous system to assist consumers to select colors
CN109427101A (en) * 2017-08-29 2019-03-05 深圳市掌网科技股份有限公司 A kind of method and system obtaining augmented reality image
CN111448568B (en) * 2017-09-29 2023-11-14 苹果公司 Environment-based application presentation
US10922878B2 (en) * 2017-10-04 2021-02-16 Google Llc Lighting for inserted content
CN108830923B (en) * 2018-06-08 2022-06-17 网易(杭州)网络有限公司 Image rendering method and device and storage medium
CN111176452B (en) * 2019-12-30 2022-03-25 联想(北京)有限公司 Method and apparatus for determining display area, computer system, and readable storage medium
CN113534952A (en) * 2020-08-06 2021-10-22 黄得锋 AR system construction method and application

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6389375B1 (en) * 1999-01-22 2002-05-14 Interlego Ag Virtual reality modelling
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080071559A1 (en) * 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
US8558837B2 (en) * 2010-01-18 2013-10-15 Disney Enterprises, Inc. Modular radiance transfer

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6389375B1 (en) * 1999-01-22 2002-05-14 Interlego Ag Virtual reality modelling
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A stereoscopic video see-through augmented reality system based on real-time vision-based registration; Kanbara M et al.; Proceedings of IEEE Virtual Reality 2000; 2000-03-18; Abstract, Section 3, Figures 1, 7-8 *

Also Published As

Publication number Publication date
CN103761763A (en) 2014-04-30

Similar Documents

Publication Publication Date Title
CN103761763B (en) For the method using precalculated illumination to build augmented reality environment
US10803670B2 (en) Constructing augmented reality environment with pre-computed lighting
CN108369457B (en) Reality mixer for mixed reality
CN102540464B (en) Head-mounted display device which provides surround video
CN102419631B (en) Fusing virtual content into real content
Scarfe et al. Using high-fidelity virtual reality to study perception in freely moving observers
US9429912B2 (en) Mixed reality holographic object development
CN103149689B (en) The reality virtual monitor expanded
TWI567659B (en) Theme-based augmentation of photorepresentative view
CN105050670B (en) Mixed reality experience is shared
CN105981076B (en) Synthesize the construction of augmented reality environment
CN103761085B (en) Mixed reality holographic object is developed
EP2887322B1 (en) Mixed reality holographic object development
JP2020530157A (en) Video generation method and equipment
KR20140014160A (en) Immersive display experience
CN105264478A (en) Hologram anchoring and dynamic positioning
CN110517355A (en) Environment for illuminating mixed reality object synthesizes
CN107810634A (en) Display for three-dimensional augmented reality
KR102197504B1 (en) Constructing augmented reality environment with pre-computed lighting
JP6272687B2 (en) Construction of augmented reality environment with pre-calculated lighting
JP6875029B1 (en) Method, program, information processing device
ES2715023T3 (en) Construction of augmented reality environments with precalculated lighting
US20240127538A1 (en) Scene understanding using occupancy grids
Hamadouche Augmented reality X-ray vision on optical see-through head mounted displays
Morana Impact of Imaging and Distance Perception in VR Immersive Visual Experience

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150727

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150727

Address after: Washington State

Applicant after: Microsoft Technology Licensing, LLC

Address before: Washington State

Applicant before: Microsoft Corp.

C14 Grant of patent or utility model
GR01 Patent grant