CN103760972A - Cross-platform augmented reality experience - Google Patents

Cross-platform augmented reality experience

Info

Publication number
CN103760972A
Authority
CN
China
Prior art keywords
user
player gaming
gaming session
augmented reality
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310757229.5A
Other languages
Chinese (zh)
Other versions
CN103760972B (en)
Inventor
S. Latta
D. McCulloch
J. Scott
K. Geisner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to CN201310757229.5A priority Critical patent/CN103760972B/en
Publication of CN103760972A publication Critical patent/CN103760972A/en
Application granted granted Critical
Publication of CN103760972B publication Critical patent/CN103760972B/en
Expired - Fee Related

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a cross-platform augmented reality experience. Multiple game sessions are hosted on a server system. A first computing device of a first user is added to a first multiplayer gaming session, the first computing device comprising a see-through display. Augmentation information is sent to the first computing device for the first multiplayer gaming session so as to provide an augmented reality experience to the first user. A second computing device of a second user is added to the first multiplayer gaming session. Experience information is sent to the second computing device for the first multiplayer gaming session so as to provide a cross-platform representation of the augmented reality experience to the second user.

Description

Cross-platform augmented reality experience
Technical field
The present invention relates to providing cross-platform, online augmented reality experiences in computing systems.
Background technology
Massively multiplayer online games are usually configured to operate on a single platform. Typically, a user participates in a massively multiplayer online game by selecting a server and watching a virtual representation of the game on a fixed display such as a high-definition television.
Summary of the invention
Embodiments disclosed herein provide cross-platform, online augmented reality experiences in computing systems. For example, a server system can host multiple multiplayer gaming sessions and add computing devices to a multiplayer gaming session. The server system can provide augmentation information to a see-through display device, and/or provide experience information to a computing system operating on a platform different from the see-through display device. The see-through display device can use the augmentation information to provide an augmented reality experience. The computing system operating on the different platform can use the experience information to provide a cross-platform representation of the augmented reality experience.
This Summary is provided to introduce, in simplified form, a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Brief description of the drawings
Fig. 1 schematically shows an example communication diagram of an augmented reality massively multiplayer online gaming system according to an embodiment of the disclosure.
Fig. 2 shows a method of hosting multiple multiplayer gaming sessions according to an embodiment of the disclosure.
Fig. 3 shows an unaltered view of a physical environment including multiplayer gaming session participants according to an embodiment of the disclosure.
Fig. 4 shows an example first-person view of an augmented reality experience of the physical environment depicted in Fig. 3.
Fig. 5 shows an example cross-platform representation of the augmented reality experience depicted in Fig. 4.
Fig. 6 shows a method of hosting multiple multiplayer gaming sessions according to an embodiment of the disclosure.
Fig. 7 schematically shows an example head-mounted display according to an embodiment of the disclosure.
Fig. 8 shows an example computing system according to an embodiment of the disclosure.
Detailed description
Massively multiplayer online games are conventionally implemented as two-dimensional or three-dimensional virtual environments on a single platform, such as a game console, personal computer, or mobile computing device. Previous cross-platform massively multiplayer online games have only utilized platforms with the same type of game display (e.g., two-dimensional or three-dimensional graphics and/or animation rendered and displayed on a traditional display such as a television, computer monitor, and/or mobile phone screen). Accordingly, a player can only participate in a game together with other players whose platforms utilize the same type of display, thereby limiting the number and variety of players who can join the multiplayer online game. The disclosed embodiments are therefore directed to cross-platform, massively multiplayer online experiences that allow users of augmented reality, see-through displays to participate in an augmented reality game. For example, as described in more detail below, a user of a see-through display device may participate in an augmented reality experience, while a user of a computing device operating on a different platform may participate in a cross-platform representation of the augmented reality experience. The cross-platform representation can thus bring the appearance of the augmented reality experience to the user of a computing device that typically could not provide that augmented reality experience.
Fig. 1 schematically shows an example communication diagram of a cross-platform, augmented reality massively multiplayer online gaming system 100 that includes multiple multiplayer gaming sessions 101 hosted by a game server 102. The cross-platform, augmented reality massively multiplayer online gaming system 100 can accommodate multiple cross-platform computing devices, for example, see-through display devices, gaming systems connected to fixed display devices, personal computing devices connected to external display devices, and mobile computing devices such as laptop computers, smart phones, tablet computers, etc. As shown in Fig. 1, the augmented reality massively multiplayer online gaming system 100 can include see-through display devices 104, 106, and 108, as well as a home entertainment system 110 that includes a gaming system 112 connected to a fixed display device 114. Each computing device can connect to the game server 102 over a network 116 (e.g., the Internet) to participate in the massively multiplayer online game.
Turning now to Fig. 2, a method 200 of hosting multiple multiplayer gaming sessions is shown. For example, the game server 102 of Fig. 1 may execute the steps of method 200 to host multiple cross-platform multiplayer gaming sessions. At 202, the game server 102 adds a first computing device to a first multiplayer gaming session. In some embodiments, as shown at 204, the first computing device can comprise a see-through display. For example, the first computing device can correspond to the see-through display device 104 of Fig. 1.
At 206, the game server sends augmentation information to the first computing device. In some embodiments, as shown at 208, the augmentation information can correspond to a virtual environment of the multiplayer gaming session. The augmentation information can include image data corresponding to the virtual environment or gamespace of the multiplayer gaming session, and positional information for placing the image data. In some embodiments, positional information can be provided relative to other image data. For example, an augmentation for a tree may be described relative to an augmentation for a building. In some embodiments, positional information can be provided relative to a physical location. For example, an augmentation for a tree may be described relative to the position of the tree in the physical world. The see-through display device can display the augmentation at a particular location on the see-through display based on a known or detected position of the tree in the physical world. Accordingly, the see-through display of the first computing device can use the augmentation information to present an augmented reality experience by displaying augmentations for the multiplayer gaming session over the physical environment viewable through the see-through display. An illustration of an example augmented reality experience is shown in Figs. 3 and 4.
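The two anchoring modes described above — positional information relative to the physical world versus relative to another augmentation — can be sketched in Python as follows. This is an illustrative sketch only, not the patent's implementation; all names (`Augmentation`, `resolve_position`, the example image identifiers) are assumptions introduced here.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Augmentation:
    """One piece of augmentation information: image data plus placement."""
    image_id: str
    # Either an absolute physical-world position...
    world_pos: Optional[Tuple[float, float, float]] = None
    # ...or a position relative to another augmentation (e.g. a tree
    # augmentation described relative to a building augmentation).
    relative_to: Optional[str] = None
    offset: Tuple[float, float, float] = (0.0, 0.0, 0.0)

def resolve_position(aug, catalog):
    """Resolve an augmentation to an absolute position so the see-through
    display can draw it at the right spot over the physical environment."""
    if aug.world_pos is not None:
        return aug.world_pos
    base = resolve_position(catalog[aug.relative_to], catalog)
    return tuple(b + o for b, o in zip(base, aug.offset))

# A building anchored in the physical world, and a tree placed relative to it.
building = Augmentation("castle_tower", world_pos=(10.0, 0.0, 5.0))
tree = Augmentation("magic_tree", relative_to="castle_tower",
                    offset=(2.0, 0.0, -1.0))
catalog = {"castle_tower": building, "magic_tree": tree}
print(resolve_position(tree, catalog))  # (12.0, 0.0, 4.0)
```

Chaining relative placements through a single resolution step like this keeps world-anchored and augmentation-relative positions interchangeable from the display's point of view.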
Turning first to Fig. 3, an example unaltered view of a physical environment 300 is shown. The physical environment 300 can include multiple real-world objects 302. Real-world objects can include any object present in the real world that can be virtualized, including but not limited to trees, buildings, landmarks, people, vehicles, etc. The physical environment 300 can also include multiple computing devices participating in a multiplayer gaming session, such as see-through display devices 304 and 306 controlled by a first user 308 and a second user 310, respectively. For example, the see-through display devices 304 and 306 can correspond to the see-through display devices 104 and 106 in Fig. 1, and the physical environment 300 can correspond to location 1 in Fig. 1.
Fig. 4 shows an example augmented reality experience for the multiplayer gaming session. An augmented view 400 of the physical environment 300 is depicted as viewed through the see-through display device 304. The augmented view 400 can provide an augmented reality experience to the user 308 by presenting altered appearances of one or more real-world objects when those objects are viewed through the see-through display. For example, the see-through display device 304 can display virtual objects 402 as augmentations of the real-world objects 302 to provide an augmented reality environment corresponding to the multiplayer gaming session.
The appearance of the user 310 can also be augmented to provide an augmented reality version of a character controlled by the user 310. For example, the see-through display device 304 can display virtual character-based objects 404, such as armor and weapons, at positions corresponding to relevant portions of the user 310. As depicted, a virtual sword 404a can be displayed at a position corresponding to the right hand of the user 310. Character-based items can include any suitable virtual item that is associated with a character and positioned on the see-through display to augment the user's appearance.
As shown at 210 of Fig. 2, character information 406 can also be displayed near the position of a character presented in the augmented view. Character information can include virtually any information related to the character and/or user, such as character name, user name, status, equipment, contact information, etc. Character information can be displayed in any suitable form, such as text, icons, images, etc. In some embodiments, virtual items belonging to a user's character, such as a pet or companion, can be displayed near the user's position. For example, character information 406a, comprising a user name and character name corresponding to the user 310, is shown as a pop-up box above the user 310.
An augmentation of a real-world object can be configured to hide the associated real-world object. In some embodiments, this can be achieved by masking the object with one or more images depicting its background. For example, a tree that appears in front of a portion of sky when viewed through the see-through display can be augmented by displaying an image of the sky at the location on the see-through display corresponding to the tree. In this example, the augmented tree appears as sky when viewed through the see-through display, creating the illusion that the tree has disappeared.
The augmented view of the physical environment can also include virtual objects that do not directly correspond to real-world objects in the physical environment. For example, a virtual character 408 can be displayed to represent a user of a remotely located computing device participating in the multiplayer gaming session. Character information 406 can also be displayed for the virtual character of the remote user. For example, character information 406b is shown as a pop-up box comprising the user name and character name corresponding to the user represented by the character 408.
Returning to Fig. 2, at 212 a second computing device is added to the first multiplayer gaming session. In some embodiments, the second computing device can operate on a platform different from the first computing device and/or can be remotely located. At 214, the game server sends experience information to the second computing device. As shown at 216, the experience information can provide a cross-platform representation of the augmented reality experience provided to the first computing device. Fig. 5 shows an example cross-platform representation of the augmented reality experience depicted in Fig. 4.
An indoor environment 500 includes a gaming system 502 connected to a fixed external display device 504. For example, the gaming system 502 and the display device 504 can correspond to the gaming system 112 and the display device 114 of Fig. 1. The gaming system 502 can be joined to a multiplayer gaming session based on any suitable criterion. In some embodiments, the gaming system 502 can be joined to a multiplayer gaming session in response to the user 506 deciding to participate and/or selecting the multiplayer gaming session. In other embodiments, the gaming system 502 can be joined to a multiplayer gaming session automatically based on criteria such as location, number of sessions, technical specifications of the gaming system, preference settings, and/or any other suitable criteria.
The multiplayer gaming session can be displayed on the display device 504 by presenting to the user 506 an experience 508 that serves as a cross-platform representation of the augmented reality experience. For example, the user 506 can see a virtual representation of the gamespace in which one or more other players with see-through displays are participating in the multiplayer gaming session. Fig. 5 depicts a first-person mode, in which the display device 504 shows virtual representations of the users 308 and 310 from the first-person perspective of the virtual character 408 shown in Fig. 4. In some embodiments, a third-person perspective can be used. For example, the viewing angle can be changed, by selection or automatically and dynamically, to display the experience 508 in a third-person mode and/or from any other suitable perspective.
The experience 508 can be a cross-platform representation of the augmented reality experience configured to be presented visually via the display device 504. For example, the experience 508 can be presented in response to receiving experience information from a game server, such as the game server 102. As shown at 218 of Fig. 2, the experience information can include aspects of the physical environment corresponding to the multiplayer gaming session. The experience information can additionally or alternatively include a representation of the augmentation information corresponding to the virtual environment of the multiplayer gaming session, or information about the augmentations.
Aspects of the physical environment can be provided in any suitable manner and can include any information about the physical environment, such as topological features, depth information of objects in the physical environment, landmark locations, transit routes, time of day, weather conditions, etc. Aspects of the physical environment, and/or elements related to those aspects, can be incorporated into the multiplayer gaming session to configure, alter, and/or improve gameplay. For example, gameplay at night can differ from gameplay during the day. In some embodiments, the game server can store aspects of the physical environment, receive information about the physical environment from third-party databases, and/or receive information from participants in the multiplayer gaming session. For example, a see-through display device can detect aspects of the physical environment via one or more sensors of the see-through display device and send the detected aspects to the server and/or other computing devices participating in the multiplayer gaming session.
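One way the server might combine aspect reports from several sources and derive gameplay differences (such as night play differing from day play) is sketched below. This is a hypothetical illustration; the function and field names (`merge_reports`, `gameplay_modifiers`, `"hour"`, `"weather"`) are assumptions, not part of the disclosure.

```python
def merge_reports(reports):
    """Combine physical-environment aspect reports from the server's store,
    third-party databases, and participating see-through devices; later
    reports override earlier ones."""
    merged = {}
    for r in reports:
        merged.update(r)
    return merged

def gameplay_modifiers(aspects):
    """Derive gameplay tweaks from the merged physical-environment aspects."""
    mods = {}
    hour = aspects.get("hour", 12)
    mods["night_mode"] = hour < 6 or hour >= 20  # night play differs from day
    if aspects.get("weather") == "rain":
        mods["visibility"] = "reduced"
    return mods

reports = [
    {"hour": 21, "weather": "clear"},                # server clock / store
    {"weather": "rain", "landmarks": ["fountain"]},  # device sensor report
]
aspects = merge_reports(reports)
print(gameplay_modifiers(aspects))  # {'night_mode': True, 'visibility': 'reduced'}
```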
In some embodiments, the game server can identify information relevant to the entire gamespace of the multiplayer gaming session as well as platform-specific configuration information. The gamespace information can describe the multiplayer gaming session by including information such as physical environment information and corresponding virtual environment information. For example, the game server can send experience information in the form of a two-dimensional or three-dimensional representation of the gamespace to a computing device, such as the gaming system 502, so that it can be displayed on the display device 504. Accordingly, the cross-platform representation of the augmented reality experience can provide an experience corresponding to the multiplayer gaming session to a user whose computing device differs from those of other users in the multiplayer gaming session.
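The core dispatch the server performs — augmentation information for see-through displays versus experience information (a renderable scene) for conventional displays — can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the names (`payload_for`, the `"platform"` field, the payload keys) are assumptions.

```python
def payload_for(device, gamespace):
    """Split shared gamespace state into a platform-appropriate payload:
    augmentation info for see-through displays, experience info (a 2D/3D
    scene representation) for devices with conventional displays."""
    if device["platform"] == "see_through":
        return {
            "kind": "augmentation_info",
            # Overlays anchored to physical-world positions.
            "augmentations": gamespace["augmentations"],
        }
    return {
        "kind": "experience_info",
        "scene": {
            # The physical environment is rendered as a virtual scene.
            "environment": gamespace["physical_aspects"],
            "objects": gamespace["augmentations"],
        },
    }

gamespace = {
    "augmentations": [{"image_id": "castle_tower", "pos": (10, 0, 5)}],
    "physical_aspects": {"terrain": "park"},
}
hmd = {"platform": "see_through"}
console = {"platform": "fixed_display"}
print(payload_for(hmd, gamespace)["kind"])      # augmentation_info
print(payload_for(console, gamespace)["kind"])  # experience_info
```

Both payloads are derived from the same gamespace state, which is what lets users on different platforms share one session.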
The augmented reality experience and the cross-platform representation of the augmented reality experience can allow multiple users of multiple computing devices operating on different platforms to participate in a multiplayer gaming session of a massively multiplayer online game. For example, the gamespace for a multiplayer gaming session can be provided to a see-through display device in a manner different from a gaming system. As discussed above, augmentation information can be provided to a see-through display device, such as the see-through display device 304 in Fig. 3, while experience information can be provided to a computing device operating on a different platform, such as the gaming system 502 in Fig. 5. Accordingly, the augmentation information can be in a different data format, comprise different data structures, have different data sources, and/or differ from the experience information in any suitable manner, so as to configure the experience for presentation on the display device of each respective platform.
The computing devices can also communicate across platforms to provide a social aspect to the game. For example, communication messages can be sent between first and second computing devices, such as the see-through display device 104 and the gaming system 112 of Fig. 1. The communication messages can include text and/or voice from user input at a computing device. In some embodiments, communication messages can be sent directly between computing devices, through the game server, and/or through any third-party computing device. The computing devices and/or the game server can include text-to-speech and/or speech-to-text conversion capabilities. Accordingly, voice output can be derived from text input, and text output can be derived from voice input. For example, the user 308 in Fig. 3 can provide voice input to the see-through display device 304 to communicate with the user 506 of Fig. 5. In response, the see-through display device 304, the game server 102, and/or the gaming system 502 can convert the voice input into text output to be displayed on the display device 504 depicted in Fig. 5. Communication can be provided between any group of computing devices, including private conversations between selected computing devices, conversations across an entire gaming session, conversations across all gaming sessions, etc.
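The modality conversion described above can be sketched as a small routing function. This is a hypothetical sketch: the `"<tts:...>"` string stands in for a real text-to-speech service, and all names (`deliver`, `"prefers"`) are assumptions introduced here.

```python
def deliver(message, recipient):
    """Route a chat message, converting modality to suit the recipient.
    The <tts:...> marker is a stand-in for an actual TTS conversion."""
    text = message["text"]
    if recipient["prefers"] == "voice":
        return {"modality": "voice", "audio": f"<tts:{text}>"}  # text-to-speech
    return {"modality": "text", "text": text}  # displayed on a fixed display

msg = {"sender": "user308", "text": "attack the tower"}
hmd_user = {"prefers": "voice"}
console_user = {"prefers": "text"}
print(deliver(msg, console_user))  # {'modality': 'text', 'text': 'attack the tower'}
```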
A user can provide user input to control an associated user character and/or interact with the experience. For example, a user can provide user input to create, delete, and/or otherwise alter one or more elements of the multiplayer gaming session. User input can be detected by any suitable user input device or combination of input devices, such as a gesture-detection device, microphone, inertial measurement unit, etc. For example, the user 308 of Fig. 3 can point at a target, such as the virtual character 408 depicted in Fig. 4, and perform a gesture corresponding to an attack so that the associated character attacks the target. The gesture can be captured by one or more sensors of the see-through display device 304, such as an image sensor, and matched to a corresponding command so that the character associated with the user 308 performs the desired attack.
Users can also combine user inputs, for example, by pointing at a target and speaking an attack command to perform the corresponding attack on the target. For example, an inertial measurement unit connected to the see-through display device 304 of Fig. 3 can detect a gesture from the user 308 while a microphone of the see-through display device 304 detects a voice command from the user 308. User input can also allow the user to interact with menu items of the experience. For example, the user 308 of Fig. 3 can point at a second user, such as the user 506 of Fig. 5, or at a virtual representation of a remote user, such as the virtual character 408 of Fig. 4, to display additional information about the respective user and/or character.
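Combining a pointing gesture with a spoken command amounts to fusing two input streams into one targeted action. A minimal sketch, under the assumption that inputs arriving within a short time window belong together (all names here are hypothetical):

```python
def fuse_inputs(events, window=1.0):
    """Pair a pointing gesture with a voice command issued within `window`
    seconds of it, yielding a single targeted game action."""
    actions = []
    gestures = [e for e in events if e["kind"] == "point"]
    commands = [e for e in events if e["kind"] == "voice"]
    for g in gestures:
        for c in commands:
            if abs(g["t"] - c["t"]) <= window:
                actions.append({"verb": c["command"], "target": g["target"]})
    return actions

events = [
    {"kind": "point", "t": 4.2, "target": "character_408"},  # IMU/camera gesture
    {"kind": "voice", "t": 4.6, "command": "attack"},        # microphone command
]
print(fuse_inputs(events))  # [{'verb': 'attack', 'target': 'character_408'}]
```

Widening or narrowing the window trades off responsiveness against accidental pairings.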
In some embodiments, the experience can include a map of the multiplayer gaming session as an overlay and/or as an augmentation of a map of the physical environment associated with the multiplayer gaming session. For example, the map can provide a representation of the position of a user of a computing system, dynamically updated to correspond to the current location of the computing system. In other words, real-world travel can be automatically reflected in the experience. The map can show virtually any location of interest in the experience and/or the real world, such as positions of other users, quests, virtual objects, real-world objects, etc. In some embodiments, the map can be customized. For example, the map can be filtered to show only the user's selected friends, enemies, quests, etc. The map can remain viewable after logging out of the multiplayer gaming session and/or the game server. For example, a user can view the map to identify the in-game status of the user's physical location.
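The map filtering mentioned above is a simple selection over marker categories. A hypothetical sketch (the `"kind"` categories and names are assumptions, not from the disclosure):

```python
def filter_map(markers, show=("friend", "quest")):
    """Return only the marker categories the user has chosen to display."""
    return [m for m in markers if m["kind"] in show]

markers = [
    {"kind": "friend", "name": "user310", "pos": (3, 4)},
    {"kind": "enemy", "name": "goblin", "pos": (8, 1)},
    {"kind": "quest", "name": "hidden_chest", "pos": (5, 9)},
]
print([m["name"] for m in filter_map(markers)])  # ['user310', 'hidden_chest']
```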
A multiplayer gaming session can have location-dependent features. For example, a multiplayer gaming session can correspond to one or more particular regions of the real world. In some embodiments, a quest or other in-game event can be tied to a real-world landmark, such that the event is activated when a user in the game is within a proximity threshold of the real-world landmark. An in-game event can also be triggered when the number of users of the multiplayer gaming session at a particular real-world location and/or virtual location exceeds a threshold. In some embodiments, in-game events can be created, deleted, and/or otherwise altered by users. For example, a user can create a virtual object to be placed in the gamespace of the multiplayer gaming session. In response to the creation of the virtual object, another user can receive a quest involving that virtual object. The created virtual object can persist and be regarded as part of the associated quest.
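Both triggers described above — proximity to a landmark and a user-count threshold — can be checked together, as in this sketch. It is illustrative only; the distance units, threshold values, and names are assumptions.

```python
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def event_active(landmark, players, proximity=50.0, min_players=3):
    """An in-game event bound to a real-world landmark activates when at
    least `min_players` players are within the proximity threshold of it."""
    nearby = [p for p in players
              if distance(p["pos"], landmark["pos"]) <= proximity]
    return len(nearby) >= min_players

landmark = {"name": "space_needle", "pos": (0.0, 0.0)}
players = [{"pos": (10.0, 0.0)}, {"pos": (0.0, 30.0)},
           {"pos": (40.0, 20.0)}, {"pos": (500.0, 500.0)}]
print(event_active(landmark, players))  # True: three players within 50 units
```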
Fig. 6 shows an example method 600 of adding two computing devices to multiplayer gaming sessions based on the positions of the two computing devices. At 602, a game server, such as the game server 102 of Fig. 1, can receive the position of a first computing device. As shown at 604, the first computing device can comprise a first see-through display, such as the see-through display device 104 of Fig. 1. The position of the computing device can be determined in any suitable manner. For example, the position of the first computing device can be determined by detecting aspects of the physical environment via one or more sensors of the first computing device.
At 606, the first computing device is added to a first multiplayer gaming session. As shown at 608, the first multiplayer gaming session can be selected based on the position of the first computing device. Virtually any suitable position-related criterion can guide the session selection described at 608. For example, multiple multiplayer gaming sessions can each correspond to a different real-world region.
In some embodiments, a computing device can be placed in the multiplayer gaming session corresponding to the real-world region in which the computing device is located. In additional or alternative embodiments, position can be one of multiple criteria for selecting a multiplayer gaming session for a computing device. For example, the computing device can join the nearest multiplayer gaming session having a number of players above or below a player-count threshold. As another example, the computing device can join the nearest multiplayer gaming session containing one or more friends of the user of the computing device.
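The combined selection criteria above (region, player-count threshold, friends) might be ordered as in this sketch. The prioritization — region first, then friends, then any open slot — is one plausible reading, not specified by the disclosure, and all names are assumptions.

```python
def choose_session(device_region, friends, sessions, max_players=8):
    """Pick a session for a device: restrict to the device's real-world
    region and sessions below the player cap, prefer one containing a
    friend, otherwise take the first open candidate."""
    candidates = [s for s in sessions
                  if s["region"] == device_region
                  and len(s["players"]) < max_players]
    for s in candidates:
        if friends & set(s["players"]):  # a friend is already playing here
            return s["id"]
    return candidates[0]["id"] if candidates else None

sessions = [
    {"id": "seattle-1", "region": "seattle", "players": ["a", "b"]},
    {"id": "seattle-2", "region": "seattle", "players": ["friend_k"]},
    {"id": "portland-1", "region": "portland", "players": []},
]
print(choose_session("seattle", {"friend_k"}, sessions))  # seattle-2
```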
After joining the first multiplayer gaming session, at 610, augmentation information for the first multiplayer gaming session can be sent to the first computing device. This augmentation information can be used by the see-through display to augment the appearance of the physical environment viewed through the see-through display.
At 612, the game server receives the position of a second computing device. As shown at 614, the second computing device can comprise a second see-through display, such as the see-through display device 106 or 108 of Fig. 1. At 616, the game server determines whether the position of the second computing device is within a proximity threshold of the position of the first computing device. For example, the proximity threshold can define a maximum and/or minimum distance between the first and second computing devices. As another example, the proximity threshold can define a game boundary (e.g., the Seattle metropolitan area). In some embodiments, the proximity threshold can be one of multiple criteria for joining the second computing device to a multiplayer gaming session.
If the position of the second computing device is within the proximity threshold of the position of the first computing device, at 618 the game server joins the second computing device to the first multiplayer gaming session. For example, the second computing device can correspond to the see-through display device 106 of Fig. 1. The dashed box representing location 1 can indicate that the see-through display devices 104 and 106 are within the proximity threshold of each other. Accordingly, these devices can be added to the same multiplayer gaming session.
Alternatively, if the position of the second computing device is not within the proximity threshold of the position of the first computing device, at 620 the game server joins the second computing device to a second multiplayer gaming session different from the first multiplayer gaming session. In this example, the second computing device can correspond to the see-through display device 108 at location 2 in Fig. 1. Because location 2 can be considered outside the proximity threshold of the see-through display device 104, the see-through display device 108 can be added to a multiplayer gaming session different from that of the see-through display device 104. For example, the first multiplayer gaming session can correspond to a first location (e.g., Seattle) and include a quest tied to a landmark of the first location (e.g., photographing a particular Seattle landmark). The second multiplayer gaming session can correspond to a second location (e.g., Portland) and include a quest tied to a landmark of the second location (e.g., a treasure hunt for a virtual object hidden near a particular Portland landmark).
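The branch at 616-620 of method 600 can be sketched as a simple distance test. This is an illustrative sketch under assumed units and an assumed threshold; the session-numbering scheme and names are hypothetical.

```python
import math

def assign_session(first, second, threshold=1000.0):
    """Sketch of steps 616-620: join the second device to the first
    device's session if it lies within the proximity threshold,
    otherwise assign it to a different session."""
    dx = first["pos"][0] - second["pos"][0]
    dy = first["pos"][1] - second["pos"][1]
    if math.hypot(dx, dy) <= threshold:
        return first["session"]      # step 618: same session
    return first["session"] + 1      # step 620: a different session

hmd_104 = {"pos": (0.0, 0.0), "session": 1}
hmd_106 = {"pos": (200.0, 300.0)}  # location 1, near device 104
hmd_108 = {"pos": (5000.0, 0.0)}   # location 2, beyond the threshold
print(assign_session(hmd_104, hmd_106))  # 1
print(assign_session(hmd_104, hmd_108))  # 2
```

A production server would of course pick an existing session for the second region rather than inventing a new session number, but the threshold comparison is the heart of the method.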
In some embodiments, if the second computing device moves to a position within the proximity threshold of the position of the first computing device, the second computing device can be added to the first multiplayer gaming session. For example, the second computing device can log out of the game server and/or the second multiplayer gaming session and move to a new physical location within the proximity threshold of the first computing device. Upon logging back into the game server and/or the second multiplayer gaming session, the second computing device can be added to the first multiplayer gaming session. In another example, the second computing device can move within the proximity threshold of the position of the first computing device without disconnecting from the multiplayer gaming session. In response, the second computing device can be automatically added to the first multiplayer gaming session.
Fig. 7 shows a non-limiting example of the see-through display device 104, which includes a see-through display 702. For example, the see-through display device 104 can be a head-mounted see-through display device. In some embodiments, a see-through display device such as the see-through display device 104 can be integrated, as shown in Fig. 7. In alternative embodiments, the see-through display device can be a modular computing system comprising a see-through display and one or more other components communicatively coupled to the see-through display.
The see-through display 702 is at least partially transparent, allowing light to reach the user's eyes through the see-through display. Furthermore, the see-through display is configured to visually augment the appearance of a physical space to a user viewing the physical space through the see-through display. For example, the see-through display can display virtual objects that are visible to the user while the user views the physical space. Thus, the user can see virtual objects that do not exist in the physical space while viewing the physical space, creating the illusion that the virtual objects are part of the physical space.
The see-through display device 104 also includes a virtual reality engine 704. The virtual reality engine 704 can be configured to cause the see-through display to visually present one or more virtual objects as augmentations of real-world objects. The virtual objects can simulate the appearance of real-world objects. To a user viewing the physical space through the see-through display, the virtual objects appear to integrate with the physical space and/or with real-world objects. For example, virtual objects and/or other images displayed via the see-through display can be positioned relative to the user's eyes so that the displayed virtual objects and/or images appear to the user to occupy particular locations in the physical space. In this way, the user can see objects in the physical space that do not actually exist. The virtual reality engine can include software, hardware, firmware, or any combination thereof.
See-through display device 104 may include a speaker subsystem 706 and a sensor subsystem 708. In different embodiments, the sensor subsystem may include a variety of different sensors. As non-limiting examples, the sensor subsystem may include a microphone 710, one or more forward-facing (away from the user) infrared and/or visible-light cameras 712, and/or one or more rearward-facing (toward the user) infrared and/or visible-light cameras 714. The forward-facing camera(s) may include one or more depth cameras, and/or the rearward-facing camera(s) may include one or more eye-tracking cameras. In some embodiments, an on-board sensor subsystem may communicate with one or more off-board sensors that send observation information to the on-board sensor subsystem. For example, a depth camera used by a gaming console may send depth maps and/or modeled virtual skeletons to the sensor subsystem of the head-mounted display.
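The on-board/off-board arrangement in this paragraph amounts to the head-mounted display merging its own sensor readings with observation messages pushed by external sensors such as a console depth camera. The sketch below is a minimal illustration under assumed message fields (the patent specifies no wire format); `Observation` and `SensorSubsystem` are invented names.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    source: str                 # e.g. "hmd_depth" (on-board) or "console_depth" (off-board)
    timestamp: float            # sender-reported capture time, in seconds
    skeleton: dict = field(default_factory=dict)  # joint name -> (x, y, z)

class SensorSubsystem:
    """Keeps the freshest observation per source, whether on-board or off-board."""

    def __init__(self):
        self._latest = {}

    def ingest(self, obs: Observation):
        # Drop out-of-order messages from the same source.
        prev = self._latest.get(obs.source)
        if prev is None or obs.timestamp > prev.timestamp:
            self._latest[obs.source] = obs

    def joint(self, name):
        """Return the most recently observed position of a named joint, if any."""
        best = None
        for obs in self._latest.values():
            if name in obs.skeleton and (best is None or obs.timestamp > best.timestamp):
                best = obs
        return best.skeleton[name] if best else None
```

With this shape, an off-board depth camera simply calls `ingest` with its own `source` tag, and the rendering side always reads the newest available joint position regardless of which sensor produced it.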
See-through display device 104 may also include one or more features that allow the see-through display device to be worn on a user's head. In the illustrated example, see-through display device 104 takes the form of eyeglasses and includes a nose rest 716 and ear rests 718a and 718b. In other embodiments, a head-mounted display may take the form of a hat or helmet with a see-through visor in front. Further, while described in the context of a head-mounted see-through display, the concepts described herein may be applied to see-through displays that are not worn (e.g., a windshield) and to displays that are not see-through (e.g., an opaque display that renders real objects observed by a camera together with virtual objects not within the camera's field of view).
See-through display device 104 may also include a communication subsystem 720. The communication subsystem 720 may be configured to communicate with one or more off-board computing devices. As an example, the communication subsystem may be configured to wirelessly receive a video stream, an audio stream, coordinate information, virtual-object descriptions, and/or other information to render the augmentation information as an augmented reality experience.
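One way to picture the mixed stream this communication subsystem receives is as tagged messages dispatched to the renderer by kind. The message tags and payload fields below are invented for illustration; the patent only lists the categories (video, audio, coordinates, virtual-object descriptions).

```python
import json

# Hypothetical message kinds, mirroring the categories listed above.
HANDLERS = {}

def handles(kind):
    """Decorator registering a handler for one message kind."""
    def register(fn):
        HANDLERS[kind] = fn
        return fn
    return register

@handles("virtual_object")
def on_virtual_object(payload, scene):
    # Store/replace the description of one virtual object by its id.
    scene.setdefault("objects", {})[payload["id"]] = payload

@handles("coordinates")
def on_coordinates(payload, scene):
    # Update the shared coordinate frame used to place virtual objects.
    scene["origin"] = tuple(payload["origin"])

def dispatch(raw: bytes, scene: dict):
    """Route one wire message to its handler; unknown kinds are skipped."""
    msg = json.loads(raw)
    handler = HANDLERS.get(msg.get("kind"))
    if handler:
        handler(msg["payload"], scene)
    return scene
```

Unknown kinds being skipped (rather than raising) is a deliberate sketch-level choice: a device can then ignore stream types (such as audio) that it does not render.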
In some embodiments, the methods and processes described above may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer application or service, an application programming interface (API), a library, and/or other computer program product.
Fig. 8 schematically shows a non-limiting example of a computing system 800 that can perform one or more of the methods and processes described above. Computing system 800 is shown in simplified form. It is to be understood that virtually any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, computing system 800 may take the form of a head-mounted see-through display device (e.g., see-through display device 104), a gaming device (e.g., gaming system 502), a mobile computing device, a mobile communication device (e.g., a smartphone), a desktop computer, a laptop computer, a tablet computer, a home-entertainment computer, a network computing device, a mainframe computer, a server computer, etc.
Computing system 800 includes a logic subsystem 802 and a storage subsystem 804. Computing system 800 may optionally include a display subsystem 806 (e.g., a see-through display), an input subsystem 808, a communication subsystem 810, and/or other components not shown in Fig. 8.
Logic subsystem 802 includes one or more physical devices configured to execute instructions. For example, the logic subsystem may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
The logic subsystem may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of the logic subsystem may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel, or distributed processing. The logic subsystem may optionally include individual components that are distributed among two or more devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud-computing configuration.
Storage subsystem 804 includes one or more physical, non-transitory devices configured to hold data and/or instructions executable by the logic subsystem to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage subsystem 804 may be transformed (e.g., to hold different data).
Storage subsystem 804 may include removable media and/or built-in devices. Storage subsystem 804 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It is to be appreciated that storage subsystem 804 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
In some embodiments, aspects of logic subsystem 802 and storage subsystem 804 may be integrated together into one or more hardware-logic components through which the functionality described herein is enacted. Such hardware-logic components may include, for example, field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC) systems, and complex programmable logic devices (CPLDs).
The terms "program" and "engine" may be used to describe an aspect of computing system 800 that is implemented to perform a particular function. In some cases, a program or engine may be instantiated via logic subsystem 802 executing instructions held by storage subsystem 804. It is to be understood that different programs and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms "program" and "engine" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
It is to be appreciated that a "service," as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server computing devices.
When included, display subsystem 806 may be used to present a visual representation of data held by storage subsystem 804. This visual representation may take the form of images that appear to augment a physical space, thus creating the illusion of augmented reality. As the methods and processes described herein change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of display subsystem 806 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 802 and/or storage subsystem 804 in a shared enclosure (e.g., a head-mounted display), or such display devices may be peripheral display devices.
When included, input subsystem 808 may comprise or interface with one or more user input devices, such as a game controller, gesture input detection device, voice recognizer, inertial measurement unit, keyboard, mouse, or touch screen. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on-board or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 810 may be configured to communicatively couple computing system 800 with one or more other computing devices. Communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or via a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 800 to send information to and/or receive information from other devices via a network such as the Internet.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems, and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (10)

1. A method (200) of hosting (101) a plurality of game sessions on a server system (102), the method comprising:
adding (202) a first computing device (104) of a first user to a first multiplayer game session, the first computing device including (204) a see-through display (702);
sending (206) augmentation information to the first computing device for the first multiplayer game session to provide (208) an augmented reality experience to the first user;
adding (212) a second computing device (112) of a second user to the first multiplayer game session; and
sending (214) experience information to the second computing device for the first multiplayer game session to provide (216) a cross-platform representation of the augmented reality experience to the second user.
2. The method of claim 1, wherein the cross-platform representation of the augmented reality experience is configured to be visually presented via a display device connected to the second computing device.
3. The method of claim 1, wherein the experience information includes aspects of a physical environment detected by the see-through display.
4. The method of claim 1, wherein the augmentation information includes character information that augments an appearance of a third user when the third user is viewed through the see-through display.
5. The method of claim 1, wherein the augmented reality experience includes an in-game event that is triggered when the first computing device is within a proximity threshold of a real-world landmark.
6. A computing system (104) for participating in multiplayer gaming, the computing system comprising:
a see-through display (702);
one or more sensors (708);
a logic subsystem (802); and
a storage subsystem (804) configured to store instructions that, when executed, cause the computing system to:
determine a location of the computing system;
send the location of the computing system to a game server;
receive augmentation information from the game server for a first multiplayer game session, the first multiplayer game session corresponding to the location of the computing system;
present (208), on the see-through display and using the augmentation information, an augmented reality experience that represents the multiplayer game session as an augmentation of a physical environment viewable through the see-through display;
detect (218), via the one or more sensors, aspects of the physical environment; and
send experience information corresponding to the detected aspects of the physical environment to the game server to provide (216) a cross-platform representation of the augmented reality experience to one or more remote computing devices participating in the first multiplayer game session.
7. The computing system of claim 6, wherein the cross-platform representation of the augmented reality experience is configured to be visually presented via a display device connected to the one or more remote computing devices.
8. The computing system of claim 6, wherein the location of the computing system is determined by detecting, via the one or more sensors, aspects of the physical environment of the computing system.
9. The computing system of claim 6, wherein the first multiplayer game session is one of a plurality of multiplayer game sessions, each of the plurality of multiplayer game sessions corresponding to a different real-world region.
10. The computing system of claim 6, wherein the instructions, when executed, further cause the system to display, on the see-through display, character information as an augmentation of a user participating in the multiplayer game session when the user is viewed through the see-through display.
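Taken together, the claims describe a session server that routes two renditions of the same session: augmentation information to devices with see-through displays, and experience information (the cross-platform representation) to conventional devices. The following non-normative sketch illustrates that routing; all class and field names are invented, and real implementations would of course involve networking, authentication, and much richer state.

```python
class GameServer:
    """Hosts multiplayer game sessions and tailors updates per device type (sketch)."""

    def __init__(self):
        self.sessions = {}  # session id -> list of (device, has_see_through)

    def add_device(self, session_id, device, has_see_through):
        """Add a user's computing device to a multiplayer game session."""
        self.sessions.setdefault(session_id, []).append((device, has_see_through))

    def broadcast(self, session_id, world_state):
        """Send augmentation info to see-through devices, experience info to others."""
        for device, has_see_through in self.sessions.get(session_id, []):
            if has_see_through:
                # Drives the augmented reality experience on the see-through display.
                device.receive({"type": "augmentation", "state": world_state})
            else:
                # Cross-platform representation of the same experience.
                device.receive({"type": "experience", "state": world_state})

class FakeDevice:
    """Stand-in for a connected client; records what the server sends it."""

    def __init__(self):
        self.inbox = []

    def receive(self, msg):
        self.inbox.append(msg)
```

A head-mounted device and a phone joining the same session thus receive differently typed messages describing the same world state, which is the essence of the cross-platform experience the claims recite.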
CN201310757229.5A 2013-12-18 2013-12-18 Cross-platform augmented reality experience Expired - Fee Related CN103760972B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310757229.5A CN103760972B (en) 2013-12-18 2013-12-18 Cross-platform augmented reality experience

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310757229.5A CN103760972B (en) 2013-12-18 2013-12-18 Cross-platform augmented reality experience

Publications (2)

Publication Number Publication Date
CN103760972A true CN103760972A (en) 2014-04-30
CN103760972B CN103760972B (en) 2017-03-01

Family

ID=50528224

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310757229.5A Expired - Fee Related CN103760972B (en) 2013-12-18 2013-12-18 Cross-platform augmented reality experience

Country Status (1)

Country Link
CN (1) CN103760972B (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120032977A1 (en) * 2010-08-06 2012-02-09 Bizmodeline Co., Ltd. Apparatus and method for augmented reality
CN102436663A (en) * 2010-08-12 2012-05-02 株式会社泛泰 User equipment, server, and method for selectively filtering augmented reality
CN103221953A (en) * 2010-09-23 2013-07-24 诺基亚公司 Methods, apparatuses and computer program products for grouping content in augmented reality
CN103380631A (en) * 2010-12-22 2013-10-30 英特尔公司 Techniques for mobile augmented reality applications
US20130127907A1 (en) * 2011-11-22 2013-05-23 Samsung Electronics Co., Ltd Apparatus and method for providing augmented reality service for mobile terminal
CN103366708A (en) * 2012-03-27 2013-10-23 冠捷投资有限公司 Transparent display with real scene tour-guide function

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108245891A (en) * 2016-12-29 2018-07-06 中兴通讯股份有限公司 Head-mounted device, game interaction platform, and tabletop game implementation system and method
CN108245891B (en) * 2016-12-29 2023-07-18 南京中兴新软件有限责任公司 Head-mounted equipment, game interaction platform and table game realization system and method
CN106897157A (en) * 2017-01-18 2017-06-27 北京商询科技有限公司 A method for sharing content based on mixed reality
CN110869096B (en) * 2017-05-05 2023-09-15 索尼互动娱乐有限责任公司 Instant streaming of mobile user interfaces without installing applications
CN110869096A (en) * 2017-05-05 2020-03-06 索尼互动娱乐有限责任公司 Live streaming of mobile user interfaces without installing applications
WO2018234866A3 (en) * 2017-06-23 2019-02-14 Zyetric Virtual Reality Limited First-person role playing interactive augmented reality
US11833420B2 (en) 2018-06-27 2023-12-05 Niantic, Inc. Low latency datagram-responsive computer network protocol
US11794101B2 (en) 2019-02-25 2023-10-24 Niantic, Inc. Augmented reality mobile edge computing
CN113728362A (en) * 2019-02-25 2021-11-30 奈安蒂克公司 Augmented reality mobile edge computing
CN113728362B (en) * 2019-02-25 2024-04-16 奈安蒂克公司 Augmented reality mobile edge computing
CN110837299A (en) * 2019-11-11 2020-02-25 上海萃钛智能科技有限公司 Activity management intelligent device, system and method
US11757761B2 (en) 2019-12-20 2023-09-12 Niantic, Inc. Data hierarchy protocol for data transmission pathway selection
WO2024052782A1 (en) * 2022-09-06 2024-03-14 Niantic, Inc. Dynamically generated local virtual events

Also Published As

Publication number Publication date
CN103760972B (en) 2017-03-01

Similar Documents

Publication Publication Date Title
CN103760972A (en) Cross-platform augmented reality experience
CN112090069B (en) Information prompting method and device in virtual scene, electronic equipment and storage medium
US8894484B2 (en) Multiplayer game invitation system
US20140128161A1 (en) Cross-platform augmented reality experience
CN111408133B (en) Interactive property display method, device, terminal and storage medium
US20150070274A1 (en) Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements
US20140125698A1 (en) Mixed-reality arena
CN111672099B (en) Information display method, device, equipment and storage medium in virtual scene
JP2015116336A (en) Mixed-reality arena
JPWO2019130864A1 (en) Information processing equipment, information processing methods and programs
CN111672110B (en) Control method, device, storage medium and equipment for virtual role in virtual world
KR102432011B1 (en) Systems and methods for transcribing user interface elements of a game application into haptic feedback
CN111714886A (en) Virtual object control method, device, equipment and storage medium
CN110585706B (en) Interactive property control method, device, terminal and storage medium
CN103785169A (en) Mixed reality arena
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN113018862A (en) Virtual object control method and device, electronic equipment and storage medium
US20230271087A1 (en) Method and apparatus for controlling virtual character, device, and storage medium
CN112156463B (en) Role display method, device, equipment and medium
CN115430153A (en) Collision detection method, device, apparatus, medium, and program in virtual environment
EP2886171A1 (en) Cross-platform augmented reality experience
JP2015116339A (en) Cross-platform augmented reality experience
JP6545761B2 (en) Information processing method, apparatus, and program for causing a computer to execute the information processing method
WO2024021781A1 (en) Interaction method and apparatus for virtual objects, and computer device and storage medium
CN112843682B (en) Data synchronization method, device, equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150728

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150728

Address after: Washington State

Applicant after: Microsoft Technology Licensing, LLC

Address before: Washington State

Applicant before: Microsoft Corp.

GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170301

Termination date: 20171218