Embodiments
Massively multiplayer online gaming is conventionally implemented as a two-dimensional or three-dimensional virtual environment on a single platform, such as a game console, personal computer, or mobile computing device. Previous cross-platform massively multiplayer online games have utilized only platforms with the same type of game display (two-dimensional or three-dimensional images and/or animation rendered and displayed on a traditional monitor such as a television, computer monitor, and/or mobile phone screen). Accordingly, a player could participate in a game only with other players whose platforms utilized the same type of display, thereby limiting the number and variety of players who could join the multiplayer online game. Therefore, the disclosed embodiments are directed to a cross-platform, massively multiplayer online experience that allows users having augmented reality, see-through displays to participate in an augmented reality game. For example, as described in more detail below, a user of a see-through display device can participate in an augmented reality experience. Users of computing devices operating on different platforms can participate in a cross-platform representation of the augmented reality experience. The cross-platform representation can thus bring the appearance of the augmented reality experience to users of computing devices that typically could not provide such an augmented reality experience.
Fig. 1 schematically illustrates an example communication diagram for a cross-platform, augmented reality, massively multiplayer online gaming system 100 comprising multiple multi-player gaming sessions 101 hosted by a game server 102. The cross-platform, augmented reality, massively multiplayer online gaming system 100 can accommodate multiple cross-platform computing devices, for example, see-through display devices, gaming systems connected to fixed display devices, personal computing devices connected to external display devices, and mobile computing devices such as laptop computers, smart phones, tablet computers, etc. As shown in Fig. 1, the augmented reality, massively multiplayer online gaming system 100 can comprise see-through display devices 104, 106, and 108, and a home entertainment system 110 comprising a gaming system 112 connected to a fixed display device 114. Each computing device can connect to the game server 102 through a network 116 (for example, the Internet) to participate in the massively multiplayer online game.
Turning now to Fig. 2, a method 200 of hosting multiple multi-player gaming sessions is shown. For example, the game server 102 of Fig. 1 can execute the steps of method 200 to host multiple cross-platform multi-player gaming sessions. At 202, the game server 102 adds a first computing device to a first multi-player gaming session. In certain embodiments, as shown at 204, the first computing device can comprise a see-through display. For example, the first computing device can correspond to the see-through display device 104 of Fig. 1.
At 206, the game server sends augmentation information to the first computing device. In certain embodiments, as shown at 208, the augmentation information can correspond to a virtual environment of the multi-player gaming session. The augmentation information can comprise image data corresponding to the virtual environment or gamespace of the multi-player gaming session, and positional information for placing the image data. In certain embodiments, positional information can be provided relative to other image data. For example, an augmentation for a tree can be described relative to an augmentation for a building. In certain embodiments, positional information can be provided relative to a physical location. For example, an augmentation for a tree can be described relative to the position of the tree in the physical world. The see-through display device can display the augmentation at a particular location of the see-through display based on a known or detected position of the tree in the physical world. Thus, the see-through display of the first computing device can utilize the augmentation information to present an augmented reality experience by displaying augmentations of the multi-player gaming session over the physical environment viewable through the see-through display. Illustrations of example augmented reality experiences are shown in Figs. 3 and 4.
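As a rough sketch of how such augmentation information might be structured, the following shows an augmentation anchored either to another augmentation or to a physical-world position. The schema, field names, and anchoring scheme are illustrative assumptions, not prescribed by the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Augmentation:
    """Hypothetical record for one augmentation sent at step 206."""
    image_id: str
    # Anchor relative to another augmentation (e.g., a tree relative to a building)...
    anchor_augmentation: Optional[str] = None
    # ...or relative to a known/detected physical-world position.
    anchor_position: Optional[Tuple[float, float, float]] = None
    offset: Tuple[float, float, float] = (0.0, 0.0, 0.0)

    def resolve_position(self, resolved):
        """Compute the display position from whichever anchor is provided.

        `resolved` maps augmentation ids to already-resolved positions.
        """
        if self.anchor_position is not None:
            base = self.anchor_position
        else:
            base = resolved[self.anchor_augmentation]
        return tuple(b + o for b, o in zip(base, self.offset))
```

A device could then resolve physically anchored augmentations first and use them as anchors for relatively placed ones.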
Turning first to Fig. 3, an example unaltered view of a physical environment 300 is shown. For example, the physical environment 300 can comprise multiple real-world objects 302. Real-world objects can comprise virtually any object existing in the real world, including but not limited to trees, buildings, landmarks, people, vehicles, etc. The physical environment 300 can also comprise multiple computing devices participating in a multi-player gaming session, such as see-through display devices 304 and 306, controlled by a first user 308 and a second user 310, respectively. For example, the see-through display devices 304 and 306 can correspond to the see-through display devices 104 and 106 of Fig. 1, and the physical environment 300 can correspond to location 1 of Fig. 1.
Fig. 4 shows an example augmented reality experience for a multi-player gaming session. An augmented view 400 of the physical environment 300 is depicted as viewed through the see-through display device 304. The augmented view 400 can provide an augmented reality experience to the user 308 by presenting altered appearances of one or more real-world objects when the objects are viewed through the see-through display. For example, the see-through display device 304 can display a virtual object 402 as an augmentation of the real-world object 302 to provide an augmented reality environment corresponding to the multi-player gaming session.
The appearance of the user 310 can also be augmented to provide an augmented reality version of a character controlled by the user 310. For example, the see-through display device 304 can display character-based virtual objects 404, such as armor and weapons, at positions corresponding to relevant portions of the user 310. As depicted, a virtual sword 404a can be displayed at a position corresponding to the right hand of the user 310. Character-based items can comprise virtually any suitable virtual item that is associated with a character and positioned in the see-through display to augment the user's appearance.
As shown at 210 of Fig. 2, character information 406 can also be displayed near the position of a character presented in the augmented view. Character information can comprise virtually any information related to a character and/or user, for example a character name, user name, status, equipment, contact information, etc. Character information can be displayed in any suitable form, for example text, icons, images, etc. In certain embodiments, virtual items belonging to a user's character, such as a pet or companion, can be presented near the user's position. For example, character information 406a, comprising a user name and character name corresponding to the user 310, is shown as a pop-up box above the user 310.
An augmentation of a real-world object can be configured to hide the associated real-world object. In certain embodiments, this can be achieved by masking the object using one or more images depicting the background. For example, when a tree appearing in front of part of the sky is viewed through the see-through display, the tree can be augmented by displaying an image of the sky at a position of the see-through display corresponding to the position of the tree. In this example, when viewed through the see-through display, the augmented tree appears as sky, creating the illusion that the tree has disappeared.
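The masking idea can be sketched as a simple image-compositing step: pixels belonging to the object are replaced with pixels from a background image. This is a minimal illustration, assuming the object mask and background imagery are already available; the disclosure does not specify how either is obtained.

```python
import numpy as np

def hide_object(view, object_mask, background):
    """Overwrite the masked object's pixels with background imagery,
    creating the illusion that the object has disappeared.

    view, background: arrays of the same shape (pixel values).
    object_mask: boolean array marking the object's pixels.
    """
    out = view.copy()  # leave the original view untouched
    out[object_mask] = background[object_mask]
    return out
```

In practice a see-through display would render only the masking pixels, not a full composited frame, but the selection logic is the same.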
The augmented view of the physical environment can also comprise virtual objects that do not directly correspond to real-world objects in the physical environment. For example, a virtual character 408 can be displayed in order to represent a user of a remotely located computing device participating in the multi-player gaming session. Character information 406 can also be displayed for the remote user's virtual character. For example, character information 406b is shown as a pop-up box comprising the user name and character name corresponding to the user represented by the character 408.
Turning back to Fig. 2, at 212 a second computing device is added to the first multi-player gaming session. In certain embodiments, the second computing device can operate on a different platform from the first computing device and/or can be remotely located. At 214, the game server sends experience information to the second computing device. As shown at 216, the experience information can provide a cross-platform representation of the augmented reality experience provided to the first computing device. Fig. 5 shows an example cross-platform representation of the augmented reality experience depicted in Fig. 4.
An indoor environment 500 comprises a gaming system 502 connected to a fixed external display device 504. For example, the gaming system 502 and display device 504 can correspond to the gaming system 112 and display device 114 of Fig. 1. The gaming system 502 can be virtually joined to a multi-player gaming session based on any suitable criteria. In certain embodiments, the gaming system 502 can be joined to a multi-player gaming session in response to the user 506 deciding to participate in and/or selecting the multi-player gaming session. In other embodiments, the gaming system 502 can be automatically joined to a multi-player gaming session based on criteria such as location, number of sessions, technical specifications and/or preference settings of the gaming system, and/or any other suitable criteria.
The multi-player gaming session can be displayed on the display device 504 by presenting to the user 506 an experience 508 as a cross-platform representation of the augmented reality experience. For example, the user 506 can see a virtual representation of the gamespace in which one or more other players with see-through display devices participate in the multi-player gaming session. Fig. 5 depicts a first-person mode, in which the display device 504 shows virtual representations of the users 308 and 310 from the first-person perspective of the virtual character 408 shown in Fig. 4. In certain embodiments, a third-person perspective can be used. For example, the perspective can be changed dynamically, by selection or automatically, to display the experience 508 in a third-person mode and/or from any other suitable perspective.
The experience 508 can be a cross-platform representation of the augmented reality experience configured to be presented visually via the display device 504. For example, the experience 508 can be presented in response to receiving experience information from a game server, such as the game server 102. As shown at 218 of Fig. 2, the experience information can comprise aspects of the physical environment corresponding to the multi-player gaming session. The experience information can additionally or alternatively comprise a representation of the augmentation information corresponding to the virtual environment of the multi-player gaming session, or of the augmentations themselves.
The aspects of the physical environment can be provided in any suitable manner, and can comprise any information about the physical environment, such as topological features, depth information of objects in the physical environment, landmark locations, traffic routes, time of day, weather conditions, etc. The aspects of the physical environment can be used to configure, alter, and/or improve gameplay by incorporating the aspects and/or elements related to the aspects into the multi-player gaming session. For example, gameplay at night can differ from gameplay during the day. In certain embodiments, the game server can store aspects of the physical environment, receive information about the physical environment from third-party databases, and/or receive information from participants of the multi-player gaming session. For example, a see-through display device can detect aspects of the physical environment via one or more sensors of the see-through display device, and send the detected aspects to the server and/or to other computing devices participating in the multi-player gaming session.
In certain embodiments, the game server can identify information that is relevant to the entire gamespace of the multi-player gaming session as well as platform-specific configuration information. The gamespace information can describe the multi-player gaming session by comprising information such as physical environment information and corresponding virtual environment information. For example, the game server can send experience information in the form of a two-dimensional or three-dimensional representation of the gamespace to a computing device, such as the gaming system 502, so that it can be displayed on the display device 504. Thus, the cross-platform representation of the augmented reality experience can provide an experience corresponding to the multi-player gaming session to users who have different computing devices from other users in the multi-player gaming session.
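One way the server-side split between shared gamespace information and platform-specific payloads might look is sketched below. The platform names, dictionary keys, and payload shapes are assumptions for illustration only.

```python
def build_payload(gamespace, platform):
    """Produce augmentation information for see-through displays and
    experience information (a rendered gamespace representation) for
    other platforms, from one shared gamespace description."""
    shared = {"session_id": gamespace["session_id"]}
    if platform == "see_through":
        # Augmentation information: imagery plus placement data for overlay
        return dict(shared, type="augmentation",
                    augmentations=gamespace["augmentations"])
    # Experience information: a 2D/3D representation plus physical aspects
    return dict(shared, type="experience",
                scene=gamespace["scene_3d"],
                physical_aspects=gamespace["physical_aspects"])
```

The same shared state thus feeds both device classes, with only the outgoing format differing per platform.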
The augmented reality experience and the cross-platform representation of the augmented reality experience can allow multiple users of multiple computing devices operating on different platforms to participate in a multi-player gaming session of a massively multiplayer online game. For example, the gamespace for a multi-player gaming session can be provided to a see-through display device in a different manner than to a gaming system. As discussed above, augmentation information can be provided to a see-through display device, such as the see-through display device 304 of Fig. 3, and experience information can be provided to a computing device operating on a different platform, such as the gaming system 502 of Fig. 5. Accordingly, the augmentation information can be in a different data format, comprise different data structures, have different data sources, and/or differ from the experience information in any suitable manner, in order to configure the experience presented on the display device of each respective platform.
The computing devices can also communicate across platforms to provide a social aspect to the game. For example, communication messages can be sent between first and second computing devices, such as the see-through display device 104 and the gaming system 112 of Fig. 1. A communication message can comprise text and/or speech from user input at a computing device. In certain embodiments, communication messages can be sent directly between computing devices, sent through the game server, and/or sent through any third-party computing device. The computing devices and/or the game server can comprise text-to-speech and/or speech-to-text conversion capabilities. Thus, speech output can be derived from text input, and text output can be derived from speech input. For example, the user 308 of Fig. 3 can provide speech input to the see-through display device 304 to communicate with the user 506 of Fig. 5. In response, the see-through display device 304, the game server 102, and/or the gaming system 502 can convert the speech input into text output to be displayed on the display device 504 depicted in Fig. 5. Communication can be provided among any group of computing devices, including private conversations between selected computing devices, conversations across an entire gaming session, conversations across all gaming sessions, etc.
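The modality conversion described above can be sketched as a small routing step: a message is converted only when the sending and receiving modalities differ. The `stt` and `tts` parameters stand in for real speech-to-text and text-to-speech services, which are not specified by the disclosure.

```python
def route_message(payload, source_mode, target_mode, stt, tts):
    """Deliver a chat message, converting between "speech" and "text"
    to suit the receiving device's display or speaker capabilities."""
    if source_mode == target_mode:
        return payload                    # no conversion needed
    if source_mode == "speech":
        return stt(payload)               # speech input -> text output
    return tts(payload)                   # text input -> speech output
```

The conversion could run on the sending device, the game server, or the receiving device, as the text notes.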
A user can provide user input to control an associated user character and/or to interact with the experience. For example, a user can provide user input in order to create, delete, and/or otherwise alter one or more elements of the multi-player gaming session. User input can be detected by any suitable user input device or combination of input devices, such as a gesture detection device, a microphone, an inertial measurement unit, etc. For example, the user 308 of Fig. 3 can point at a target, such as the virtual character 408 depicted in Fig. 4, and perform a gesture corresponding to an attack to control the associated character to attack the target. The gesture can be captured by one or more sensors of the see-through display device 304, such as an image sensor, and matched to a corresponding command so that the character associated with the user 308 performs the desired attack.
A user can also combine user inputs, for example by pointing at a target and speaking an attack command to perform the corresponding attack on the target. For example, an inertial measurement unit connected to the see-through display device 304 of Fig. 3 can detect a gesture from the user 308 while a microphone of the see-through display device 304 detects a voice command from the user 308. User input can also allow a user to interact with menu items of the experience. For example, the user 308 of Fig. 3 can point at a second user, such as the user 506 of Fig. 5, or at a virtual representation of a remote user, such as the virtual character 408 of Fig. 4, to display additional information about the corresponding user and/or character.
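Matching single or combined gesture-and-voice inputs to commands could be implemented with a simple binding table, checked most-specific first. The bindings below are illustrative only; the disclosure does not define a command vocabulary.

```python
def match_command(gesture, voice, table):
    """Match detected gesture and/or voice input to a command.

    An entry with None in a slot matches regardless of that modality,
    so combined bindings should be listed before single-modality ones.
    """
    for (g, v), command in table.items():
        if (g is None or g == gesture) and (v is None or v == voice):
            return command
    return None

# Hypothetical bindings, not taken from the disclosure
COMMANDS = {
    ("point", "attack"): "attack_target",   # combined gesture + voice
    ("point", None): "show_target_info",    # gesture alone (menu interaction)
    (None, "map"): "toggle_map",            # voice alone
}
```

Pointing while saying "attack" yields the attack command, while pointing alone falls through to the target-information binding.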
In certain embodiments, the experience can comprise a map of the multi-player gaming session as an overlay and/or an augmentation of a map of the physical environment relevant to the multi-player gaming session. For example, the map can provide representations of the locations of users' computing systems, dynamically updated to correspond to the current location of each computing system. In other words, real-world travel can be automatically reflected in the experience. The map can provide locations of virtually any destination of the experience and/or the real world, such as locations of other users, quests, virtual objects, real-world objects, etc. In certain embodiments, the map can be customizable. For example, the map can be filtered to show only a user's friends, enemies, selected quests, etc. The map can be viewable while logged out of a multi-player gaming session and/or the game server. For example, a user can view the map to identify the in-game status of the user's physical location.
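The map filtering mentioned above reduces to selecting markers by category. A minimal sketch, with marker fields and category names assumed for illustration:

```python
def filter_markers(markers, categories):
    """Keep only map markers in the selected categories
    (e.g., friends, enemies, selected quests)."""
    return [m for m in markers if m["category"] in categories]
```

A client could re-run the filter whenever the user changes the map customization or a marker's dynamically updated position arrives.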
A multi-player gaming session can have location-dependent features. For example, a multi-player gaming session can correspond to one or more specific regions of the real world. In certain embodiments, quests or other in-game events can be tied to real-world landmarks, such that an event is activated when a user in the game is within a proximity threshold of the real-world landmark. An in-game event can also be triggered when the number of users of the multi-player gaming session at a specific real-world location and/or virtual location exceeds a threshold. In certain embodiments, in-game events can be created, deleted, and/or otherwise altered by users. For example, a user can create a virtual object to be placed in the gamespace of the multi-player gaming session. In response to the creation of the virtual object, another user can receive a quest involving that virtual object. The created virtual object can persist and be regarded as part of the relevant quest.
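A landmark-tied event trigger could be sketched as a distance check against each event's proximity threshold. Planar distance is used here for simplicity; a deployed system would presumably use geodesic distance on latitude/longitude, and the event record fields are assumptions.

```python
import math

def active_events(user_pos, events):
    """Return names of landmark-tied events activated by the user's
    current position (step: within the event's proximity threshold)."""
    triggered = []
    for e in events:
        lx, ly = e["landmark_pos"]
        if math.hypot(user_pos[0] - lx, user_pos[1] - ly) <= e["threshold_m"]:
            triggered.append(e["name"])
    return triggered
```

The same check, applied to a count of users rather than one user, would cover the population-threshold trigger described above.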
Fig. 6 shows an example method 600 of adding two computing devices to multi-player gaming sessions based on the locations of the two computing devices. At 602, a game server, such as the game server 102 of Fig. 1, can receive the location of a first computing device. As shown at 604, the first computing device can comprise a first see-through display, such as the see-through display device 104 of Fig. 1. The location of a computing device can be determined in any suitable manner. For example, the location of the first computing device can be determined by detecting aspects of the physical environment via one or more sensors of the first computing device.
At 606, the first computing device is added to a first multi-player gaming session. As shown at 608, the first multi-player gaming session can be selected based on the location of the first computing device. Virtually any suitable location-related criterion can guide the selection of the multi-player gaming session described at 608. For example, multiple multi-player gaming sessions can each correspond to a different real-world region.
In certain embodiments, a computing device can be placed in the multi-player gaming session corresponding to the real-world region in which the computing device is located. In additional or alternative embodiments, location can be one of multiple criteria for selecting a multi-player gaming session for the computing device. For example, a computing device can join the multi-player gaming session corresponding to the closest real-world region that has a number of players above or below a player-count threshold. As another example, a computing device can join the multi-player gaming session corresponding to the closest real-world region that includes one or more friends of the computing device's user.
After joining the first multi-player gaming session, at 610, augmentation information for the first multi-player gaming session can be sent to the first computing device. The augmentation information can be used to augment, via the see-through display, the reality of the physical environment viewed through the see-through display.
At 612, the game server receives the location of a second computing device. As shown at 614, the second computing device can comprise a second see-through display, such as the see-through display device 106 or 108 of Fig. 1. At 616, the game server determines whether the location of the second computing device is within a proximity threshold of the location of the first computing device. For example, the proximity threshold can define a maximum and/or minimum distance between the first and second computing devices. As another example, the proximity threshold can define a game boundary (for example, the Seattle city limits). In certain embodiments, the proximity threshold can be one of multiple criteria for joining the second computing device to a multi-player gaming session.
If the location of the second computing device is within the proximity threshold of the location of the first computing device, at 618 the game server joins the second computing device to the first multi-player gaming session. For example, the second computing device can correspond to the see-through display device 106 of Fig. 1. The dashed box representing location 1 can indicate that the see-through display devices 104 and 106 are within the proximity threshold of each other. Accordingly, these devices can be added to the same multi-player gaming session.
Alternatively, if the location of the second computing device is not within the proximity threshold of the location of the first computing device, at 620 the game server joins the second computing device to a second multi-player gaming session that is different from the first multi-player gaming session. In this example, the second computing device can correspond to the see-through display device 108 at location 2 of Fig. 1. Because location 2 can be considered outside the proximity threshold of the see-through display device 104, the see-through display device 108 can be added to a different multi-player gaming session than the see-through display device 104. For example, the first multi-player gaming session can correspond to a first location (e.g., Seattle) and comprise a quest tied to a landmark of the first location (e.g., photographing a particular Seattle landmark). The second multi-player gaming session can correspond to a second location (e.g., Portland) and comprise a quest tied to a landmark of the second location (e.g., a treasure hunt for virtual objects hidden near a particular Portland landmark).
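The branching at steps 616-620 can be sketched as a single proximity check. Planar distance and the session labels are simplifying assumptions standing in for Fig. 1's sessions and real geographic distance.

```python
import math

def assign_session(first_pos, second_pos, threshold_m):
    """Join the second device to the first session when it is within the
    proximity threshold of the first device (step 618); otherwise to a
    different session (step 620)."""
    dist = math.hypot(second_pos[0] - first_pos[0],
                      second_pos[1] - first_pos[1])
    return "first_session" if dist <= threshold_m else "second_session"
```

A server could re-evaluate this check whenever an updated device location arrives, which also covers the move-and-rejoin behavior described next.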
In certain embodiments, if the second computing device moves to a location within the proximity threshold of the location of the first computing device, the second computing device can be added to the first multi-player gaming session. For example, the second computing device can log out of the game server and/or the second multi-player gaming session and move to a new physical location within the proximity threshold of the first computing device. Once logged back into the game server, the second computing device can be added to the first multi-player gaming session. In another example, the second computing device can move to a location within the proximity threshold of the first computing device without disconnecting from the multi-player gaming session. In response, the second computing device can be automatically added to the first multi-player gaming session.
Fig. 7 shows a non-limiting example of the see-through display device 104 comprising a see-through display 702. For example, the see-through display device 104 can be a head-mounted see-through display device. In certain embodiments, a see-through display device, such as the see-through display device 104, can be integrated as shown in Fig. 7. In alternative embodiments, the see-through display device can be a modular computing system. Such a computing system can comprise a see-through display and one or more other components communicatively coupled to the see-through display.
The see-through display 702 is at least partially transparent, thereby allowing light to reach the user's eyes through the see-through display. Further, the see-through display is configured to visually augment the appearance of a physical space to a user viewing the physical space through the see-through display. For example, the see-through display can display virtual objects that are visible to the user when the user looks through the see-through display. Thus, the user can view virtual objects that do not exist in the physical space while viewing the physical space. This creates the illusion that the virtual objects are part of the physical space.
The see-through display device 104 also comprises a virtual reality engine 704. The virtual reality engine 704 can be configured to cause the see-through display to visually present one or more virtual objects as augmentations of real-world objects. The virtual objects can simulate the appearance of real-world objects. To a user viewing the physical space through the see-through display, the virtual objects appear integrated with the physical space and/or with real-world objects. For example, virtual objects and/or other images displayed via the see-through display can be positioned relative to the user's eyes such that the displayed virtual objects and/or images appear to the user to occupy particular locations in the physical space. In this way, the user can view objects in the physical space that do not actually exist. The virtual reality engine can comprise software, hardware, firmware, or any combination thereof.
The see-through display device 104 can comprise a speaker subsystem 706 and a sensor subsystem 708. In different embodiments, the sensor subsystem can comprise a variety of sensors. As non-limiting examples, the sensor subsystem can comprise a microphone 710, one or more front-facing (away from the user) infrared and/or visible light cameras 712, and/or one or more rear-facing (toward the user) infrared and/or visible light cameras 714. The front-facing cameras can comprise one or more depth cameras, and/or the rear-facing cameras can comprise one or more eye-tracking cameras. In certain embodiments, an onboard sensor subsystem can communicate with one or more off-board sensors that send observation information to the onboard sensor subsystem. For example, a depth camera used by a game console can send depth maps and/or modeled virtual skeletons to the sensor subsystem of the head-mounted display.
The see-through display device 104 can also comprise one or more features that allow the see-through display device to be worn on a user's head. In the illustrated example, the see-through display device 104 takes the form of eyeglasses and comprises a nose rest 716 and ear rests 718a and 718b. In other embodiments, a head-mounted display can comprise a hat or helmet with a see-through visor in front. Further, although described in the context of a head-mounted see-through display, the concepts described herein can be applied to see-through displays that are not worn (for example, a windshield) and to displays that are not see-through (for example, an opaque display that renders real objects observed by a camera together with virtual objects not within the camera's field of view).
The see-through display device 104 can also comprise a communication subsystem 720. The communication subsystem 720 can be configured to communicate with one or more off-board computing devices. As an example, the communication subsystem can be configured to wirelessly receive video streams, audio streams, coordinate information, virtual object descriptions, and/or other information to render augmentation information as an augmented reality experience.
In certain embodiments, the methods and processes described above can be tied to a computing system of one or more computing devices. In particular, such methods and processes can be implemented as a computer application or service, an application programming interface (API), a library, and/or other computer program product.
Fig. 8 schematically shows a non-limiting example of a computing system 800 that can perform one or more of the methods and processes described above. The computing system 800 is shown in simplified form. It should be appreciated that virtually any computer architecture can be used without departing from the scope of this disclosure. In different embodiments, the computing system 800 can take the form of a head-mounted see-through display device (e.g., the see-through display device 104), a gaming device (e.g., the gaming system 502), a mobile computing device, a mobile communication device (e.g., a smart phone), a desktop computer, a laptop computer, a tablet computer, a home entertainment computer, a network computing device, a mainframe computer, a server computer, etc.
The computing system 800 comprises a logic subsystem 802 and a storage subsystem 804. The computing system 800 optionally comprises a display subsystem 806 (e.g., a see-through display), an input subsystem 808, a communication subsystem 810, and/or other components not shown in Fig. 8.
The logic subsystem 802 comprises one or more physical devices configured to execute instructions. For example, the logic subsystem can be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions can be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
The logic subsystem can comprise one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem can comprise one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of the logic subsystem can be single-core or multi-core, and the programs executed thereon can be configured for serial, parallel, or distributed processing. The logic subsystem can optionally comprise individual components distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem can be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
Storage subsystem 804 includes one or more physical, non-transitory devices configured to hold data and/or instructions executable by the logic subsystem to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage subsystem 804 may be transformed, for example, to hold different data.
Storage subsystem 804 may include removable media and/or built-in devices. Storage subsystem 804 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-ray disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage subsystem 804 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
In some embodiments, aspects of logic subsystem 802 and storage subsystem 804 may be integrated together into one or more hardware-logic components through which the functionality described herein is enacted. Such hardware-logic components may include, for example, field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC) systems, and complex programmable logic devices (CPLDs).
The terms "program" and "engine" may be used to describe an aspect of computing system 800 that is implemented to perform a particular function. In some cases, a program or engine may be instantiated via logic subsystem 802 executing instructions held by storage subsystem 804. It will be understood that different programs and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms "program" and "engine" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
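The idea that distinct engines may be instantiated from the same code block can be sketched as follows. The `Engine` class and its tasks are hypothetical names introduced only for illustration, not an API defined by this disclosure:

```python
class Engine:
    """Hypothetical engine: a unit of functionality instantiated
    by the logic subsystem from instructions held in storage."""
    def __init__(self, name: str, task):
        self.name = name
        self.task = task  # the instructions this engine carries out

    def run(self, value: str) -> str:
        return self.task(value)

# Two distinct engines instantiated from the same code block (the Engine class),
# each configured to perform a different function.
render_engine = Engine("render", lambda v: f"rendered:{v}")
audio_engine = Engine("audio", lambda v: f"mixed:{v}")

assert render_engine.run("scene") == "rendered:scene"
assert audio_engine.run("track") == "mixed:track"
```

Conversely, the same engine object could be handed to different applications or services, mirroring the many-to-many relationship described above.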
It will be appreciated that a "service," as used herein, may be an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server computing devices.
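A service shared across user sessions can be sketched as a single object that several sessions post to, in keeping with the multiplayer-gaming context of this disclosure. The `ScoreService` class and the session data are hypothetical, introduced only to illustrate the concept:

```python
class ScoreService:
    """Hypothetical service available across multiple user sessions,
    e.g., running on one or more server computing devices."""
    def __init__(self):
        self._scores = {}

    def post(self, user: str, score: int) -> None:
        # Keep each user's best score across all of their sessions.
        self._scores[user] = max(score, self._scores.get(user, 0))

    def leaderboard(self):
        return sorted(self._scores.items(), key=lambda kv: -kv[1])

service = ScoreService()
# Three user sessions (two belonging to the same user) post to one service.
for session_user, score in [("ann", 120), ("bo", 95), ("ann", 80)]:
    service.post(session_user, score)

assert service.leaderboard() == [("ann", 120), ("bo", 95)]
```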
When included, display subsystem 806 may be used to present a visual representation of data held by storage subsystem 804. This visual representation may take the form of imagery that appears to augment a physical space, thus creating the illusion of augmented reality. As the herein-described methods and processes change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of display subsystem 806 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 802 and/or storage subsystem 804 in a shared enclosure (e.g., a head-mounted display), or such display devices may be peripheral display devices.
When included, input subsystem 808 may comprise or interface with one or more user-input devices such as a game controller, a gesture-input detection device, a voice recognizer, an inertial measurement unit, a keyboard, a mouse, or a touch screen. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; and electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 810 may be configured to communicatively couple computing system 800 with one or more other computing devices. Communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 800 to send messages to and/or receive messages from other devices via a network such as the Internet.
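The message exchange described above can be sketched with a loopback socket pair standing in for a real wired or wireless network; the message content is hypothetical and the sketch does not represent any particular protocol of this disclosure:

```python
import socket

# A pair of connected sockets simulates computing system 800 (client)
# coupled to another computing device (server) over a network.
server, client = socket.socketpair()
try:
    # Communication subsystem sends a message to the other device...
    client.sendall(b"hello from computing system 800")
    # ...and the other device receives it.
    message = server.recv(1024)
finally:
    server.close()
    client.close()

assert message == b"hello from computing system 800"
```

Over a real local- or wide-area network, the socket pair would be replaced by a connection to a remote address (e.g., a game server reachable via the Internet).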
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems, and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.