CN102939139A - Calibration of portable devices in shared virtual space - Google Patents

Calibration of portable devices in shared virtual space

Info

Publication number
CN102939139A
Authority
CN
China
Prior art keywords
device
space
portable device
reference point
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201180028950XA
Other languages
Chinese (zh)
Other versions
CN102939139B (en)
Inventor
G. Weising
T. Miller
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 12/973,827 (US8537113B2)
Application filed by Sony Computer Entertainment America LLC
Publication of CN102939139A
Application granted
Publication of CN102939139B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1018Calibration; Key and button assignment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/205Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/407Data transfer via internet
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/408Peer to peer connection
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/69Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Abstract

Methods, systems, and computer programs for generating an interactive space viewable through at least a first and a second device are presented. The method includes an operation for detecting, from the first device, a location of the second device, or vice versa. Further, synchronization information is exchanged between the first and the second device to identify a reference point in a three-dimensional (3D) space relative to the physical location of the devices in the 3D space. The devices establish the physical location in the 3D space of the other device when setting the reference point. The method further includes an operation for generating views of an interactive scene in the displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects. The view in each display shows the interactive scene as observed from the current location of the corresponding device. Moving the device in the 3D space causes the view to change according to the perspective from the current location.

Description

Calibration of portable devices in a shared virtual space
Technical field
The present invention relates to methods, devices, and computer programs for controlling a view of a virtual scene with a portable device, and more specifically to methods, devices, and computer programs for enabling multiplayer interaction in a virtual or augmented reality.
Background
Virtual reality (VR) is a computer-simulated environment with which a user can interact through standard input devices or specialized multidirectional input equipment, whether that environment is a simulation of the real world or of an imaginary world. The simulated environment can be similar to the real world, for example in simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. Virtual reality is often used to describe a wide variety of applications commonly associated with immersive, highly visual, three-dimensional (3D) environments. The development of computer-aided design (CAD) software, graphics hardware acceleration, head-mounted displays, data gloves, and miniaturization has helped popularize the concept.
Augmented reality (AR) provides a live view of a physical real-world environment whose elements are merged with (or augmented by) virtual computer-generated imagery to create a mixed reality. The augmentation is conventionally in real time and in semantic context with environmental elements, such as sports scores on television during a match. With the help of advanced AR technology (e.g., adding computer vision and object recognition), the information about the real world surrounding the user becomes interactive and digitally usable.
The term augmented virtuality (AV) is also used in the virtual reality world and is similar to AR. Augmented virtuality also refers to the merging of real-world objects into virtual worlds. As an intermediate state in the virtuality continuum, AV refers to predominantly virtual spaces in which physical elements, such as physical objects or people, are dynamically integrated into the virtual world and can interact with it in real time. In this application, the term VR is used as a generic term that also encompasses AR and AV, unless otherwise specified.
VR games typically require a large amount of computer resources. Implementations of VR games in handheld devices are rare, and the existing games tend to be simplistic, with rudimentary VR effects. Additionally, multiplayer AR games allow players to interact in a virtual world, but the interactions are limited to objects manipulated by the players in the virtual world (e.g., cars, rackets, balls, etc.). The virtual world is computer generated and independent of the location of the players and their portable devices. The relative positions of the players with respect to each other and with respect to their surroundings are not taken into account when creating a "realistic" virtual reality experience.
It is in this context that embodiments of the invention arise.
Summary of the invention
Embodiments of the present invention provide methods, systems, and computer programs for generating an interactive space viewable through at least a first and a second device. It should be appreciated that the present invention can be implemented in numerous ways, such as a process, an apparatus, a system, a device, or a computer-readable medium. Several inventive embodiments of the present invention are described below.
In one embodiment, a method includes an operation for detecting, from the first device, a location of the second device, or vice versa. Further, synchronization information is exchanged between the first and the second device to identify a reference point in a three-dimensional (3D) space relative to the physical location of the devices in the 3D space. The devices establish the physical location in the 3D space of the other device when setting the reference point. The method further includes an operation for generating views of an interactive scene in the displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects. Each view shows all or part of the interactive scene as observed from the current location of the corresponding device. Moving the device in the 3D space causes the view to change according to the current location.
In another embodiment, a method is provided for generating an interactive space viewable through at least a first and a second device. In one method operation, a tap between the first device and the second device is detected. After the tap is detected, synchronization information is exchanged between the first and second devices to identify a reference point in the 3D space relative to the physical locations of the first and second devices at the time the tap was detected. Additionally, the method generates views of an interactive scene in the respective displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects, and each view shows the interactive scene as observed from the current location of the corresponding device. Moving a device in the 3D space causes the corresponding view to change according to the current location.
In yet another embodiment, a portable device for sharing a virtual reality among portable devices is presented. The portable device includes a position module, a communications module, a view generator, and a display. The position module tracks the position of the portable device and is configured to detect the location of a second device relative to the position of the portable device. The communications module is used to exchange synchronization information between the portable device and the second device. Based on the synchronization information, a reference point in the 3D space is identified relative to the physical location of the portable device. Further, the view generator creates a view of an interactive scene that is tied to the reference point and includes virtual objects. The view shows all or part of the interactive scene as observed from the current location of the portable device. When the device is moved in the 3D space by the player, the view changes according to the current location of the device. The display is provided to show the view.
Other aspects of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate by way of example the principles of the invention.
Brief description of the drawings
The invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings, in which:
Fig. 1 depicts a user synchronizing a portable device to a reference point in space, according to one embodiment;
Fig. 2 illustrates a virtual reality scene observed with the portable device;
Fig. 3 illustrates how, according to one embodiment, moving the portable device has an effect on the display similar to moving a camera in the virtual space;
Fig. 4 shows a two-dimensional representation of the change in the image shown on the display when the portable device is rotated, according to one embodiment;
Fig. 5 illustrates how to play an interactive game over a network connection, according to one embodiment;
Fig. 6 illustrates the process of tapping two portable devices to synchronize their positions, according to one embodiment;
Fig. 7 shows the two portable devices after the tap, according to one embodiment;
Fig. 8 illustrates the creation of a virtual scene around a reference point, according to one embodiment of the invention;
Fig. 9 depicts two players in the same space, where a virtual reality has been created around a reference point on a table, according to one embodiment;
Figs. 10A-10B illustrate the process of using image recognition to detect the location of another portable device, according to one embodiment;
Fig. 11 illustrates how a second device is detected by finding a light source in the second device, according to one embodiment;
Fig. 12 illustrates detecting a second device by finding the display of the second device, according to one embodiment;
Fig. 13 illustrates an embodiment for tracking the portable device via dead reckoning;
Fig. 14 illustrates how dead reckoning is adjusted using static features in the background, according to one embodiment;
Fig. 15 illustrates one embodiment of a calibration procedure for a multiplayer environment;
Fig. 16 depicts a multiplayer virtual reality game, according to one embodiment;
Fig. 17 shows the flow of an algorithm for generating an interactive space viewable through at least a first and a second device, according to one embodiment of the invention;
Fig. 18 illustrates the architecture of a device that may be used to implement embodiments of the invention;
Fig. 19 is an exemplary illustration of scenes A through E, in which respective users A through E interact with game clients 1102 connected to server processing via the Internet, according to one embodiment of the present invention; and
Fig. 20 illustrates an embodiment of an Information Service Provider architecture.
Detailed description of the embodiments
The following embodiments describe methods, apparatus, and computer programs for generating an interactive space viewable through at least a first and a second device. It will be obvious, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
Fig. 1 depicts a user synchronizing a portable device to a reference point in space, according to one embodiment. Portable device 104 is sitting on a table in preparation for synchronizing the device to a reference point. User 102 has placed the portable device on the point that will serve as the reference point, or anchor, in order to build a virtual reality around that point. In the case shown in Fig. 1, the portable device is sitting in the approximate center of the table, and once the device is synchronized, a virtual world is built around the center of the table. The portable device can be synchronized in a variety of ways, such as pushing a button on portable device 104, touching the touch-sensitive screen of the portable device, letting the device rest still for a period of time (e.g., five seconds), entering a voice command, etc.
Once the portable device receives the input to be synchronized, the position tracking modules in the portable device are reset. The portable device can include a variety of position tracking modules, as described below with reference to Fig. 18, such as an accelerometer, a magnetometer, a Global Positioning System (GPS) device, a camera, a depth camera, a compass, a gyroscope, etc.
The portable device can be of many types, such as a handheld portable gaming device, a cell phone, a tablet, a notebook, a netbook, a Personal Digital Assistant (PDA), etc. Embodiments of the invention are described with reference to a portable gaming device, but the principles can be applied to any portable electronic device with a display. The principles of the invention can also be applied to game controllers or other input devices connected to a computing device with a display.
Fig. 2 illustrates a virtual reality scene observed with the portable device. After synchronizing device 104 with respect to reference point 106, the portable device will start displaying a view of the virtual reality 108. The view in the display is created by simulating a camera at the back of the portable device moving within the 3D space around reference point 106. Fig. 2 depicts a virtual reality that includes a chess board. Portable device 104 is capable of detecting motion and determining its relative position with respect to reference point 106 as the device moves around. Location and position determination can be done with different methods and different levels of accuracy. For example, location can be detected by analyzing images captured with a camera, by using data obtained from inertial systems, GPS, ultrasonic triangulation, WiFi communications, dead reckoning (DR), etc., or by a combination of the above.
When the user enters the command to set the reference point, all motion sensing devices are zeroed out or calibrated to that position in space. For example, the user can place the device on the table and press a button to calibrate all motion sensing data (accelerometers, gyroscope, GPS, etc.). From then on, all captured positional information for the device is recorded and processed relative to the initial calibrated position via inertial navigation tracking. All subsequently captured positional information is considered relative to the device's calibrated position (the reference point).
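By way of illustration only, a minimal sketch of this zero-and-track step is given below; the sensor interface and the simple integration scheme are assumptions made for the example and are not part of the disclosure.

    import numpy as np

    class InertialTracker:
        """Tracks device position relative to the calibrated reference point."""

        def __init__(self):
            self.position = np.zeros(3)   # meters, relative to the reference point
            self.velocity = np.zeros(3)   # meters/second

        def calibrate(self):
            # Called when the user sets the reference point: zero out all state.
            self.position[:] = 0.0
            self.velocity[:] = 0.0

        def update(self, accel_world, dt):
            # accel_world: gravity-compensated acceleration in the world frame (m/s^2)
            # Simple dead-reckoning integration; a real device fuses more sensors.
            self.velocity += np.asarray(accel_world, dtype=float) * dt
            self.position += self.velocity * dt
            return self.position

    tracker = InertialTracker()
    tracker.calibrate()                      # device resting on the reference point
    print(tracker.update([0.2, 0.0, 0.0], dt=0.02))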
In one embodiment, the device keeps track of its location in space with respect to reference point 106, as well as of its orientation. The position and orientation are used to determine the viewing angle of the camera, that is, the portable device acts as a camera into the virtual scene. If the portable device is aimed towards the right, the view will turn to the right, and so on. In other words, the viewing angle is defined as a vector with its origin at the center of the display (or another part of the device) and with a direction perpendicular to, and away from, the display. In another embodiment, only the position in space is tracked, and the view in the display is calculated as if the camera were aimed from the location of the portable device towards the reference point.
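For the second embodiment above (camera aimed from the device location towards the reference point), one conventional way to build the corresponding view transform is a look-at construction. The following sketch is illustrative only; the numeric device position is a placeholder and the disclosure does not prescribe this particular formulation.

    import numpy as np

    def look_at(eye, target, up=(0.0, 1.0, 0.0)):
        """Build a right-handed view matrix for a virtual camera at `eye` looking at `target`."""
        eye = np.asarray(eye, dtype=float)
        forward = np.asarray(target, dtype=float) - eye
        forward /= np.linalg.norm(forward)
        right = np.cross(forward, np.asarray(up, dtype=float))
        right /= np.linalg.norm(right)
        true_up = np.cross(right, forward)
        view = np.eye(4)
        view[0, :3] = right
        view[1, :3] = true_up
        view[2, :3] = -forward
        view[:3, 3] = -view[:3, :3] @ eye
        return view

    # Camera at the device location, aimed at the reference point (coordinate origin).
    device_position = (0.5, 0.3, 1.2)        # placeholder position, in meters
    view_matrix = look_at(device_position, target=(0.0, 0.0, 0.0))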
In some existing implementations, an AR tag is placed on a table and used as a fiduciary marker for generating an augmented reality. The AR tag may be an object or figure that is recognized when it appears in the captured image stream of the real environment, and it serves as a fiduciary marker that enables determination of a location within the real environment. Embodiments of the invention eliminate the need for AR tags, because of the calibration within the 3D space and the tracking of the location of the portable device. Additionally, the location information allows games in the portable device to deliver a realistic 3D virtual experience. Further, an array of networked portable devices can be used to create a shared virtual world, as described below with reference to Figs. 15 and 16.
Fig. 3 illustrates how, according to one embodiment, moving the portable device has an effect on the display similar to moving a camera in the virtual space. Fig. 3 shows car 302 inside a virtual sphere. Assuming that the portable device is aimed at car 302 from a point on the sphere, multiple views of the car can be obtained as the portable device moves within the sphere. For example, a view from the "north pole" would show the roof of the car, and a view from the "south pole" would show the bottom of the car. Also shown in Fig. 3 are views of the side, the front, and the rear of the car.
In one embodiment, the player can enter a command to change or flip the view of the virtual world. For example, in the case of the car, the player goes from seeing the front of the car to seeing the back of the car, as if the scene had rotated 180° about an axis running vertically through the reference point. This way, the player does not have to move around the space to get different viewing angles. Other inputs may produce different effects, such as a 90° turn, a scaling of the view (to make the virtual world seem smaller or larger), a rotation about the x, y, or z axis, etc. In another embodiment, flipping the portable device, i.e., a 180° spin in the player's hands, will cause the view of the virtual world to turn upside down.
Fig. 4 shows a two-dimensional representation of the change in the image shown on the display when the portable device is rotated, according to one embodiment. Portable device 402 is aimed towards a wall with viewing angle α, resulting in projection 410 on the wall. The view on portable device 402 therefore corresponds to projection 410. When device 402 is turned through angle β, the portable device ends up in position 404. The view also turns through angle β while maintaining camera viewing angle α. As a result, the view on the portable device corresponds to projection 412. It should be noted that the view on the screen is independent of the eye position, such as positions 408 and 406, and of where the player is. The image on the display depends on the position of the portable device, which acts as a virtual camera.
Fig. 5 illustrates how to play an interactive game over a network connection, according to one embodiment. Many types of games are possible within a shared space. For example, the portable device can be used as a paddle to play a game of ping-pong. The device is moved around as if it were a paddle that hits a virtual ball. Players see the ball float between their screen and the opponent's screen. In a war game embodiment, the player looks through the portable device and aims the catapult at the enemy's ramparts. The player pulls the device backwards to load the catapult, and then presses a button to fire the catapult towards the enemy's castle.
As shown in Fig. 5, a shared space can also be created when the players are in different locations. The players establish a network connection to play the game. Each player synchronizes his device to a reference point in the player's own space, and a virtual reality, such as a ping-pong table, is created. The opponent is shown behind his end of the table, where the motion of the opponent's device is matched to the motions of the opponent's paddle. For an even more realistic game experience, the game may also add an avatar to hold the paddle. During play, each device keeps track of its own motion and position. This information is shared with the other device so that the other device can place a virtual paddle that matches the motion of the device. Other game information, such as the location and movement of the ball, is also shared.
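A rough sketch of the kind of per-frame state a device might share with its peer is shown below; the field names and the JSON encoding are illustrative assumptions, not part of the disclosure.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class DeviceUpdate:
        """Per-frame state one device shares with its peer (illustrative fields only)."""
        player_id: str
        position: tuple        # device position relative to its own reference point (m)
        orientation: tuple     # device orientation as a quaternion (w, x, y, z)
        timestamp: float       # seconds, for interpolation / lag compensation

    @dataclass
    class BallUpdate:
        position: tuple
        velocity: tuple
        timestamp: float

    def encode_update(update):
        # Serialize for transmission over the network link (WiFi, Bluetooth, etc.).
        return json.dumps(asdict(update)).encode("utf-8")

    paddle_msg = DeviceUpdate("player_1", (0.1, 0.2, 0.4), (1.0, 0.0, 0.0, 0.0), time.time())
    packet = encode_update(paddle_msg)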
Fig. 6 illustrates the process of tapping two portable devices to synchronize their positions, according to one embodiment. One method for setting a common virtual or augmented space includes tapping the two portable devices together. A tap means lightly striking one device against the other. In the case shown in Fig. 6, two players hold portable devices 602 and 604, respectively. To calibrate both devices to the same reference point, the players tap the devices by bringing them together. In Fig. 6, the two devices are placed back to back, but any position is possible for the calibration, such as front to back. The key to detecting the tap is that one or both devices notice a sudden change in motion, such as a sudden deceleration of the device.
For example, the two devices may be moving towards each other, and when they tap, both devices stop. The inertial modules in the devices, such as gyroscope 612 and accelerometer 614, notice the change of momentum, and the tap can then be established. In another scenario, one portable device is stationary and the other portable device moves towards it. When the devices tap, the moving device will notice the sudden change of momentum, while the stationary device may or may not detect the tap, because a small change of momentum could be attributed to the natural motion of the player's hand. To detect a tap, it is enough that one device detects it; it is not required that both devices detect the tap simultaneously. In one embodiment, if both devices detect a tap, the tap is determined to have occurred if the detections took place at substantially the same time. To determine whether the detections were simultaneous, the devices exchange timing information about the event.
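By way of example only, a sudden-change tap detector over an accelerometer stream could be sketched as follows; the threshold and window values are arbitrary placeholders, not values given in the disclosure.

    import numpy as np

    TAP_THRESHOLD = 15.0   # m/s^2 spike above recent baseline; illustrative value only
    WINDOW = 5             # samples of history to compare against

    def detect_tap(accel_samples):
        """Return the index of a sudden change in acceleration magnitude, or None.

        accel_samples: iterable of (ax, ay, az) readings at a fixed sample rate.
        """
        mags = [np.linalg.norm(a) for a in accel_samples]
        for i in range(WINDOW, len(mags)):
            baseline = np.mean(mags[i - WINDOW:i])
            if abs(mags[i] - baseline) > TAP_THRESHOLD:
                return i   # candidate tap; exchange timestamps with the peer to confirm
        return None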
In another embodiment, once portable devices 602 and 604 are synchronized to the same virtual space, their movements are tracked with accelerometer 614, enabling the creation of a stable and persistent augmented reality environment no matter how the users move the portable devices. In yet another embodiment, the inertial movement information can be supplemented with graphical data captured with camera 606 or 610. The graphical data can be used to detect the other device and to estimate the distance between the devices, as discussed below with reference to Figs. 10A-12.
Fig. 7 shows the two portable devices after the tap, according to one embodiment. The tap can be performed by making the portable devices touch, but a tap can also be accomplished without the two devices actually touching each other. All that is required is that a change in their motion occurs at approximately the same time. For example, if a player's finger is behind the first device, a tap is detected when the second device makes contact with the player's finger, causing a change in the motion of the devices.
Once a tap has been detected by either device, or by both, the portable devices exchange data to confirm that the tap has occurred. For example, the devices may communicate via WiFi or ultrasonic communications. A reference point is created, as discussed earlier with reference to Figs. 1-2. The reference point may be located somewhere on the backs of the devices, such as the center of each device at the time of the tap. In another embodiment, a TransferJet interface can be used to exchange the data. TransferJet is an interface that enables communication when close proximity between two devices is detected.
It should be noted that the actual reference point of each device may not be exactly the same point in space for both devices. In other words, each device may have a different reference point, although in most cases the reference points will be close to each other. What matters is that both devices set a reference point and then start tracking movement in the surrounding virtual or augmented space. The result is a common virtual space. The reference point can be set at the center of the back of the portable device, at the center of the display, at the camera location, at the accelerometer location, etc.
Once the reference point is set, the motion tracking modules are reset to zero to establish the origin for measuring positions in space. This operation is referred to herein as calibrating the portable device, and the three-dimensional position of the portable device is thereafter computed with respect to this origin.
Fig. 8 illustrates the creation of a virtual scene around a reference point, according to one embodiment of the invention. When the two portable devices of Fig. 7 are moved apart, a virtual or augmented reality play area is created. Virtual objects, such as soldiers 804 and 806, are assigned positions in the 3D space where the portable devices are located. The coordinate origin in the 3D space is reference point 808. Because the virtual objects are located in space, there is no need for SLAM or ARTAG markers to maintain the augmented reality.
The virtual camera associated with the view shown in the display is controlled by physically moving the portable device around the game world, which has been placed in a fixed position with reference to the real world. It should be noted that the virtual world is not limited to the space between the portable devices; it can extend to cover the areas above, below, and behind any of the portable devices.
Fig. 9 depicts two players 906a-906b in the same space, where a virtual reality has been created around reference point 902 on table 904, according to one embodiment. Players 906a-906b have synchronized their devices 908a and 908b to common reference point 902 located on top of table 904. Since point P0 902 is the reference point, P0 is also the coordinate origin, with coordinates (X0=0, Y0=0, Z0=0). The players are inside a room, but the virtual reality, also referred to herein as the virtual scene, extends beyond the physical boundaries of the room.
In one example embodiment, the virtual scene is tied to the reference point because the geometry of the virtual scene (as seen through a screen of the device) is based, at least in part, on the reference point. For example, the coordinates of the virtual objects in the virtual scene may be determined with respect to the reference point. In one embodiment, the reference point is the coordinate origin, so the reference point has coordinates (0, 0, 0).
The coordinates can be measured using any standard of measure. However, to provide a visual example, and without limiting the actual coordinates used, if the coordinates of the virtual scene are measured in meters, an object with coordinates (1, 0, 0) would be situated one meter to the right of the reference point. Of course, the coordinates of objects, real or virtual, may be dynamically updated as the scene changes, such as when a virtual object moves within the scene. The changes can be defined by actions set by the computer (e.g., the interactive program), driven by actions of the user, or by a combination of both. Additionally, for the sake of clarity, the interactive program can be any type of program, such as a video game, a business program, an internet interface, or simply a graphical user interface that provides access to data, to other users, to programs, or to objects that may or may not be displayed or projected by a speaker.
Further, other embodiments may have different coordinate systems or use scaling. For example, instead of Cartesian, the coordinate system can be polar, spherical, parabolic, etc. Additionally, the reference point does not have to be the origin of the coordinate system and can be positioned elsewhere. For the sake of providing an example, the reference point can be located at coordinates (5, 5, 5) to allow a buffer of 5 meters in each direction before having to use negative coordinate values for points beyond the 5 meters. In another scenario, the virtual objects are built to scale and the coordinates are also measured in a scale. For example, the virtual objects may be built on a scale of 1:10, and the geometric axes can also have a scale of 1:10, such that an object with coordinates (1, 0, 0) is 1 meter away in the "real" world and 10 meters away in the virtual world.
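The following sketch merely combines the two examples of this paragraph, the (5, 5, 5) reference offset and the 1:10 scale, into one illustrative mapping; it is not a required implementation.

    import numpy as np

    REFERENCE_OFFSET = np.array([5.0, 5.0, 5.0])  # reference point placed at (5, 5, 5)
    WORLD_TO_VIRTUAL_SCALE = 10.0                 # 1:10 scale example

    def world_to_virtual(point_relative_to_reference):
        """Map a point measured relative to the reference point (meters) into
        virtual-scene coordinates, applying the example offset and scale."""
        p = np.asarray(point_relative_to_reference, dtype=float)
        return REFERENCE_OFFSET + WORLD_TO_VIRTUAL_SCALE * p

    # A point one meter to the right of the reference point in the real world:
    print(world_to_virtual([1.0, 0.0, 0.0]))  # -> [15.  5.  5.]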
In Fig. 9, the virtual objects include helicopters 914a-914c, clouds, birds, sun 916, etc. As players 906a and 906b move their devices, the view of the virtual scene changes as if the players were holding a camera into the virtual world. It should be noted that the views shown in devices 908a and 908b may or may not include the reference point. For example, while the view in device 908a is computed based on the positions of device 908a and reference point 902, player 906a is holding device 908a aimed away from reference point 902, so reference point 902 is not visible in device 908a. Additionally, the actual reference point may be shown in a player's display as some kind of marker (such as an "X") to let the player know where the reference point is. In other embodiments, the reference point is not visible and exists only as a geographic location without any particular marking.
Besides table 904, the room also includes other static objects, such as television 912 and window 910. The real sun 918 is visible through window 910. It should be noted that virtual sun 916 does not have to correspond to real sun 918. In one embodiment, virtual sun 916 can be placed where real sun 918 is located, or where a light source in the room is located. This way, the lighting and shadows in the virtual or augmented world will create realistic effects that match the lighting and shadows in the room.
As seen in Fig. 9, the virtual objects do not have to be on the table or near the reference point just because the portable devices were synchronized to a point on the table. The virtual objects can be located anywhere in space. When the portable devices include a camera, the static features in the room can be used by the portable devices to maintain an accurate measurement of the current location by adjusting their inertial measurements with the views from their cameras. Image analysis in the portable device can detect the edges of a window, a light source, the edges of the table, a painting on the wall, a television, etc. More details are given below with reference to Figs. 13-14.
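As an illustration only, static features of this kind could be extracted from a camera frame with standard edge and line detection; the OpenCV-based sketch below uses arbitrary thresholds, and the disclosure does not name any particular library.

    import cv2

    def extract_static_edge_features(frame_bgr):
        """Detect strong edges (table borders, window frames, etc.) in a camera frame."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)
        edges = cv2.Canny(blurred, 50, 150)          # thresholds are illustrative
        # Straight-line segments are convenient anchors for drift correction.
        lines = cv2.HoughLinesP(edges, 1, 3.14159 / 180, threshold=80,
                                minLineLength=60, maxLineGap=10)
        return edges, lines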
In one embodiment, the player can perform a quick recalibration of the portable device by placing the device back on the reference point and then entering a command to reset the motion detection modules in the device.
Figs. 10A-10B illustrate the process of using image recognition to detect the location of another portable device, according to one embodiment. As seen in Fig. 10A, if the two portable devices are the same model and include camera 1006, the two devices can be synchronized using the images captured with the cameras. For example, portable device 1002 takes an image with camera 1006. Because portable device 1002 has received a command to synchronize, portable device 1002 scans the image for the back of the other portable device 1004. A schematic representation of the captured image is shown in Fig. 10B.
Portable device 1002 has detected rectangle 1010, which matches the characteristics of the portable device being searched for. Rectangle 1010 has a horizontal axis 1012 and a vertical axis 1014. Horizontal axis 1012 is tilted by angle α with respect to the horizontal 1018. Because portable device 1002 knows the dimensions of the other portable device 1004, portable device 1002 performs the appropriate mathematical calculations, comparing the size of the portable device in the image with its true measurements, to determine the distance, position, and orientation of portable device 1004.
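A simplified sketch of that distance estimate under a pinhole-camera model follows; the device width and focal length are placeholder values, not measurements from the disclosure.

    def estimate_distance(known_width_m, pixel_width, focal_length_px):
        """Estimate distance to an object of known physical width using the
        pinhole-camera relation: pixel_width = focal_length * width / distance."""
        return known_width_m * focal_length_px / pixel_width

    # Placeholder values for illustration:
    DEVICE_WIDTH_M = 0.18        # assumed width of the other portable device
    FOCAL_LENGTH_PX = 1400.0     # camera focal length in pixels (from calibration)
    detected_rect_width_px = 350.0

    distance_m = estimate_distance(DEVICE_WIDTH_M, detected_rect_width_px, FOCAL_LENGTH_PX)
    print(f"Estimated distance to the other device: {distance_m:.2f} m")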
Once portable device 1002 knows the relative position of portable device 1004, portable devices 1002 and 1004 exchange position information to set a common reference point and to build a common virtual or augmented reality around it.
Fig. 11 illustrates how a second device is detected by finding a light source in the second device, according to one embodiment. The synchronization process of Fig. 11 is similar to the process described above with reference to Fig. 10, except that portable device 1102 has a light source that is detected by the other portable device. While the portable devices are in the synchronization process, the light source is turned on to facilitate detection of the light in the images taken by the camera. The light source can be a light-emitting diode (LED), an infrared light, a camera flash, etc.
In another embodiment, a depth camera in the portable device is used to measure the distance to the other device. Once the distance is known, the reference point can be set based on the position of either device. Additional data, such as image data, can also be used to assist in the calculation of the relative position of the other device.
It should be noted that it is possible for both devices to detect each other at about the same time. In this case, the calculation of the relative positions of the portable devices can be done by one portable device or the other, or it can be a combination of the measurements performed by both portable devices.
Fig. 12 illustrates detecting a second device by finding the display of the second device, according to one embodiment. When calibrating portable devices 1206 and 1208 to a common space, one of the devices, such as portable device 1208, is "turned over" so that display 1202 faces the camera in portable device 1206. For example, portable device 1208 shows a message asking the user to "turn the device over so the display faces the other portable device."
Display 1202 can then be lit up, for example by displaying a white screen. The brightness of the display facilitates its detection by portable device 1206. The lit display presents a high contrast against most backgrounds, including, in many cases, the front of the player holding the device. Other patterns or colors can be added to the display to improve its detection. For example, Fig. 12 shows an inner circle and a square pattern for easier identification of the display using geometry recognition.
Other methods for calibrating two devices can be found in U.S. Application No. 12/647,291 (Attorney Docket SONYP095A), entitled "WIRELESS DEVICE PAIRING METHODS," filed December 24, 2009; U.S. Application No. 12/647,296 (Attorney Docket SONYP095B), entitled "WIRELESS DEVICE PAIRING AND GROUPING METHODS," filed December 24, 2009; and U.S. Application No. 12/647,299 (Attorney Docket SONYP095C), entitled "WIRELESS DEVICE MULTIMEDIA FEED SWITCHING," filed December 24, 2009, all of which are incorporated herein by reference.
In another embodiment, portable device 1208 has two displays, one on the front and one on the back. In this case, it is not necessary to turn portable device 1208 over so that the display faces the other player. Portable device 1206 can detect either the front display or the back display of portable device 1208 in order to perform the calibration.
Fig. 13 illustrates an embodiment for tracking the portable device via dead reckoning. If the portable device is equipped with a camera facing away from the user, the position of the device can be tracked via dead reckoning and inertial navigation. Additionally, the real-world view from the camera can be blended with computer-generated graphics.
Dead reckoning is the process of estimating a current position based upon a previously determined position, or fix, and advancing that position based upon known or estimated speeds over elapsed time and course. A disadvantage of dead reckoning is that, since new positions are calculated solely from previous positions, the errors of the process are cumulative, so the error in the position fix grows with time.
Fig. 13 shows two portable devices that have been calibrated to a point on table 1310. Over time, device 1302A moves around the room, as observed in trajectory 1306 measured with dead reckoning. The points in trajectory 1306 indicate the times when the position of portable device 1302A was estimated from the previous position and the movement since the last measurement. Actual trajectory 1308, on the other hand, shows the actual path of the portable device in space. As time passes, due to the nature of dead reckoning and the accumulation of errors, the measured trajectory tends to drift further and further away from the actual trajectory. However, dead-reckoning tracking can be adjusted to correct deviations from the actual trajectory, as described below with reference to Fig. 14.
Fig. 14 illustrates how dead reckoning is adjusted using static features in the background, according to one embodiment. The dead-reckoning drift is usually small, but the accumulation of errors over time can create a significant deviation from the actual path. In one embodiment, the DR error is corrected using video tracking of static features in the room, such as the edges of table 1310, the window, the television, etc., as previously described with reference to Fig. 9.
The error correction can be performed with every DR position measurement, at determined intervals, or after a certain number of measurements. To perform a correction, the DR computes the position of the device and the camera captures an image of what lies in front of the device. The portable device keeps track of one or more static features and compares the actual location of the static feature with the expected location. Since the static feature does not move, the difference is attributed to DR estimation error. The position is then recalculated so that the expected static feature and the measured static feature are in the same place.
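Purely as an illustrative sketch (the disclosure does not prescribe a particular correction formula), one simple way to nudge a dead-reckoned position using an observed static feature is:

    import numpy as np

    def correct_dead_reckoning(dr_position, feature_world_pos,
                               feature_observed_pos, gain=1.0):
        """Correct a dead-reckoned device position using a static feature.

        dr_position:          current DR estimate of the device position
        feature_world_pos:    known (calibrated) position of the static feature
        feature_observed_pos: feature position implied by the camera measurement,
                              expressed in the same frame as dr_position
        gain:                 how aggressively to apply the correction (1.0 = full)

        Since the feature cannot move, any offset between where it was expected
        and where it was observed is attributed to accumulated DR error.
        """
        error = np.asarray(feature_world_pos, dtype=float) - np.asarray(feature_observed_pos, dtype=float)
        return np.asarray(dr_position, dtype=float) + gain * error

    corrected = correct_dead_reckoning([1.20, 0.00, 2.05],
                                       feature_world_pos=[3.0, 1.0, 0.0],
                                       feature_observed_pos=[3.08, 1.00, -0.04])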
In another embodiment, the portable devices communicate with each other and coordinate the motion tracking of both devices by exchanging DR and feature information. This way, a true 3D model of the static objects in the room can be built.
Fig. 15 illustrates one embodiment of a calibration procedure for a multiplayer environment. The positional information obtained from the device sensors (accelerometers, GPS, compass, depth camera, etc.) is transmitted to the other linked devices to enhance the data used to collaboratively maintain the virtual space. In an embodiment where a common shared space synchronized to common reference point 1502 is created, the first player 1504A synchronizes her device to the 3D space with respect to reference point 1502. The other players then establish communications links with the first player to exchange position and game information in the shared space. The relative positions can be obtained in different ways, such as using WiFi triangulation and ping tests to determine relative positions. In addition, visual information can be used to determine other locations, such as detecting the faces of the other players and, from their faces, the likely locations of their game devices.
In one embodiment, audio triangulation is used to determine relative position, by means of ultrasonic communications and directional microphones. Multiple frequencies can be used to perform the audio triangulation. Once the devices have exchanged position information, wireless communication, such as ultrasonic, WiFi, or Bluetooth, is used to synchronize the rest of the devices to reference point 1502. After all the devices are calibrated, the devices have knowledge of reference point 1502 and of their relative positions with respect to reference point 1502. It should be appreciated that other methods can be used to calibrate multiple devices to a shared reference point. For example, all devices may be calibrated to the same reference point by placing each device on the reference point in turn.
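As a rough sketch of the range-based part of such a scheme (the timing values and anchor geometry below are simplified assumptions), ultrasonic time-of-flight ranges can be combined by least squares to place a device relative to peers whose positions are already known:

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature

    def range_from_time_of_flight(t_seconds):
        return SPEED_OF_SOUND * t_seconds

    def trilaterate(anchor_positions, ranges):
        """Least-squares position from distances to anchors with known positions.

        Linearizes the range equations against the first anchor; in general it needs
        at least four anchors for a full 3D fix."""
        anchors = np.asarray(anchor_positions, dtype=float)
        r = np.asarray(ranges, dtype=float)
        p0, r0 = anchors[0], r[0]
        A = 2.0 * (anchors[1:] - p0)
        b = r0**2 - r[1:]**2 + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2)
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position

    anchors = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 1.5)]
    times = (0.00437, 0.00437, 0.00437, 0.00505)        # measured chirp flight times
    ranges = [range_from_time_of_flight(t) for t in times]
    print(trilaterate(anchors, ranges))                 # ~ [1.0, 1.0, 0.5]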
The virtual scene can be made even more realistic by using shadows and lighting determined by the light sources in the room. By using camera feeds, the lighting and shadows of the game environment and characters are influenced by the real world. This means that a player's hand will cast a shadow over virtual characters or objects as the hand reaches into the virtual world to interact with the virtual objects. Game-world shadows and lighting are adjusted by real-world shadows and lighting to obtain the best effect possible.
Fig. 16 depicts a multiplayer virtual reality game, according to one embodiment. When calibrated position data and image analysis data are combined with high-speed connectivity, positional and game information can be exchanged between each of the devices that choose to participate in the shared-space game experience. This allows each player's system access to the camera views and positional information from all the other players, in order to synchronize their calibrated positions and share a virtual space, also referred to as a shared space, together.
After players 1602A-1602C have synchronized or calibrated their portable devices with reference to a point in the common 3D space (such as a point on a table), a common virtual scene 1604 is created. Each player has a view of virtual scene 1604 as if the virtual scene, a battle-board game in this case, were real and sitting on the table in front of the players. The portable devices act as cameras, so that when a player moves the device around, the view changes. As a result, the actual view on each display is independent of the views on the other displays, and the view is based solely on the relative position of the portable device with respect to the virtual scene, which is anchored to an actual physical location in the 3D space.
By utilizing multiple cameras, accelerometers, and other mechanical devices to determine position, together with high-speed communication between the portable devices, it is possible to create a 3D motion-capture-like experience that allows players to see, and possibly touch, virtual game characters and environments in believable ways.
Shared space 1604 games utilize the high-speed connectivity of the devices to exchange information among the devices participating in the shared-space game experience. The shared space 1604 play area is viewed through the device by turning the device into a stable "magic window" that persists on the space between the devices. By using a combination of motion tracking, image analysis, and a high degree of persistence of the information between the devices, the play area appears in a stable position even as the devices move around.
Fig. 17 shows the flow of an algorithm for generating an interactive space viewable through at least a first and a second device, according to one embodiment of the invention. The algorithm describes the process flow performed by each portable device, and the flows for the first and second devices are described independently. Regarding the first device, the method starts at operation 1702, where the position of the second device is detected, or a notification is received that the second device has detected the first device. The method proceeds to operation 1704, where synchronization information is exchanged with the second device to identify a reference point in the 3D space. The reference point is relative to the physical locations of the first and second devices in the space. Furthermore, both devices establish the physical location in the 3D space of the other device when setting the reference point.
From operation 1704, the method advances to operation 1706 to generate, in the display, a view of the interactive scene that includes the reference point and one or more virtual objects. The view shows the perspective of the interactive scene as observed from the current location of the device. After the view is shown in the display, a check is performed in operation 1708 to determine whether the first device has moved. If the first device has moved, the method advances to operation 1710 to change the perspective used for the display according to the new position of the first device. If the device has not moved, the method loops back to operation 1706 to continue updating the display, for example to reflect changes of the virtual objects during game play.
During operations 1702, 1704, and 1706, the first device can exchange game and position-tracking information with the second device. The second device performs a method similar to that of the first device in operations 1712, 1714, 1716, 1718, and 1720.
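Restated only as an illustrative per-device sketch of the flow just described, where the stub functions stand in for steps the disclosure leaves to the implementation:

    def detect_peer(device_id):
        # Stand-in for operation 1702: detect the other device or receive its notification.
        return "device_2" if device_id == "device_1" else "device_1"

    def exchange_sync_info(device_id, peer):
        # Stand-in for operation 1704: agree on a reference point in 3D space.
        return (0.0, 0.0, 0.0)

    def render_view(device_id, reference_point, position):
        # Stand-in for operation 1706: render the scene from the device's position.
        return f"{device_id} view from {position}, anchored at {reference_point}"

    def run_device_flow(device_id, positions):
        peer = detect_peer(device_id)                          # operation 1702
        reference_point = exchange_sync_info(device_id, peer)  # operation 1704
        last_position = None
        for position in positions:                             # operations 1706-1710
            if position != last_position:                      # operation 1708: device moved?
                last_position = position                       # operation 1710: new perspective
            print(render_view(device_id, reference_point, last_position))

    run_device_flow("device_1", [(0, 0, 1), (0, 0, 1), (0.2, 0, 1)])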
Figure 18 diagram can be used for realizing the structure of the equipment of inventive embodiment.Portable set is computing equipment and comprises the typical module appeared in computing equipment, such as, processor, memory (RAM, ROM etc.), battery or other power supplys and permanent memory (such as hard disk).The exchange messages such as communication module permission portable set and other portable sets, other computers, server.Communication module comprises USB (USB) connector, communication link (such as Ethernet), supersonic communication, bluetooth and WiFi.
Input module comprises load button and sensor, microphone, touch sensitive screen, camera (forward, backward, depth camera) and card reader.Other input-output apparatus such as keyboard or mouse also can be connected to portable set via the communication link such as USB or bluetooth.Output module comprises display (with touch sensitive screen), light emitting diode (LED), vibrations tactile feedback and loudspeaker.Other output equipments also can be connected to portable set via communication module.
A position module can use information from different devices to calculate the position of the portable device. These modules include a magnetometer, an accelerometer, a gyroscope, a GPS receiver, and a compass. In addition, the position module can analyze sound or image data captured with the cameras and microphone to calculate the position. Further, the position module can perform tests, such as a WiFi ping test or an ultrasound test, to determine the position of the portable device or the position of other nearby devices.
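As a rough, non-authoritative illustration of how a position module of this kind might combine its inputs, the Python sketch below dead-reckons a position from accelerometer samples and then nudges the estimate toward an occasional absolute fix such as one obtained from a WiFi ping test or ultrasound ranging. The function names and the blending weight are assumptions made for this example only.

def dead_reckon(position, velocity, acceleration, dt):
    """Integrate one accelerometer sample over a time step dt (seconds)."""
    velocity = [v + a * dt for v, a in zip(velocity, acceleration)]
    position = [p + v * dt for p, v in zip(position, velocity)]
    return position, velocity

def blend_with_fix(position, fix, weight=0.2):
    """Pull the dead-reckoned estimate toward an absolute fix
    (e.g. from a WiFi ping test or ultrasound ranging)."""
    return [p + weight * (f - p) for p, f in zip(position, fix)]

position, velocity = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
for accel in [[0.1, 0.0, 0.0]] * 10:            # ten accelerometer samples
    position, velocity = dead_reckon(position, velocity, accel, dt=0.1)
position = blend_with_fix(position, fix=[0.06, 0.0, 0.0])
print(position)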
As previously described, a virtual reality generator creates the virtual or augmented reality using the position calculated by the position module. A view generator creates the view shown on the display, based on the virtual reality and the position. The view generator can also produce the sounds originating from the virtual reality generator, using directional effects applied to a multi-speaker system.
It should be appreciated that the embodiment illustrated in Figure 21 is an exemplary implementation of a portable device. Other embodiments may use different modules, a subset of the modules, or assign related tasks to different modules. The embodiment illustrated in Figure 21 should therefore not be interpreted as exclusive or limiting, but rather as exemplary or illustrative.
Figure 19 is an exemplary illustration of scenario A through scenario E, in which respective users A through E interact with game clients 1102 that are connected to server processing via the Internet, according to one embodiment of the present invention. A game client is a device that allows users to connect to server applications and processing via the Internet. The game client allows users to access and play back online entertainment content such as, but not limited to, games, movies, music, and photos. Additionally, the game client can provide access to online communication applications such as VOIP, text chat protocols, and email.
A user interacts with the game client via a controller. In some embodiments the controller is a controller dedicated to the game client, while in other embodiments the controller can be a keyboard-and-mouse combination. In one embodiment, the game client is a standalone device capable of outputting audio and video signals through a monitor/TV and associated audio equipment to create a multimedia environment. For example, the game client can be, but is not limited to, a thin client, an internal PCI-express card, an external PCI-express device, an ExpressCard device, an internal, external, or wireless USB device, a Firewire device, and the like. In other embodiments, the game client is integrated with a TV or other multimedia device such as a DVR, Blu-ray player, DVD player, or multi-channel receiver.
In scenario A of Figure 22, user A interacts with a client application displayed on a monitor 106 using a controller 100 paired with game client 1102A. Similarly, in scenario B, user B interacts with another client application displayed on a monitor 106 using a controller 100 paired with game client 1102B. Scenario C shows a view from behind user C as user C watches a monitor displaying a game and a buddy list from game client 1102C. Although Figure 22 shows a single server processing module, in one embodiment there are multiple server processing modules throughout the world. Each server processing module includes sub-modules for user session control, sharing/communication logic, user geo-location, and load-balancing processing services. Furthermore, a server processing module includes network processing and distributed storage.
When a game client 1102 connects to a server processing module, user session control may be used to authenticate the user. An authenticated user can have associated virtualized distributed storage and virtualized network processing. Example items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos, and music. Additionally, the distributed storage can be used to save game states for multiple games, customized settings for individual games, and general settings for the game client. In one embodiment, the user geo-location module of the server processing is used to determine the geographic location of a user and their respective game client. The user's geographic location can be used by both the sharing/communication logic and the load-balancing processing service to optimize performance based on geographic location and the processing demands of multiple server processing modules. Virtualizing either or both of the network processing and the network storage allows processing tasks from game clients to be dynamically shifted to underutilized server processing module(s). Thus, load balancing can be used to minimize latency associated both with recall from storage and with data transmission between server processing modules and game clients.
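One simple way to picture the interplay of the user geo-location module and the load-balancing service described above (offered only as a sketch under assumed data, not as the patent's method) is to score each server processing module by its distance to the user and its reported load, and to route the session to the best-scoring module.

import math

def pick_server_module(user_location, modules):
    """Choose a server processing module for a game-client session.

    user_location: (lat, lon) from the user geo-location sub-module.
    modules: list of dicts with 'name', 'location' (lat, lon), and
             'load' in [0, 1] reported by each server processing module.
    """
    def score(module):
        dist = math.dist(user_location, module["location"])  # rough proxy for latency
        return dist * (1.0 + module["load"])                 # penalize busy modules
    return min(modules, key=score)

modules = [
    {"name": "us-west", "location": (37.4, -122.1), "load": 0.8},
    {"name": "us-east", "location": (40.7, -74.0), "load": 0.2},
]
print(pick_server_module((34.0, -118.2), modules)["name"])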
The server processing module has instances of server application A and server application B. The server processing module is able to support multiple server applications, as indicated by server application X1 and server application X2. In one embodiment, server processing is based on a cluster computing architecture that allows multiple processors within a cluster to process server applications. In another embodiment, a different type of multi-computer processing scheme is applied to process the server applications. This allows server processing to be scaled in order to accommodate a larger number of game clients executing multiple client applications and corresponding server applications. Alternatively, server processing can be scaled to accommodate the increased computing demands necessitated by more demanding graphics processing, or by game, video compression, or application complexity. In one embodiment, the server processing module performs the bulk of the processing via the server applications. This allows relatively expensive components, such as graphics processors, RAM, and general processors, to be centrally located, reducing the cost of the game client. Processed server application data is sent back to the corresponding game client via the Internet to be displayed on a monitor.
Scenario C illustrates an exemplary application that can be executed by the game client and the server processing module. For example, in one embodiment game client 1102C allows user C to create and view a buddy list 1120 that includes user A, user B, user D, and user E. As shown, in scenario C user C can see either real-time images or avatars of the respective users on monitor 106C. Server processing executes the respective applications of game client 1102C and of the respective game clients 1102 of users A, B, D, and E. Because the server processing is aware of the applications being executed by game client B, the buddy list for user A can indicate which game user B is playing. Further, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by merely sending the processed server application data for user B to game client A in addition to game client B.
In addition to being able to view video from buddies, a communication application can allow real-time communication between buddies. As applied to the previous example, this allows user A to provide encouragement or hints while watching the real-time video of user B. In one embodiment, two-way real-time voice communication is established through a client/server application. In another embodiment, a client/server application enables text chat. In still another embodiment, a client/server application converts speech to text for display on a buddy's screen.
Scenario D and scenario E illustrate respective users D and E interacting with game consoles 1110D and 1110E, respectively. Each game console 1110D and 1110E is connected to the server processing module, illustrating a network in which the server processing module coordinates game play for game consoles and game clients alike.
Figure 20 illustrates an embodiment of an Information Service Provider architecture. Information Service Provider (ISP) 250 delivers a multitude of information services to users 262, who are geographically dispersed and connected via a network 266. An ISP can deliver just one type of service, such as stock price updates, or a variety of services such as broadcast media, news, sports, gaming, and so on. Additionally, the services offered by each ISP are dynamic; that is, services can be added or taken away at any point in time. Thus, the ISP providing a particular type of service to a particular individual can change over time. For example, a user may be served by an ISP in near proximity to the user while the user is in her home town, and by a different ISP when the user travels to a different city. The home-town ISP will transfer the required information and data to the new ISP, such that the user information "follows" the user to the new city, making the data closer to the user and easier to access. In another embodiment, a master-server relationship may be established between a master ISP, which manages the information for the user, and a server ISP that interfaces directly with the user under the control of the master ISP. In other embodiments, the data is transferred from one ISP to another ISP as the client moves around the world, so that the ISP in a better position to serve the user is the one that delivers these services.
ISP 250 includes an Application Service Provider (ASP) 252, which provides computer-based services to customers over a network. Software offered using an ASP model is also sometimes called on-demand software or software as a service (SaaS). A simple form of providing access to a particular application program (such as customer relationship management) is by using a standard protocol such as HTTP. The application software resides on the vendor's system and is accessed by users through a web browser using HTML, by special-purpose client software provided by the vendor, or via other remote interfaces such as a thin client.
Services delivered over a wide geographical area often use cloud computing. Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users do not need to be experts in the technology infrastructure of the "cloud" that supports them. Cloud computing can be divided into different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers. The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.
Further, ISP 250 includes a Game Processing Server (GPS) 254, which is used by game clients to play single-player and multiplayer video games. Most video games played over the Internet operate via a connection to a game server. Typically, games use a dedicated server application that collects data from the players and distributes it to the other players. This is more efficient and effective than a peer-to-peer arrangement, but it requires a separate server to host the server application. In another embodiment, the GPS establishes communication between the players, whose respective game-playing devices then exchange information without relying on the centralized GPS.
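The dedicated-server arrangement described above, in which a server application collects each player's updates and distributes them to the other players, can be pictured with the short Python sketch below; the class and message shapes are illustrative assumptions rather than the patent's protocol.

class GameRelayServer:
    """Collects state updates from each player and distributes them to the rest."""

    def __init__(self):
        self.players = {}   # player_id -> latest state reported by that player

    def receive_update(self, player_id, state):
        self.players[player_id] = state
        # Fan the new state out to every connected player except the sender.
        return {pid: state for pid in self.players if pid != player_id}

server = GameRelayServer()
server.receive_update("A", {"pos": (0, 0)})
updates_for_others = server.receive_update("B", {"pos": (1, 2)})
print(updates_for_others)   # {'A': {'pos': (1, 2)}}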
Dedicated GPSs are servers that run independently of the client. Such servers usually run on dedicated hardware located in data centers, offering more bandwidth and dedicated processing power. Dedicated servers are the preferred method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games (MMOGs) typically run on dedicated servers hosted by the software company that owns the game title, allowing them to control and update content.
Broadcast Processing Server (BPS) 256 distributes audio or video signals to an audience. Broadcasting to a very narrow range of audience is sometimes called narrowcasting. The final leg of broadcast distribution is how the signal gets to the listener or viewer; it may come over the air, as with a radio or TV station, to an antenna and receiver, or it may come through cable TV or cable radio (or "wireless cable") via the station or directly from a network. The Internet may also bring either radio or TV to the recipient, especially with multicasting, which allows the signal and bandwidth to be shared. Historically, broadcasts have been delimited by a geographic region, such as national broadcasts or regional broadcasts. However, with the proliferation of the fast Internet, broadcasts are no longer defined by geography, as the content can reach almost any country in the world.
Storage Service Provider (SSP) 258 provides computer storage space and related management services. SSPs also offer periodic backup and archiving. By offering storage as a service, users can order more storage as needed. Another major advantage is that SSPs include backup services, so users will not lose all their data if the hard drives of their computers fail. Further, a plurality of SSPs can have full or partial copies of the user data, allowing users to access data in an efficient way independently of where the user is located or of the device being used to access the data. For example, a user can access personal files on a home computer, as well as on a mobile phone while the user is on the move.
Communications Provider 260 provides connectivity to the users. One kind of communications provider is an Internet Service Provider (ISP), which offers access to the Internet. The ISP connects its customers using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, DSL, cable modem, wireless, or dedicated high-speed interconnects. The communications provider can also provide messaging services, such as e-mail, instant messaging, and SMS texting. Another type of communications provider is a Network Service Provider (NSP), which sells bandwidth or network access by providing direct backbone access to the Internet. Network service providers may consist of telecommunications companies, data carriers, wireless communications providers, Internet service providers, cable television operators offering high-speed Internet access, and so on.
Data Exchange 268 interconnects the several modules inside ISP 250 and connects these modules to users 262 via network 266. Data Exchange 268 can cover a small area where all the modules of ISP 250 are in close proximity, or it can cover a large geographic area when the different modules are geographically dispersed. For example, Data Exchange 268 can include a fast Gigabit Ethernet (or faster) within a cabinet of a data center, or an intercontinental virtual local area network (VLAN).
Users 262 access the remote services with client devices 264, which include at least a CPU, a display, and I/O. The client device can be a PC, a mobile phone, a netbook, a PDA, and the like. In one embodiment, ISP 250 recognizes the type of device used by the client and adjusts the communication method employed. In other cases, client devices use a standard communication method, such as HTML, to access ISP 250.
Embodiments of the invention may be practiced with various computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a network.
With the above embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus can be specially constructed for the required purpose, such as a special-purpose computer. When defined as a special-purpose computer, the computer can also perform other processing, program execution, or routines that are not part of the special purpose, while still being capable of operating for the special purpose. Alternatively, the operations may be processed by a general-purpose computer selectively activated or configured by one or more computer programs stored in the computer memory or cache, or obtained over a network. When data is obtained over a network, the data may be processed by other computers on the network, e.g., by a cloud of computing resources.
Embodiments of the present invention can also be defined as a machine that transforms data from one state to another state. The transformed data can be saved to storage and then manipulated by a processor. The processor thus transforms the data from one thing to another. Still further, the methods can be processed by one or more machines or processors that can be connected over a network. Each machine can transform data from one state or thing to another, and can also process data, save data to storage, transmit data over a network, display the result, or communicate the result to another machine.
One or more embodiments of the present invention can also be fabricated as computer-readable code on a computer-readable medium. The computer-readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer-readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer-readable medium can include computer-readable tangible media distributed over a network-coupled computer system, so that the computer-readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, that operations may be adjusted so that they occur at slightly different times, or that operations may be distributed in a system which allows the processing operations to occur at various intervals associated with the processing, as long as the processing of the overlapping operations is performed in the desired way.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (20)

1. A method for generating an interactive space viewable through at least a first and a second device, the method comprising:
detecting, from the first device, a position of the second device, or detecting, from the second device, a position of the first device;
exchanging synchronization information data between the first device and the second device to identify a reference point in three-dimensional (3D) space relative to the physical locations of the first and second devices in the 3D space, wherein the first and second devices establish the physical location in the 3D space of the other device when the reference point is set; and
generating views of an interactive scene in corresponding displays of the first and second devices, the interactive scene being tied to the reference point and including virtual objects, each view showing all or part of the interactive scene and showing the interactive scene as observed from a current location of the corresponding device, wherein moving a device in the 3D space causes the corresponding view to change according to the current location.
2. The method as recited in claim 1, wherein detecting from the first device the position of the second device further includes:
detecting that the first device has tapped a first object at a first time;
receiving information that the second device has tapped a second object at a second time; and
determining that the first device has tapped the second device when the first time and the second time are substantially simultaneous.
3. The method as recited in claim 1, wherein detecting from the first device the position of the second device further includes:
recognizing the second device in an image taken with a camera in the first device.
4. The method as recited in claim 1, wherein detecting from the first device the position of the second device further includes:
detecting a light source in the second device in an image taken with a camera in the first device.
5. The method as recited in claim 1, wherein detecting from the first device the position of the second device further includes:
detecting a distance between the first device and the second device using a depth camera.
6. The method as recited in claim 1, wherein detecting from the first device the position of the second device further includes:
recognizing a display of the second device in an image taken with a camera in the first device.
7. The method as recited in claim 1, wherein detecting from the first device the position of the second device further includes:
recognizing a pattern shown in the display of the second device in an image taken with a camera in the first device.
8. The method as recited in claim 1, further including:
tracking motion of the first device by the first device.
9. The method as recited in claim 8, wherein tracking the motion of the first device further includes:
determining a current location of the first device using dead reckoning.
10. The method as recited in claim 9, wherein tracking the motion of the first device further includes:
tracking a position of a static feature in the 3D space using image information; and
correcting drift in the dead reckoning based on the position of the static feature.
11. The method as recited in claim 10, wherein the static feature is selected from a group consisting of an edge of a table, a corner in a room, a window, or a television set.
12. The method as recited in claim 1, further including:
resetting position tracking modules in the first device and the second device when the reference point is identified.
13. A method for generating an interactive space viewable through at least a first and a second device, the method comprising:
detecting a tap between the first device and the second device;
exchanging synchronization information data between the first device and the second device when the tap is detected, to identify a reference point in three-dimensional (3D) space relative to the physical locations of the first and second devices in the 3D space; and
generating views of an interactive scene in corresponding displays of the first and second devices, the interactive scene being tied to the reference point and including virtual objects, each view showing all or part of the interactive scene and showing the interactive scene as observed from a current location of the corresponding device, wherein moving a device in the 3D space causes the corresponding view to change according to the current location.
14. The method as recited in claim 13, wherein detecting the tap further includes:
detecting that the first device has tapped a first object at a first time;
receiving information that the second device has tapped a second object at a second time; and
determining that the first device has tapped the second device when the first time and the second time are substantially simultaneous.
15. The method as recited in claim 14, wherein detecting that the first device has tapped the first object further includes:
identifying a sudden change in the motion of the first device.
16. The method as recited in claim 15, wherein the sudden change is identified based on information from a three-axis accelerometer.
17. The method as recited in claim 13, wherein each device is configured for interaction with the virtual objects.
18. A portable device for sharing a virtual reality among portable devices, the portable device comprising:
a position module for tracking a position of the portable device, the position module being configured to detect a position of a second device with reference to the position of the portable device;
a communications module for exchanging synchronization information data between the portable device and the second device, wherein a reference point in three-dimensional (3D) space is identified based on the synchronization information, relative to the physical locations of the portable devices in the 3D space;
a view generator that creates a view of an interactive scene tied to the reference point and including virtual objects, the view showing all or part of the interactive scene as observed from a current location of the portable device, wherein moving the portable device in the 3D space causes the view to change according to the current location; and
a display for showing the view.
19. The portable device as recited in claim 18, wherein the position module includes a position tracking module, the position tracking module comprising:
a three-axis accelerometer;
a three-axis gyroscope; and
a camera.
20. The portable device as recited in claim 19, wherein the position tracking module further comprises:
a GPS module; and
a depth camera.
CN201180028950.XA 2010-04-13 2011-01-26 Calibration of portable devices in shared virtual space Active CN102939139B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US32376210P 2010-04-13 2010-04-13
US61/323,762 2010-04-13
US12/973,827 2010-12-20
US12/973,827 US8537113B2 (en) 2010-03-05 2010-12-20 Calibration of portable devices in a shared virtual space
PCT/US2011/022638 WO2011129907A1 (en) 2010-04-13 2011-01-26 Calibration of portable devices in a shared virtual space

Publications (2)

Publication Number Publication Date
CN102939139A true CN102939139A (en) 2013-02-20
CN102939139B CN102939139B (en) 2015-03-04

Family

ID=43828362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201180028950.XA Active CN102939139B (en) 2010-04-13 2011-01-26 Calibration of portable devices in shared virtual space

Country Status (4)

Country Link
EP (1) EP2558176B1 (en)
CN (1) CN102939139B (en)
TW (1) TWI449953B (en)
WO (1) WO2011129907A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657568A (en) * 2013-11-21 2015-05-27 深圳先进技术研究院 Multiplayer mobile game system and multiplayer mobile game method based on intelligent glasses
CN104679695A (en) * 2013-11-28 2015-06-03 联想(北京)有限公司 Information processing method and electronic device
CN105359054A (en) * 2013-06-10 2016-02-24 微软技术许可有限责任公司 Locating and orienting device in space
CN105892650A (en) * 2016-03-28 2016-08-24 联想(北京)有限公司 Information processing method and electronic equipment
CN106126723A (en) * 2016-06-30 2016-11-16 乐视控股(北京)有限公司 A kind of method and device of mobile destination object
CN107894828A (en) * 2016-10-04 2018-04-10 宏达国际电子股份有限公司 Virtual reality processing method and the electronic installation for handling virtual reality
CN108983624A (en) * 2018-07-17 2018-12-11 珠海格力电器股份有限公司 A kind of control method and terminal device of smart home device
TWI664533B (en) * 2017-01-18 2019-07-01 宏達國際電子股份有限公司 Positioning apparatus and method
CN110124305A (en) * 2019-05-15 2019-08-16 网易(杭州)网络有限公司 Virtual scene method of adjustment, device, storage medium and mobile terminal
CN110448892A (en) * 2019-07-18 2019-11-15 江西中业光文化科技有限公司 Game implementation method and system based on augmented reality
US10499044B1 (en) 2019-05-13 2019-12-03 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
CN111201797A (en) * 2017-10-12 2020-05-26 微软技术许可有限责任公司 Point-to-point remote location for devices
CN112783328A (en) * 2016-07-20 2021-05-11 Colopl株式会社 Method for providing virtual space, method for providing virtual experience, program, and recording medium

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5960796B2 (en) 2011-03-29 2016-08-02 クアルコム,インコーポレイテッド Modular mobile connected pico projector for local multi-user collaboration
US20130050499A1 (en) * 2011-08-30 2013-02-28 Qualcomm Incorporated Indirect tracking
CA2854485C (en) * 2011-11-04 2020-04-28 8 Leaf Digital Productions Inc. Integrated digital play system
TWI482049B (en) * 2012-03-02 2015-04-21 Realtek Semiconductor Corp Multimedia interaction system and related computer program product capable of blocking multimedia interaction commands that against interactive rules
TWI499935B (en) * 2012-08-30 2015-09-11 Realtek Semiconductor Corp Multimedia interaction system and related computer program product capable of avoiding unexpected interaction behavior
DE102012203458A1 (en) * 2012-03-05 2013-09-05 E.G.O. Elektro-Gerätebau GmbH Remote control unit for a household appliance
JP5966510B2 (en) 2012-03-29 2016-08-10 ソニー株式会社 Information processing system
WO2013152455A1 (en) * 2012-04-09 2013-10-17 Intel Corporation System and method for avatar generation, rendering and animation
US9386268B2 (en) 2012-04-09 2016-07-05 Intel Corporation Communication using interactive avatars
US10905943B2 (en) 2013-06-07 2021-02-02 Sony Interactive Entertainment LLC Systems and methods for reducing hops associated with a head mounted system
US10137361B2 (en) 2013-06-07 2018-11-27 Sony Interactive Entertainment America Llc Systems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
EP2886172A1 (en) * 2013-12-18 2015-06-24 Microsoft Technology Licensing, LLC Mixed-reality arena
US9323323B2 (en) * 2014-01-06 2016-04-26 Playground Energy Ltd Augmented reality system for playground equipment incorporating transforming avatars
US9881422B2 (en) * 2014-12-04 2018-01-30 Htc Corporation Virtual reality system and method for controlling operation modes of virtual reality system
WO2016101131A1 (en) 2014-12-23 2016-06-30 Intel Corporation Augmented facial animation
US20160364011A1 (en) * 2015-06-15 2016-12-15 Microsoft Technology Licensing, Llc. Human machine interface controller
US10475225B2 (en) 2015-12-18 2019-11-12 Intel Corporation Avatar animation system
GB2550854B (en) 2016-05-25 2019-06-26 Ge Aviat Systems Ltd Aircraft time synchronization system
TWI660304B (en) * 2016-05-30 2019-05-21 李建樺 Virtual reality real-time navigation method and system
CN108111475B (en) * 2016-11-25 2020-05-05 阿里巴巴集团控股有限公司 Identity verification method and device
US10878616B2 (en) 2017-04-06 2020-12-29 Htc Corporation System and method for assigning coordinates in virtual reality environment
CN106997281A (en) * 2017-04-10 2017-08-01 北京小米移动软件有限公司 The method and smart machine of shared virtual objects
US10268263B2 (en) 2017-04-20 2019-04-23 Microsoft Technology Licensing, Llc Vestibular anchoring
TWI664995B (en) * 2018-04-18 2019-07-11 鴻海精密工業股份有限公司 Virtual reality multi-person board game interacting system, initeracting method, and server
GB201811249D0 (en) * 2018-07-09 2018-08-29 Digitalbridge System and method for virtual image alignment
CN109272454B (en) * 2018-07-27 2020-07-03 阿里巴巴集团控股有限公司 Coordinate system calibration method and device of augmented reality equipment
CN113542328B (en) * 2020-04-20 2023-08-29 上海哔哩哔哩科技有限公司 Virtual environment data synchronization method and device
CN112148122A (en) * 2020-08-25 2020-12-29 中国电子科技集团公司第三十八研究所 Third-party visual angle implementation method for wearable augmented/mixed reality equipment
US20230217101A1 (en) * 2022-01-06 2023-07-06 Htc Corporation Data processing system, data processing method, and computer readable storage medium
CN115100276B (en) * 2022-05-10 2024-01-19 北京字跳网络技术有限公司 Method and device for processing picture image of virtual reality equipment and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030156144A1 (en) * 2002-02-18 2003-08-21 Canon Kabushiki Kaisha Information processing apparatus and method
WO2004012141A2 (en) * 2002-07-26 2004-02-05 Zaxel Systems, Inc. Virtual reality immersion system
US20060258420A1 (en) * 2003-09-02 2006-11-16 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
CN101185080A (en) * 2005-05-27 2008-05-21 皇家飞利浦电子股份有限公司 Playback device for playing digital content from devices in wireless communication
CN101208723A (en) * 2005-02-23 2008-06-25 克雷格·萨默斯 Automatic scene modeling for the 3D camera and 3D video

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070265089A1 (en) 2002-05-13 2007-11-15 Consolidated Global Fun Unlimited Simulated phenomena interaction game
US8323106B2 (en) 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
TWI278772B (en) * 2005-02-23 2007-04-11 Nat Applied Res Lab Nat Ce Augmented reality system and method with mobile and interactive function for multiple users
US20060223635A1 (en) * 2005-04-04 2006-10-05 Outland Research method and apparatus for an on-screen/off-screen first person gaming experience
US20080220878A1 (en) 2007-02-23 2008-09-11 Oliver Michaelis Method and Apparatus to Create or Join Gaming Sessions Based on Proximity

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030156144A1 (en) * 2002-02-18 2003-08-21 Canon Kabushiki Kaisha Information processing apparatus and method
WO2004012141A2 (en) * 2002-07-26 2004-02-05 Zaxel Systems, Inc. Virtual reality immersion system
US20060258420A1 (en) * 2003-09-02 2006-11-16 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
CN101208723A (en) * 2005-02-23 2008-06-25 克雷格·萨默斯 Automatic scene modeling for the 3D camera and 3D video
CN101185080A (en) * 2005-05-27 2008-05-21 皇家飞利浦电子股份有限公司 Playback device for playing digital content from devices in wireless communication

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105359054B (en) * 2013-06-10 2019-07-02 微软技术许可有限责任公司 Equipment is positioned and is orientated in space
CN105359054A (en) * 2013-06-10 2016-02-24 微软技术许可有限责任公司 Locating and orienting device in space
CN104657568B (en) * 2013-11-21 2017-10-03 深圳先进技术研究院 Many people's moving game system and methods based on intelligent glasses
CN104657568A (en) * 2013-11-21 2015-05-27 深圳先进技术研究院 Multiplayer mobile game system and multiplayer mobile game method based on intelligent glasses
CN104679695A (en) * 2013-11-28 2015-06-03 联想(北京)有限公司 Information processing method and electronic device
CN104679695B (en) * 2013-11-28 2018-03-27 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN105892650A (en) * 2016-03-28 2016-08-24 联想(北京)有限公司 Information processing method and electronic equipment
CN106126723A (en) * 2016-06-30 2016-11-16 乐视控股(北京)有限公司 A kind of method and device of mobile destination object
WO2018000615A1 (en) * 2016-06-30 2018-01-04 乐视控股(北京)有限公司 Method for moving target object, and electronic device
CN112783328B (en) * 2016-07-20 2024-03-15 Colopl株式会社 Method for providing virtual space, method for providing virtual experience, program, and recording medium
CN112783328A (en) * 2016-07-20 2021-05-11 Colopl株式会社 Method for providing virtual space, method for providing virtual experience, program, and recording medium
CN107894828B (en) * 2016-10-04 2021-01-29 宏达国际电子股份有限公司 Virtual reality processing method and electronic device for processing virtual reality
CN107894828A (en) * 2016-10-04 2018-04-10 宏达国际电子股份有限公司 Virtual reality processing method and the electronic installation for handling virtual reality
CN112764537A (en) * 2016-10-04 2021-05-07 宏达国际电子股份有限公司 Virtual reality processing method and electronic device for processing virtual reality
TWI664533B (en) * 2017-01-18 2019-07-01 宏達國際電子股份有限公司 Positioning apparatus and method
CN111201797A (en) * 2017-10-12 2020-05-26 微软技术许可有限责任公司 Point-to-point remote location for devices
CN111201797B (en) * 2017-10-12 2022-06-10 微软技术许可有限责任公司 Synchronization method and display device
CN108983624B (en) * 2018-07-17 2020-11-03 珠海格力电器股份有限公司 Control method of intelligent household equipment and terminal equipment
CN108983624A (en) * 2018-07-17 2018-12-11 珠海格力电器股份有限公司 A kind of control method and terminal device of smart home device
US10687051B1 (en) 2019-05-13 2020-06-16 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
US10499044B1 (en) 2019-05-13 2019-12-03 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
US11032537B2 (en) 2019-05-13 2021-06-08 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
CN110124305A (en) * 2019-05-15 2019-08-16 网易(杭州)网络有限公司 Virtual scene method of adjustment, device, storage medium and mobile terminal
CN110448892B (en) * 2019-07-18 2023-08-22 江西中业光文化科技有限公司 Game realization method and system based on augmented reality
CN110448892A (en) * 2019-07-18 2019-11-15 江西中业光文化科技有限公司 Game implementation method and system based on augmented reality

Also Published As

Publication number Publication date
EP2558176B1 (en) 2018-11-07
TWI449953B (en) 2014-08-21
TW201205122A (en) 2012-02-01
WO2011129907A1 (en) 2011-10-20
CN102939139B (en) 2015-03-04
EP2558176A1 (en) 2013-02-20

Similar Documents

Publication Publication Date Title
CN102939139B (en) Calibration of portable devices in shared virtual space
US11244469B2 (en) Tracking position of device inside-out for augmented reality interactivity
TWI468734B (en) Methods, portable device and computer program for maintaining multiple views on a shared stable virtual space
US9947139B2 (en) Method and apparatus for providing hybrid reality environment
US11050977B2 (en) Immersive interactive remote participation in live entertainment
TWI594174B (en) Tracking system, method and device for head mounted display
CN106255916B (en) Track the method and system of head-mounted display (HMD) and the calibration for the adjustment of HMD headband
CN105573486B (en) Headset equipment (HMD) system with the interface interfaced with mobile computing device
CN107362532B (en) Direction input for video games
CN113633973B (en) Game picture display method, device, equipment and storage medium
TW202204018A (en) Scanning of 3d objects with a second screen device for insertion into a virtual environment
CN102781527A (en) Wireless device pairing methods
CN113274729B (en) Interactive observation method, device, equipment and medium based on virtual scene
CN113244616A (en) Interaction method, device and equipment based on virtual scene and readable storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant