WO2011109126A1 - Maintaining multiple views on a shared stable virtual space - Google Patents
Maintaining multiple views on a shared stable virtual space
- Publication number
- WO2011109126A1 (PCT/US2011/022288)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- portable device
- virtual
- view
- virtual scene
- space
- Prior art date
Classifications
- A63F13/42—Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- A63F13/525—Changing parameters of virtual cameras
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- G06F1/1626—Portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F1/1694—Integrated I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. for computer conferences or chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
- H04L51/046—Messaging interoperability with other network applications or services
- H04L51/10—Messaging characterised by the inclusion of multimedia information
- H04L51/18—Messaging characterised by the inclusion of commands or executable codes
- A63F13/211—Input arrangements using inertial sensors, e.g. accelerometers or gyroscopes
- A63F13/213—Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/215—Input arrangements comprising means for detecting acoustic signals, e.g. using a microphone
- A63F13/216—Input arrangements using geographical information, e.g. location of the game device or player using GPS
- A63F13/426—Input processing involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
- A63F13/812—Ball games, e.g. soccer or baseball
- A63F2300/1006—Input arrangements having additional degrees of freedom
- A63F2300/1087—Input arrangements comprising photodetecting means, e.g. a camera
- A63F2300/1093—Input arrangements comprising photodetecting means using visible light
- A63F2300/204—Game platform being a handheld device
- A63F2300/205—Detecting the geographical location of the game platform
- A63F2300/6661—Rendering three dimensional images with changing of the position of the virtual camera
- A63F2300/8082—Virtual reality
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Definitions
- the present invention relates to methods, devices, and computer programs for controlling a view of a virtual scene with a portable device, and more particularly, methods, devices, and computer programs for enabling multiplayer interaction in a virtual or augmented reality.
- Virtual reality is a computer-simulated environment, whether that environment is a simulation of the real world or an imaginary world, where users can interact with a virtual environment or a virtual artifact (VA) either through the use of standard input devices or specialized multidirectional input devices.
- the simulated environment can be similar to the real world, for example, simulations for pilot or combat training, or it can differ significantly from reality, as in VR games.
- Virtual Reality is often used to describe a wide variety of applications, commonly associated with its immersive, highly visual, 3D environments.
- Augmented Reality provides a live view of a physical real-world environment whose elements are merged with (or augmented by) virtual computer-generated imagery to create a mixed reality.
- the augmentation is conventionally in real-time and in semantic context with environmental elements, such as sports scores on television during a match.
- advanced AR technology e.g. adding computer vision and object recognition
- the information about the surrounding real world of the user becomes interactive and digitally usable.
- Augmented Virtuality is also used in the virtual reality world and is similar to AR. Augmented Virtuality also refers to the merging of real world objects into virtual worlds. As an intermediate case in the Virtuality Continuum, AV refers to predominantly virtual spaces, where physical elements, e.g. physical objects or people, are dynamically integrated into, and can interact with the virtual world in real-time.
- VR is used in this application as a generic term that also encompasses AR and AV, unless otherwise specified.
- VR games typically require a large amount of computer resources. Implementation of VR games in handheld devices is rare, and the existing games are rather simplistic, with rudimentary VR effects.
- multiplayer AR games allow for the interaction of players in a virtual world, but the interactions are limited to objects manipulated by the player in the virtual world (e.g., cars, rackets, balls, etc.)
- the virtual world is computer generated and independent of the location of the players and the portable devices. The relative location of the players with respect to each other and with respect to their surroundings is not taken into account when creating a "realistic" virtual reality experience.
- Embodiments of the present invention provide methods, apparatus, and computer programs for controlling a view of a virtual scene with a portable device. It should be appreciated that the present invention can be implemented in numerous ways, such as a process, an apparatus, a system, a device or a method on a computer readable medium. Several inventive embodiments of the present invention are described below.
- a signal is received and the portable device is synchronized to make the location of the portable device a reference point in a three-dimensional (3D) space.
- a virtual scene which includes virtual reality elements, is generated in the 3D space around the reference point.
- the method determines the current position in the 3D space of the portable device with respect to the reference point and creates a view of the virtual scene.
- the view represents the virtual scene as seen from the current position of the portable device and with a viewing angle based on the current position of the portable device.
- the created view is displayed in the portable device, and the view of the virtual scene is changed as the portable device is moved by the user within the 3D space.
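- As a rough illustration of the flow just described, the sketch below drives a virtual camera directly from the device's pose relative to the synchronized reference point. This is a hedged sketch, not the patent's implementation; `device_position` and `device_forward` are hypothetical stand-ins for whatever the device's sensors report.

```python
import numpy as np

def view_matrix(position, forward, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix for a camera at `position` looking along `forward`."""
    f = forward / np.linalg.norm(forward)
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)
    u = np.cross(r, f)
    m = np.identity(4)
    m[0, :3], m[1, :3], m[2, :3] = r, u, -f
    m[:3, 3] = -m[:3, :3] @ position  # camera translation
    return m

# The reference point is the origin of the 3D space; the device pose is
# expressed relative to it, so moving the device moves the virtual camera.
device_position = np.array([0.2, 0.3, 0.5])   # meters from the reference point
device_forward = np.array([0.0, -0.3, -1.0])  # perpendicular to, away from, the display
V = view_matrix(device_position, device_forward)
```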
- multiple players share the virtual reality and interact with each other via the objects in the virtual reality.
- a method is provided for sharing a virtual scene among devices.
- the method includes operations for synchronizing a first device to a reference point in a three-dimensional (3D) space and for calculating a location of a second device relative to the location of the first device.
- an operation of the method includes the exchange of information between the first device and the second device to have the second device synchronized to the reference point in the 3D space.
- the information includes the reference point and the locations of both first and second devices.
- a method operation is used for generating a virtual scene in the 3D space around the reference point.
- the virtual scene is common to both devices and changes simultaneously in both devices as the devices interact with the virtual scene.
- a view is created of the virtual scene as seen from a current position of the first device with a viewing angle based on the current position of the first device, and the created view is displayed on the first device.
- the method continues by changing the displayed view of the virtual scene as the first device moves within the 3D space.
- a method is provided for controlling a view of a virtual scene with a first device.
- the method includes an operation for synchronizing the first device to a first reference point in a first three-dimensional (3D) space.
- a communications link is established between the first device and a second device.
- the second device is in a second 3D space outside the first 3D space and is synchronized to a second reference point in the second 3D space.
- a method operation is performed for generating a common virtual scene that includes virtual reality elements, where the common virtual scene is observable by both the first and the second devices.
- the first device builds the common virtual scene around the first reference point
- the second device builds the common virtual scene around the second reference point.
- Both devices are able to interact with the virtual reality elements. Further, the method includes operations for determining a current position in the first 3D space of the first device with respect to the first reference point and for creating a view of the common virtual scene.
- the view represents the common virtual scene as seen from the current position of the first device and with a viewing angle based on the current position of the first device.
- the created view is displayed in the first device, and the displayed view of the common virtual scene changes as the first device moves within the first 3D space.
- method operations control a view of a virtual scene with a portable device.
- the portable device is synchronized to a reference point in a three-dimensional (3D) space where the portable device is located.
- the portable device includes a front camera facing the front of the portable device and a rear camera facing the rear of the portable device.
- an operation is performed for generating a virtual scene in the 3D space around the reference point.
- the virtual scene includes virtual reality elements.
- the current position in the 3D space of the portable device is determined with respect to the reference point.
- a view of the virtual scene is created.
- the view captures a representation of the virtual scene as seen from a current eye position in the 3D space of a player holding the portable device, which corresponds to what the player would see through a window into the virtual scene.
- the window's position in the 3D space is equal to the position in the 3D space of a display in the portable device.
- the method also includes operations for displaying the created view in the display, and for changing the displayed view of the virtual scene as the portable device or the player move within the 3D space.
- a portable device is used for interacting with an augmented reality.
- the portable device includes a position module, a virtual reality generator, a view generator, and a display.
- the position module is used for determining a position of the portable device in a 3D space where the portable device is located, where the position of the portable device is set as a reference point in the 3D space at a time when the portable device receives a signal to synchronize.
- the virtual reality generator creates a virtual scene in the 3D space around the reference point.
- the virtual scene includes virtual reality elements.
- the view generator creates a view of the virtual scene, where the view represents the virtual scene as seen from the position of the portable device and with a viewing angle based on the position of the portable device.
- the display is used for showing the view of the virtual scene. The view shown in the display changes as the portable device moves within the 3D space.
- Figure 1 depicts a user before synchronizing a portable device to a reference point in space, according to one embodiment.
- Figure 2 illustrates a virtual reality scene observed with the portable device.
- Figure 3 illustrates an augmented reality chess game with virtual board and blended player's hand, in accordance with one embodiment.
- Figure 4 depicts a multi-player virtual reality game, according to one embodiment.
- Figure 5 illustrates one embodiment of a calibration method for a multi-player environment.
- Figure 6 illustrates how to play an interactive game over a network connection, according to one embodiment.
- Figure 7 shows an interactive game that is independent of the location of the portable device.
- Figure 8 shows an interactive game where the view in the display depends on the position of the portable device, in accordance with one embodiment.
- Figure 9 illustrates how movement of the portable device has a similar effect on the display as when moving a camera in the virtual space, according to one embodiment.
- Figure 10 shows a two-dimensional representation of the change in the image shown in the display when turning the portable device, according to one embodiment.
- Figure 11 shows a portable device for playing a VR game, according to one embodiment.
- Figures 12A-12F illustrate how the position of the portable device affects the view in the display, according to one embodiment.
- Figures 13A-13B illustrate an augmented reality game played between users in distant locations, according to one embodiment.
- Figures 14A-14H depict the changes in the display as the portable device changes position, according to one embodiment.
- Figure 15 illustrates an embodiment for implementing a viewing frustum on a portable device using front and rear cameras.
- Figures 16A-16B illustrate the effects of changing the viewing frustum as the player moves, according to one embodiment.
- Figure 17 illustrates how to use a virtual camera to span a view of the virtual scene, according to one embodiment.
- Figures 18A-18H show a sequence of views for illustrating the viewing frustum effect, according to one embodiment.
- Figures 19A-19B illustrate embodiments for combining viewing frustum effect with a camera effect.
- Figure 20 shows the flow of an algorithm for controlling a view of a virtual scene with a portable device in accordance with one embodiment of the invention.
- Figure 21 illustrates the architecture of a device that may be used to implement embodiments of the invention.
- Figure 22 is an exemplary illustration of scene A through scene E with respective user A through user E interacting with game clients 1102 that are connected to server processing via the internet, in accordance with one embodiment of the present invention.
- Figure 23 illustrates an embodiment of an Information Service Provider architecture.
DETAILED DESCRIPTION
- FIG. 1 depicts a user before synchronizing a portable device to a reference point in space, according to one embodiment.
- Portable device 104 is standing on a table in preparation for synchronizing the portable device to a reference point.
- User 102 has placed the portable device in a point that will serve as a reference point or anchor to build a virtual reality around the point.
- the portable device is sitting in the approximate center of a table, and a virtual world is built around the center of the table once the portable device is synchronized.
- the portable device can be synchronized in a variety of ways, such as pushing a button on portable device 104, touching the touch-sensitive screen in the portable device, letting the device stand still for a period of time (e.g., five seconds), entering a voice command, etc.
- the portable device can include a variety of position tracking modules, as discussed below in reference to Figure 21, such as an accelerometer, a magnetometer, a Global Positioning System (GPS) device, a camera, a depth camera, a compass, a gyroscope, etc.
- the portable device can be one of many types, such as a handheld portable gaming device, a cell phone, a tablet, a notebook, a netbook, a Personal Digital Assistant (PDA), etc.
- Embodiments of the invention are described in reference to a portable gaming device, but the principles can be applied to any portable electronic device with a display. Principles of the invention can also be applied to game controllers or other input devices connected to a computing device with a display.
- Figure 2 illustrates a virtual reality scene observed with the portable device. After synchronizing device 104 with respect to reference point 106, the portable device will start displaying a view of the virtual reality 108. The view in the display is created by simulating that a camera in the back of the portable device moves within the 3D space around reference point 106.
- Figure 2 depicts a virtual reality that includes a chess board.
- Portable device 104 is capable of detecting motion and determining its relative position with respect to reference point 106 as the device moves around. Location and position determination can be done with different methods and different levels of accuracy. For example, location can be detected by analyzing images captured with a camera, or data obtained from inertial systems, GPS, ultrasonic triangulation, WiFi communications, dead reckoning, etc., or a combination thereof.
- the device keeps track of the location in space of the portable device with respect to reference point 106, as well as the position in space of the portable device.
- the position is used to determine the viewing angle of the camera, that is, the portable device acts as a camera into the virtual scene. If the portable device is aimed towards the right, then the view will turn to the right, etc.
- the viewing angle is defined as a vector with origin in the center of the display (or other part of the device), and with a direction perpendicular to and away from the display.
- only the position in space is tracked, and the view in the display is calculated as if the camera is aiming from the location in space where the portable device is located and towards the reference point.
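- The two tracking modes described above could be sketched as follows; the quaternion convention and the sensor source are assumptions, since the text does not prescribe a specific API.

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def viewing_angle_full_tracking(device_orientation):
    # Position and orientation tracked: the viewing vector starts at the
    # display center and points perpendicular to, and away from, the display
    # (taken here as the device-local -Z axis).
    return rotate(device_orientation, np.array([0.0, 0.0, -1.0]))

def viewing_angle_position_only(device_position, reference_point):
    # Only position tracked: the camera aims from the device toward the
    # reference point.
    d = reference_point - device_position
    return d / np.linalg.norm(d)
```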
- an augmented reality (AR) tag is placed on a table, and utilized as a fiduciary marker for generating the augmented reality.
- the AR tag may be an object or figure that is recognized when present in the captured image stream of the real environment.
- the AR tag serves as a fiduciary marker which enables determination of a location within the real environment.
- Embodiments of the invention eliminate the need for AR tags, because of the synchronization into the 3D space and the tracking of the location of the portable device. Additionally, the location information allows games in the portable device to deliver a realistic 3D virtual experience. Further, an array of networked portable devices can be used to create a shared virtual world, as described below in reference to Figure 4.
- Figure 3 illustrates an augmented reality chess game with virtual board and blended player's hand, in accordance with one embodiment. Images of the 3D space are used to create an augmented reality by combining real and virtual elements with respect to the calibrated point and provide optical motion capture-like functionality. With a calibrated multi-camera technique it is possible to determine the position of a hand or an arm to enable players to "reach" into an augmented reality scene and interact with game objects (chess pieces).
- two cameras in the back of a single device are used to determine the location of objects in the 3D space.
- a depth camera can also be used to obtain three-dimensional information.
- cameras from multiple devices are used to determine hand 306 location, as discussed below in reference to Figure 4. While holding portable device 302 in one hand, players peer through screen 304 and reach into the play area that is being generated for them to touch 3D game objects and environments. Game play is completely tactile. It is possible for multiple players to reach into a play area at the same time and interact with game objects in sophisticated ways.
- hand 306 of a player can interact with a virtual object by interfacing, holding, pushing, pulling, grabbing, moving, smashing, squeezing, hitting, throwing, fighting, opening, closing, turning on or off, pushing a button, firing, eating, etc.
- Each portable device that is synchronized to a play area adds another potential camera, relative motion tracking, and ping data, making it possible to see players' hands and fingers from multiple perspectives to create an effective 3D camera-based motion capture field.
- the hand and the virtual space are blended together, where the virtual elements in the virtual space appear in the displayed view as if the virtual elements were part of the 3D space.
- the view of the virtual elements changes, from a geometrical perspective, in the same way as the view of the real elements changes when the portable device moves within the 3D space.
- Figure 4 depicts a multi-player virtual reality game, according to one embodiment.
- positional and game information can be exchanged between each of the devices that choose to participate in a shared-space game experience. This allows each player's system access to the camera view and positional information from all other players to synchronize their calibrated positions and share a virtual space, also referred to as shared space, together.
- the common virtual scene 404 is created.
- Each player has a view of the virtual scene 404 as if the virtual scene, a battle board game in this case, were real on a table in front of the players.
- the portable devices act as cameras, such that when the player moves the device around, the view changes.
- the actual view on each display is independent from the view in other displays, and the view is based only on the relative position of the portable device with respect to the virtual scene, which is anchored to an actual physical location in the 3D space.
- Share Space 404 games utilize devices' high-speed connectivity to exchange information among devices participating in the Share Space game experience.
- the Share Space 404 play areas are viewed through the device by turning the device into a stable "magic window" that persists in a space between each of the devices.
- the play area appears in a stable position even when the devices move around.
- Figure 5 illustrates one embodiment of a calibration method for a multi-player environment.
- the positional information gained from the devices sensors is transmitted to other linked devices to enhance the collaboratively maintained data in the virtual space.
- a first player 504A synchronizes her device into the 3D space with respect to reference point 502.
- Other players in the shared space establish a communication link with the first player to exchange position and game information.
- the relative position can be obtained in different ways, such as using WiFi triangulation and ping tests to determine relative positions.
- visual information can be used to determine other locations, such as detecting faces of other players and from their faces, possible locations of gaming devices.
- audio triangulation is used to determine relative position, by means of ultrasonic communications and directional microphones. Multiple frequencies can be used to perform the audio triangulation.
- wireless communication such as ultrasonic, WiFi, or Bluetooth, is used to synchronize the rest of the devices to reference point 502. After all the devices are calibrated, the devices have knowledge of the reference point 502 and their relative position with respect to reference point 502. It should be appreciated that other methods can be used to calibrate multiple devices to a shared reference point. For example, all devices may calibrate to the same reference point by placing the device on the reference point in turns.
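- One hedged sketch of that calibration handshake: once the second device knows its position relative to the first (e.g., from WiFi or ultrasonic triangulation), it can express all subsequent poses relative to the shared reference point 502. The message fields below are illustrative assumptions, not the patent's protocol.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CalibrationMessage:
    reference_point: np.ndarray  # reference point 502 in device A's frame
    a_position: np.ndarray       # device A's own position in its frame

def calibrate_second_device(msg, b_relative_to_a):
    """Return device B's position expressed relative to the shared reference point."""
    b_in_a_frame = msg.a_position + b_relative_to_a
    return b_in_a_frame - msg.reference_point

msg = CalibrationMessage(reference_point=np.zeros(3), a_position=np.array([1.0, 0.0, 0.0]))
b_pos = calibrate_second_device(msg, b_relative_to_a=np.array([0.0, 0.0, 2.0]))
```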
- the virtual scene can be made even more realistic by using shadows and lighting determined by the lighting sources in the room.
- game environments and characters have scene lighting and shadows influenced by the real world. This means that a player's hand will cast a shadow over virtual characters or objects as the hand reaches into the virtual world to interact with the virtual objects.
- Game world shadows and lighting are adjusted by real world shadows and lighting to get the best effect possible.
- Figure 6 illustrates how to play an interactive game over a network connection, according to one embodiment.
- the portable device can be used as a paddle to play a game of ping-pong. The device is moved around as if it were a paddle to hit the ball. Players see the ball float between the screen and the opponent's screen.
- the player looks through the portable device and aims the catapult at the enemies' ramparts. The player pulls the device backwards to load the catapult, and then presses a button to fire the catapult toward the enemies' castle.
- Shared Spaces can also be created when players are in different locations, as shown in Figure 6.
- the players have established a network connection to play the game.
- Each player synchronizes his device to a reference point in the player's space, and a virtual reality is created, such as a ping-pong table.
- the opponent is shown behind his side of the table, where the movement of an opponent's device is matched to the motions of the opponent's paddle.
- the game may also add an avatar to hold the paddle, for an even more realistic game experience.
- each device keeps track of the motion and position in space of the device. This information is shared with the other device to enable the other device to place a virtual paddle matching the motion of the device. Other game information is also shared, such as the location and movement of the ball.
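- A minimal sketch of that per-frame exchange, assuming a simple UDP transport and JSON payload (both assumptions; the patent does not specify a wire format): each device sends its own pose so the peer can animate the opponent's paddle, plus shared state such as the ball.

```python
import json
import socket

def make_update(paddle_pos, paddle_rot, ball_pos, ball_vel):
    """Serialize one frame of game state for the remote device."""
    return json.dumps({
        "paddle_pos": paddle_pos,  # device position -> opponent's paddle
        "paddle_rot": paddle_rot,  # device orientation quaternion
        "ball_pos": ball_pos,      # shared object state
        "ball_vel": ball_vel,
    }).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = make_update([0.1, 1.2, 0.4], [1.0, 0.0, 0.0, 0.0],
                     [0.0, 1.0, 1.5], [0.0, 0.0, -2.0])
sock.sendto(packet, ("203.0.113.7", 9999))  # placeholder peer address
```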
- Figure 7 shows an interactive game that is independent of the location of the portable device.
- the game illustrated in Figure 7 shows the limitations for playing games that do not synchronize with respect to reference point 706.
- a multiplayer air hockey game is played simultaneously on two separate devices, 704C and 702A.
- the game includes a hockey rink 708, puck 714, and mallets 710 and 712.
- Each player controls the mallet by moving the finger on the display.
- the displays show the location of the puck and the mallets.
- the view on the display does not change when the portable device moves around because there is no geographical synchronization with a reference point. For example, when player 702A moves to location 702B, the view is the same, independent of where the device is located.
- in order to play the game, the portable devices only exchange information regarding the movement of the puck and the location of the mallets. There is no virtual experience tied to a 3D space.
- Figure 8 shows an interactive game where the view in the display depends on the position of the portable device, in accordance with one embodiment.
- Devices 802A and 802B have been calibrated to a common space and the hockey rink has been created as a virtual element.
- the devices act as cameras into the space, and the devices don't necessarily have to show the complete playing surface. For example, when the device is pulled away from the reference point, a zoom-out effect takes place and a larger view of the rink is available. Further, if the device is tilted upward, the view shows the top of the rink, and if the device is tilted down, the view in the device gets closer to the player's own goal.
- the views in the displays are independent of each other and are based on the current view of the playing surface from each portable device.
- Figure 9 illustrates how movement of the portable device has a similar effect on the display as when moving a camera in the virtual space, according to one embodiment.
- Figure 9 shows car 902 inside a virtual sphere. Assuming that a portable device is aimed from a point in the sphere towards car 902, multiple views of the car can be obtained as the portable device moves within the sphere. For example, a view from the "north pole" will show the roof of the car, and a view from the "south pole" will show the bottom of the car. Also shown in Figure 9 are views for the side, front, and rear of the car.
- the player can enter a command to change or flip the view of the virtual world.
- a player goes from seeing the front of the car to seeing the back of the car, as if the scene had rotated 180° around an axis running vertically through the reference point. This way, the player does not have to move around the room to get different viewing angles.
- Other inputs may produce different effects, such as a 90° turn, a scaling of the view (to make the virtual world seem smaller or greater), rotate with respect to the x, y, or z axis, etc.
- a flip of the portable device, i.e., a 180° spin in the player's hand, will cause the view of the virtual world to flip upside down.
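- Because the scene is anchored at the reference point, these commands can be implemented as transforms of the virtual camera about that point; the sketch below shows the vertical-axis rotation and scaling cases under that assumption.

```python
import numpy as np

def rotate_about_vertical(position, reference_point, degrees):
    """Rotate a camera position about a vertical axis through the reference point."""
    t = np.radians(degrees)
    R = np.array([[ np.cos(t), 0.0, np.sin(t)],
                  [ 0.0,       1.0, 0.0      ],
                  [-np.sin(t), 0.0, np.cos(t)]])
    return reference_point + R @ (position - reference_point)

def scale_about_reference(position, reference_point, factor):
    """Uniform scaling of the view: >1 makes the virtual world seem smaller."""
    return reference_point + factor * (position - reference_point)

# 180-degree flip: the front view of the car becomes a rear view.
cam = rotate_about_vertical(np.array([0.0, 0.5, 2.0]), np.zeros(3), 180.0)
```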
- Figure 10 shows a two-dimensional representation of the change in the image shown in the display when turning the portable device, according to one embodiment.
- Portable device 152 is aimed towards a wall with a viewing angle α, resulting in a projection 160 on the wall.
- the view on portable device 152 will correspond to projection 160.
- when the portable device turns an angle β, the portable device ends in position 154.
- the view also turns an angle β while maintaining a camera viewing angle α.
- the view on the portable device corresponds to projection 162.
- in this embodiment, the view on the screen is independent of the eye position, such as positions 158 and 156, and independent of where the player is.
- the image on the display depends on the position of the portable device, which is acting as a virtual camera.
- Other embodiments described below include views on the display that change according to the position of the eye.
- Figure 11 shows a portable device for playing a VR game, according to one embodiment.
- Figures 11 to 12F illustrate a racing game where the portable device can be used as a camera or to control the driving of the vehicle.
- the portable device shows a view of the race, where the race track is seen in the center with other racing cars and people sitting on the stands on the side of the track.
- Figures 12A-12F illustrate how the position of the portable device affects the view in the display, according to one embodiment. In this sequence, the portable device is being used as a camera, and not to drive the car.
- Figure 12A shows the player holding the portable device to play the racing game. The device is being held in front of the player at approximately arm's length.
- The view of the game when the player is in the position shown in Figure 12A is the one illustrated in Figure 12B, where the view on the display shows the race as seen by the car driver.
- the driver can see the track ahead and a part of the inside of the vehicle, including the steering wheel.
- Figure 12C shows the player turning left about 45° while still holding the portable device in front of him. As a result, the portable device moves in space together with the player. The result of the movement of the player is seen in Figure 12D, where the view of the race track has also turned about 45°. It can be seen that the portable device is acting as a camera, and the view on the display changes as if the camera changed position in the 3D world.
- Figure 12E shows the player turning left an additional 45°.
- the head and the viewing angle of the portable device have changed about 90° with respect to the original position.
- the result on the display is pictured in Figure 12F, where the driver of the game now has a side view, which includes another racing car and the stands.
- Figures 13A-13B illustrate an augmented reality game played between users in distant locations, according to one embodiment.
- Figure 13A shows a portable device with camera 1302 facing the player holding the portable device.
- the player-facing camera has many uses, such as teleconferencing, viewing frustum applications (see Figures 15-19B), incorporation of the player's face in a game, etc.
- Figure 13B shows an embodiment of an augmented reality game that produces a realistic proximity effect.
- Player 1308 is in a remote location and exchanges game and environment information via a network connection.
- a camera in the remote location takes a picture of the player and his vicinity, such as background 1310. The image is sent to the opponent's device, where the image is blended with a virtual chess board 1306.
- camera 1304 takes a picture of the player holding the device and sends the image to the remote player. This way the players can share a space.
- Each player sees his view as an augmented reality that fades into a virtual reality fog as the view crosses over into the other player's scene. All movements for each player are still tracked relative to the synchronized calibrated positions for both devices.
- the game inserts the virtual chess board on top of the table offering a 3D experience.
- the portable device can move around to change the viewing angle and see the board from a different perspective, such as from the top, the side, the opponent's view, etc.
- communications and processing bandwidth required are decreased by updating the face and background from the opponent periodically, instead of using a live feed. Additionally, it is possible to send only a portion of the remote image, such as the image of the player, since the background may be static and less relevant. For example, the face of the remote player can be updated every five seconds, every time the player changes expression, when the player talks, etc.
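- An illustrative throttle for that scheme (the five-second period is the example from the text; `expression_changed` is a hypothetical detector):

```python
import time

UPDATE_INTERVAL = 5.0  # seconds between periodic face updates

def should_send_face(last_sent, expression_changed):
    """Re-send the remote face image only periodically or when the expression changes."""
    return expression_changed or (time.monotonic() - last_sent) >= UPDATE_INTERVAL
```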
- sound can also be exchanged between players to make the 3D experience more realistic.
- players have the option of changing views, such as switching between the blended 3D image and displaying only the chess board to improve the view of the board.
- image stabilization can be used to stabilize small image variations due to slight shaking of the player's hands.
- the face of the player holding the device can also be added to the display to show how the user appears to the opponent.
- Figures 14A-14H depict the changes in the display as the portable device changes position, according to one embodiment.
- a portable device is using a viewing frustum effect to determine how the augmented reality world is presented to the user.
- the viewing frustum or view frustum is the region of space in the modeled world that may appear on the screen.
- the viewing frustum is the field of view of the notional camera.
- the exact shape of this region varies depending on what kind of camera lens is being simulated, but typically it is a frustum of a rectangular pyramid (hence the name).
- the planes that cut the frustum perpendicular to the viewing direction are called the near plane and the far plane.
- the near plane corresponds to the surface of the display in the portable device. Objects closer to the camera than the near plane or beyond the far plane are not drawn.
- the viewing frustum is anchored (the top of the pyramid) in the eye (or between the eyes) of the player holding the portable device.
- the display acts as a window into the virtual reality. Therefore, the closer the "window" is to the eye, the larger the area of the virtual reality that is displayed. Conversely, the further away the "window" is from the eye, the smaller (and more detailed) the view of the virtual reality is.
- the effect is similar to getting closer to an old-fashioned rectangular peephole without distortion optics. The closer the eye is to the peephole, the more of the outside that can be observed.
- Figure 14A shows the player holding the augmented-reality portable device inside a room. After the device has been synchronized into the room, the virtual reality generator has added a virtual triangle "painted" on a wall facing the player, and a square "painted" on a wall to the left of the player.
- the player is holding the device slightly below eye level with the arms almost completely extended.
- the view shown in the display is presented in Figure 14B, where a portion of the triangle in front of the player is shown.
- In Figure 14C, the player is in the same position and has bent the elbows in order to bring the portable device closer to the face. Due to the viewing frustum effect, as discussed above, the player sees a larger section of the wall.
- Figure 14D shows the view displayed in the device of Figure 14C. Because of the frustum effect a larger section of the wall is observed, as compared to the previous view of Figure 14B. The complete triangle is now seen on the display.
- Figure 14E shows the player moving the device down to see the bottom part of the opposing wall, as shown in Figure 14F. A bottom section of the triangle is shown on the display.
- In Figure 14G, the player has turned to the left and is using the "window" into the augmented world to see a corner of the room, as seen in Figure 14H.
- Figure 15 illustrates an embodiment for implementing a viewing frustum on a portable device using front and rear cameras.
- Figure 15 shows a 2D projection of the viewing frustum, and since it is a 2D projection, the viewing frustum pyramid is observed as a triangle.
- Portable device 1506 includes front and rear facing cameras 1514 and 1512, respectively.
- Camera 1512 is used to capture images of the space where the player is located.
- Camera 1514 is used to capture images of the player holding device 1506.
- Face recognition software allows the device's software to determine the location of the player's eyes in order to simulate the viewing frustum effect.
- the viewing frustum has the apex at the eye with the edges of the rectangular frustum pyramid extending from the eye and through the corners of the display in the handheld device.
- the player "sees" area 1510 of a wall facing the device. Lines starting from the eye, and touching corners of the display intersect the wall to define area 1510.
- when the portable device moves, the lines originating at the eye change as a result.
- the new lines define area 1508.
- if portable device 1506 is kept stationary, a change of the position of the eye will cause a change in what is shown in the display.
- the view will also change because the viewing frustum changes as the edges of the pyramid intersect the corners of the display.
- FIG. 15 is an exemplary implementation of a viewing frustum.
- Other embodiments may utilize different shapes for the viewing frustum, may scale the viewing frustum effect, or may add boundaries to the viewing frustum.
- the embodiment illustrated in Figure 15 should therefore not be interpreted to be exclusive or limiting, but rather exemplary or illustrative.
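- For illustration, an off-axis frustum of this kind can be computed with similar triangles, anchoring the apex at the tracked eye and passing the edges through the display corners. The axis-aligned geometry below is a simplifying assumption, not the patent's method.

```python
import numpy as np

def off_axis_frustum(eye, screen_center, half_w, half_h, near, far):
    """Asymmetric frustum bounds at the near plane for a screen facing the eye
    along +Z; returns parameters in glFrustum order."""
    d = eye[2] - screen_center[2]  # eye-to-screen distance
    scale = near / d               # project screen edges onto the near plane
    left   = (screen_center[0] - half_w - eye[0]) * scale
    right  = (screen_center[0] + half_w - eye[0]) * scale
    bottom = (screen_center[1] - half_h - eye[1]) * scale
    top    = (screen_center[1] + half_h - eye[1]) * scale
    return left, right, bottom, top, near, far

# Moving the eye left shifts the frustum right: the display behaves as a window.
print(off_axis_frustum(np.array([0.0, 0.0, 0.4]), np.zeros(3), 0.08, 0.05, 0.1, 100.0))
```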
- Figures 16A-16B illustrate the effects of changing the viewing frustum as the player moves, according to one embodiment.
- Figure 16A includes display 1606 in a portable device, where the surface of the display is parallel to the surface of the wall.
- a rectangular frustum pyramid is created with an apex somewhere in the face of the player (such as between the eyes), with base on the wall, and with edges extending from the eye and touching the corners of the display 1606.
- the viewing frustum creates a rectangular base 1610, which is what the player sees on display 1606.
- when the player moves to a new position, the viewing frustum changes as a result.
- the new base for the frustum is rectangle 1608, which is seen in display 1606. The result is that a change of the position of the player causes a change in the view of the virtual reality.
- Figure 16B illustrates the zoom effect created as the face moves away or closer to the display when using a viewing frustum effect.
- the player sees rectangle 1638, as previously described. If the player moves away from display 1636, without moving the display, to position 1632, a new view is seen corresponding to rectangle 1640.
- the observed area of the virtual world shrinks causing a zoom-in effect because the observed area in the display is smaller and the objects in this observed area appear bigger on the display.
- An opposite motion where the player moves closer to display 1636 will cause the opposite zoom-out effect.
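- The zoom effect follows from similar triangles: a display of width w at distance d from the eye reveals a slice of width w·D/d of a wall at distance D. The numbers below are purely illustrative.

```python
display_width = 0.10  # meters
wall_distance = 3.0   # meters from the eye to the wall

for eye_to_display in (0.25, 0.50):  # the player leans back from 25 cm to 50 cm
    visible = display_width * wall_distance / eye_to_display
    print(f"eye at {eye_to_display:.2f} m -> sees {visible:.2f} m of wall")
# 0.25 m -> 1.20 m of wall; 0.50 m -> 0.60 m of wall (zoom-in)
```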
- Figure 17 illustrates how to use a virtual camera to span a view of the virtual scene, according to one embodiment.
- a virtual or augmented reality does not need to be confined within the limits of the room where the player is located, as we saw before in Figure 11 for the racing game.
- a virtual world that goes beyond the physical boundaries of the player can be also simulated.
- Figure 17 illustrates a player viewing a virtual concert. The actual stage is located beyond the walls of the room and can be simulated to be hundreds of feet away from the portable device, which is acting as a virtual camera in this case.
- a viewing frustum could also be simulated in the same manner.
- in Figure 17, the first camera location is focused on the backup singers, the second one on the main artists, and the third location is aimed at the crowd.
- the virtual camera can also add zoom inputs to zoom in and out like a real camera would do.
- scaling is used to navigate through the virtual reality. For example, if a player moves forward one foot, the portable device will create a virtual view as if the player had advanced 10 feet. This way, a player can navigate a virtual world that is larger than the room where the player is.
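- In code, this amounts to applying a gain to the real displacement before moving the virtual camera; the gain of 10 is the example from the text, everything else is an assumption.

```python
import numpy as np

MOTION_GAIN = 10.0  # 1 foot of real motion -> 10 feet of virtual motion

def virtual_position(real_device_pos, reference_point, gain=MOTION_GAIN):
    """Scale the device's real displacement from the reference point."""
    return reference_point + gain * (real_device_pos - reference_point)

print(virtual_position(np.array([0.3048, 0.0, 0.0]), np.zeros(3)))  # ~3.05 m, i.e. 10 ft
```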
- the player can enter commands to make the camera move within the virtual reality without actually moving the portable device. Since the portable device is synchronized with respect to a reference point, this movement of the camera without movement by the player has the effect of changing the reference point to a new location.
- This new reference point can be referred to as a virtual reference point, and does not have to be located within the actual physical space where the player is.
- the player could use a "move forward" command to move the camera backstage. Once the player "is" backstage, the player can start moving the portable device around to explore the view backstage, as previously described.
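- One way to model such a virtual reference point is to let commands translate the reference frame itself while physical device motion keeps operating within it. The class below is an illustrative sketch under that assumption, not the patent's implementation:

```python
import numpy as np

class VirtualCamera:
    """Sketch of a command-driven virtual reference point."""

    def __init__(self, reference_point):
        self.reference_point = np.asarray(reference_point, dtype=float)

    def move_forward(self, forward_dir, distance):
        # A "move forward" command shifts the reference point itself,
        # e.g. relocating the camera "backstage" without any real motion.
        self.reference_point += distance * np.asarray(forward_dir, dtype=float)

    def view_position(self, device_offset):
        # Physical device motion is still applied relative to the (possibly
        # virtual) reference point, so the player can explore as before.
        return self.reference_point + np.asarray(device_offset, dtype=float)

cam = VirtualCamera([0.0, 0.0, 0.0])
cam.move_forward([0.0, 0.0, 1.0], distance=30.0)   # command, no real motion
print(cam.view_position([0.1, 0.0, 0.0]))          # device motion still counts
```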
- Figures 18A-18H show a sequence of views for illustrating the viewing frustum effect, according to one embodiment.
- Figure 18A shows the player holding the portable device.
- the view on the display corresponds to the image of a forest shown in Figure 18B.
- In Figure 18C, the player moves his head to his right while keeping the portable device in approximately the same position as in Figure 18A.
- Figure 18D corresponds to the view for the player in Figure 18C and shows how the perspective of the forest changes due to the viewing frustum effect.
- In Figure 18E, the player keeps turning his head to his right while moving the portable device towards his left to accentuate the viewing frustum effect, because the player wants to know whether there is something behind the tree.
- Figure 18F shows the display corresponding to the player in Figure 18E. The perspective of the forest has changed again. An elf that was hidden behind one of the trees in Figure 18B is now partially visible in Figure 18F, as the player has changed the viewing angle of the forest.
- Figure 18G shows the player tilting his head far to his right and moving the portable device further away to his left. The effect, as seen in Figure 18H, is that the player can now see what was behind the tree: an elf that is now completely visible.
- Figures 19A-19B illustrate embodiments for combining viewing frustum effect with a camera effect.
- Combining the viewing frustum and camera effects may seem impossible, since the two effects build the virtual view in different ways. However, the combination is possible when there are rules defining when to use one effect or the other.
- In one embodiment, the camera effect is used when the player moves the portable device, and the viewing frustum effect is used when the player moves the head with respect to the portable device. In the case where both events happen at the same time, one effect is chosen, such as the viewing frustum effect.
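- This arbitration rule is simple enough to state in a few lines; the function below is a sketch of one such policy, breaking ties in favor of the frustum effect as described above:

```python
def choose_effect(device_moved, head_moved):
    """Pick which effect drives the next view update: device motion maps to
    the camera effect, head motion to the viewing frustum effect, and the
    frustum effect wins when both happen at once."""
    if device_moved and head_moved:
        return "frustum"   # simultaneous events: one effect is chosen
    if device_moved:
        return "camera"
    if head_moved:
        return "frustum"
    return "none"          # nothing moved: keep the previous view
```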
- This combination means that, given a position of the eye and the portable device, there can be different views on the display depending on how the eye and the camera reached that position. For example, when eye 1902 is looking through device 1906, a different view of the virtual reality is shown in Figures 19A and 19B, as discussed below.
- In Figure 19A, eye 1902 is originally looking through device 1904.
- The device, using a viewing frustum effect, is "aiming" straight forward into the virtual reality. This causes an angle α originating at the apex of the viewing frustum pyramid, and results in a camera angle of α.
- The player sees segment 1908 on the wall at this first position. The player then turns the device an angle β to place the device in position 1906. Since the player has moved the device, the portable device responds to the move with a camera effect, causing the virtual camera to also turn an angle β. The result is that the display now shows area 1910 of the wall.
- Figure 19B shows a player at an initial eye position 1912 looking through portable device 1906.
- A viewing frustum effect is being used, and the result is that area 1918 appears on the display.
- The player then moves to eye position 1902 without moving the portable device. Because the device has not moved, the viewing frustum effect takes place, and the player then sees area 1916 on the display.
- Although eye 1902 and display 1906 are in the same position in Figures 19A and 19B, the actual view is different because of the sequence of events that caused the eye and the display to arrive at that position.
- Figure 20 shows flow 2000 of an algorithm for controlling a view of a virtual scene with a portable device in accordance with one embodiment of the invention.
- a signal is received to synchronize the portable device, such as a button press or a screen touch.
- the method synchronizes the portable device to make the location where the portable device is located a reference point in a three-dimensional (3D) space.
- the 3D space is the room where the player is located.
- the virtual reality includes the room as well as a virtual space that extends beyond the walls of the room.
- a virtual scene is generated in the 3D space around the reference point during operation 2006.
- the virtual scene includes virtual reality elements, such as the chess board of Figure 2.
- The portable device determines its current position in the 3D space with respect to the reference point.
- a view of the virtual scene is created in operation 2010. The view represents the virtual scene as seen from the current position of the portable device and with a viewing angle based on the current position of the portable device. Further, the created view is shown on a display of the portable device during operation 2012.
- The portable device checks whether it has been moved by the user, i.e., whether the current position has changed. If the portable device has moved, the method flows back to operation 2008 to recalculate the current position. If the portable device has not moved, the portable device continues displaying the previously created view by flowing back to operation 2012.
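- Flow 2000 can be summarized with the sketch below, where the hypothetical `device` object and all of its helper methods (wait_for_sync_signal, synchronize, generate_scene, get_position, render_view, display, has_moved) are assumed names standing in for the operations of the flow:

```python
def run_view_loop(device):
    """Sketch of flow 2000: synchronize, build the scene, then keep the
    displayed view consistent with the device's position."""
    device.wait_for_sync_signal()              # e.g. button press or screen touch
    reference = device.synchronize()           # device location becomes the reference point
    scene = device.generate_scene(reference)   # virtual scene built around it
    position = device.get_position(reference)  # current position w.r.t. the reference
    while True:
        view = device.render_view(scene, position)  # view from current position/angle
        device.display(view)
        if device.has_moved():                 # recalculate only when the device moves
            position = device.get_position(reference)
```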
- FIG. 21 illustrates the architecture of a device that may be used to implement embodiments of the invention.
- The portable device is a computing device and includes typical modules present in a computing device, such as a processor, memory (RAM, ROM, etc.), battery or other power source, and permanent storage (such as a hard disk).
- Communication modules allow the portable device to exchange information with other portable devices, other computers, servers, etc.
- the communication modules include a Universal Serial Bus (USB) connector, a communications link (such as Ethernet), ultrasonic communication, Bluetooth, and WiFi.
- Input modules include input buttons and sensors, microphone, touch sensitive screen, cameras (front facing, rear facing, depth camera), and card reader. Other input/output devices, such as a keyboard or a mouse, can also be connected to the portable device via communications link, such as USB or Bluetooth. Output modules include a display (with a touch-sensitive screen), Light-Emitting Diodes (LED), vibro-tactile feedback, and speakers. Other output devices can also connect to the portable device via the communication modules.
- Information from different devices can be used by the Position Module to calculate the position of the portable device.
- These modules include a magnetometer, an accelerometer, a gyroscope, a GPS, and a compass. Additionally, the Position Module can analyze sound or image data captured with the cameras and the microphone to calculate the position. Further yet, the Position Module can perform tests, such as a WiFi ping test or ultrasound tests, to determine the position of the portable device or the position of other devices in the vicinity.
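- As one illustration of how a Position Module might fuse sensor data, the sketch below applies a standard complementary filter to gyroscope and accelerometer readings; this is a common fusion technique offered as an example, not necessarily the one used by the device:

```python
class TiltEstimator:
    """Complementary filter over one axis: integrate the gyroscope rate
    (fast but drifting) and correct the drift with the tilt angle implied
    by the accelerometer's gravity vector (noisy but drift-free)."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha   # weight given to the integrated gyroscope angle
        self.angle = 0.0

    def update(self, gyro_rate, accel_angle, dt):
        gyro_angle = self.angle + gyro_rate * dt
        self.angle = self.alpha * gyro_angle + (1.0 - self.alpha) * accel_angle
        return self.angle

est = TiltEstimator()
print(est.update(gyro_rate=0.1, accel_angle=0.02, dt=0.01))
```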
- a Virtual Reality Generator creates the virtual or augmented reality, as previously described, using the position calculated by the Position Module.
- a view generator creates the view that is shown on the display, based on the virtual reality and the position.
- The view generator can also produce sounds originating from the Virtual Reality Generator, using directional effects applied to a multi-speaker system.
- FIG. 21 is an exemplary implementation of a portable device. Other embodiments may utilize different modules, a subset of the modules, or assign related tasks to different modules. The embodiment illustrated in Figure 21 should therefore not be interpreted to be exclusive or limiting, but rather exemplary or illustrative.
- FIG. 22 is an exemplary illustration of scene A through scene E with respective user A through user E interacting with game clients 1102 that are connected to server processing via the internet, in accordance with one embodiment of the present invention.
- a game client is a device that allows users to connect to server applications and processing via the internet.
- the game client allows users to access and playback online entertainment content such as but not limited to games, movies, music and photos. Additionally, the game client can provide access to online communications applications such as VOIP, text chat protocols, and email.
- A user interacts with the game client via a controller.
- In some embodiments, the controller is a game-client-specific controller, while in other embodiments the controller can be a keyboard and mouse combination.
- the game client is a standalone device capable of outputting audio and video signals to create a multimedia environment through a monitor/television and associated audio equipment.
- The game client can be, but is not limited to, a thin client, an internal PCI-express card, an external PCI-express device, an ExpressCard device, an internal, external, or wireless USB device, or a FireWire device, etc.
- the game client is integrated with a television or other multimedia device such as a DVR, Blu-Ray player, DVD player or multi-channel receiver.
- While FIG. 22 shows a single server processing module, in one embodiment there are multiple server processing modules throughout the world. Each server processing module includes sub-modules for user session control, sharing/communication logic, user geo-location, and load balance processing service. Furthermore, a server processing module includes network processing and distributed storage.
- user session control may be used to authenticate the user.
- An authenticated user can have associated virtualized distributed storage and virtualized network processing. Example items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos, and music. Additionally, distributed storage can be used to save game status for multiple games, customized settings for individual games, and general settings for the game client.
- the user geo-location module of the server processing is used to determine the geographic location of a user and their respective game client. The user's geographic location can be used by both the sharing/communication logic and the load balance processing service to optimize performance based on geographic location and processing demands of multiple server processing modules.
- Virtualizing either or both network processing and network storage would allow processing tasks from game clients to be dynamically shifted to underutilized server processing module(s).
- load balancing can be used to minimize latency associated with both recall from storage and with data transmission between server processing modules and game clients.
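- A toy sketch of such geo-aware load balancing, scoring each server processing module by a crude distance-plus-load cost; the fields, weights, and coordinates are all hypothetical:

```python
def pick_server_module(client_location, modules):
    """Route a game client to the server processing module with the lowest
    combined cost of geographic distance (a proxy for latency) and load."""
    def cost(module):
        dx = module["location"][0] - client_location[0]
        dy = module["location"][1] - client_location[1]
        distance = (dx * dx + dy * dy) ** 0.5
        return distance + 10.0 * module["load"]   # trade distance against load
    return min(modules, key=cost)

modules = [
    {"name": "us-east", "location": (40.7, -74.0), "load": 0.9},
    {"name": "us-west", "location": (37.8, -122.4), "load": 0.2},
]
print(pick_server_module((34.0, -118.2), modules)["name"])   # -> us-west
```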
- the server processing module has instances of server application A and server application B.
- The server processing module is able to support multiple server applications, as indicated by server application X1 and server application X2.
- server processing is based on cluster computing architecture that allows multiple processors within a cluster to process server applications.
- a different type of multi-computer processing scheme is applied to process the server applications. This allows the server processing to be scaled in order to accommodate a larger number of game clients executing multiple client applications and corresponding server applications. Alternatively, server processing can be scaled to accommodate increased computing demands necessitated by more demanding graphics processing or game, video compression, or application complexity.
- The server processing module performs the majority of the processing via the server application. This allows relatively expensive components such as graphics processors, RAM, and general processors to be centrally located, and reduces the cost of the game client. Processed server application data is sent back to the corresponding game client via the internet to be displayed on a monitor.
- Scene C illustrates an exemplary application that can be executed by the game client and server processing module.
- game client 1102C allows user C to create and view a buddy list 1120 that includes user A, user B, user D and user E.
- user C is able to see either real time images or avatars of the respective user on monitor 106C.
- Server processing executes the respective applications of game client 1102C and of the respective game clients 1102 of user A, user B, user D, and user E. Because the server processing is aware of the applications being executed by game client B, the buddy list for user A can indicate which game user B is playing. Further still, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by merely sending processed server application data for user B to game client A in addition to game client B.
- the communication application can allow real-time communications between buddies. As applied to the previous example, this allows user A to provide encouragement or hints while watching real-time video of user B.
- two-way real time voice communication is established through a client/server application.
- a client/server application enables text chat.
- a client/server application converts speech to text for display on a buddy's screen.
- Scene D and scene E illustrate respective user D and user E interacting with game consoles 1110D and 1110E respectively.
- Game consoles 1110D and 1110E are each connected to the server processing module, illustrating a network where the server processing module coordinates game play for both game consoles and game clients.
- FIG. 23 illustrates an embodiment of an Information Service Provider architecture.
- Information Service Provider (ISP) 250 delivers a multitude of information services to users 262 who are geographically dispersed and connected via network 266.
- An ISP can deliver just one type of service, such as stock price updates, or a variety of services such as broadcast media, news, sports, gaming, etc.
- the services offered by each ISP are dynamic, that is, services can be added or taken away at any point in time.
- the ISP providing a particular type of service to a particular individual can change over time. For example, a user may be served by an ISP in near proximity to the user while the user is in her home town, and the user may be served by a different ISP when the user travels to a different city.
- the home-town ISP will transfer the required information and data to the new ISP, such that the user information "follows" the user to the new city making the data closer to the user and easier to access.
- a master-server relationship may be established between a master ISP, which manages the information for the user, and a server ISP that interfaces directly with the user under control from the master ISP.
- In another embodiment, the data is transferred from one ISP to another as the client moves around the world, so that the ISP in the better position to serve the user is the one that delivers these services.
- ISP 250 includes Application Service Provider (ASP) 252, which provides computer-based services to customers over a network.
- Software offered using an ASP model is also sometimes called on-demand software or software as a service (SaaS).
- a simple form of providing access to a particular application program is by using a standard protocol such as HTTP.
- the application software resides on the vendor's system and is accessed by users through a web browser using HTML, by special purpose client software provided by the vendor, or other remote interface such as a thin client.
- Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users do not need to be experts in the technology infrastructure in the "cloud" that supports them. Cloud computing can be divided into different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers.
- the term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams and is an abstraction for the complex infrastructure it conceals.
- ISP 250 includes a Game Processing Server (GPS) 254 which is used by game clients to play single and multiplayer video games.
- Most video games played over the Internet operate via a connection to a game server.
- games use a dedicated server application that collects data from players and distributes it to other players. This is more efficient and effective than a peer-to-peer arrangement, but it requires a separate server to host the server application.
- Alternatively, the GPS establishes communication between the players, and their respective game-playing devices exchange information without relying on the centralized GPS.
- Dedicated GPSs are servers which run independently of the client. Such servers are usually run on dedicated hardware located in data centers, providing more bandwidth and dedicated processing power. Dedicated servers are the preferred method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games run on dedicated servers usually hosted by the software company that owns the game title, allowing them to control and update content.
- Broadcast Processing Server (BPS) 256 distributes audio or video signals to an audience. Broadcasting to a very narrow range of audience is sometimes called narrowcasting. The final leg of broadcast distribution is how the signal gets to the listener or viewer, and it may come over the air as with a radio station or TV station to an antenna and receiver, or may come through cable TV or cable radio (or "wireless cable") via the station or directly from a network.
- the Internet may also bring either radio or TV to the recipient, especially with multicasting allowing the signal and bandwidth to be shared.
- Traditionally, broadcasts have been delimited by a geographic region, such as national broadcasts or regional broadcasts. However, with the proliferation of fast internet, broadcasts are not defined by geographies, as the content can reach almost any country in the world.
- Storage Service Provider (SSP) 258 provides computer storage space and related management services. SSPs also offer periodic backup and archiving. By offering storage as a service, users can order more storage as required. Another major advantage is that SSPs include backup services, so users will not lose all their data if their computers' hard drives fail. Further, a plurality of SSPs can have total or partial copies of the user data, allowing users to access data in an efficient way independently of where the user is located or the device being used to access the data. For example, a user can access personal files on the home computer, as well as on a mobile phone while the user is on the move.
- Communications Provider 260 provides connectivity to the users.
- One kind of Communications Provider is an Internet Service Provider (ISP) which offers access to the Internet.
- the ISP connects its customers using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, DSL, cable modem, wireless or dedicated high-speed interconnects.
- the Communications Provider can also provide messaging services, such as e-mail, instant messaging, and SMS texting.
- Another type of Communications Provider is the Network Service Provider (NSP), which sells bandwidth or network access by providing direct backbone access to the Internet.
- Network service providers may consist of telecommunications companies, data carriers, wireless communications providers, Internet service providers, cable television operators offering high-speed Internet access, etc.
- Data Exchange 268 interconnects the several modules inside ISP 250 and connects these modules to users 262 via network 266.
- Data Exchange 268 can cover a small area where all the modules of ISP 250 are in close proximity, or can cover a large geographic area when the different modules are geographically dispersed.
- Data Exchange 268 can include a fast Gigabit Ethernet (or faster) within a cabinet of a data center, or an intercontinental virtual local area network (VLAN).
- Users 262 access the remote services with client device 264, which includes at least a CPU, a display, and I/O.
- the client device can be a PC, a mobile phone, a netbook, a PDA, etc.
- In one embodiment, ISP 250 recognizes the type of device used by the client and adjusts the communication method employed. In other cases, client devices use a standard communications method, such as HTML, to access ISP 250.
- Embodiments of the present invention may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like.
- the invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a network.
- the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations.
- the invention also relates to a device or an apparatus for performing these operations.
- the apparatus may be specially constructed for the required purpose, such as a special purpose computer.
- the computer can also perform other processing, program execution or routines that are not part of the special purpose, while still being capable of operating for the special purpose.
- The operations may be processed by a general purpose computer selectively activated or configured by one or more computer programs stored in the computer memory, cache, or obtained over a network. When data is obtained over a network, the data may be processed by other computers on the network, e.g., a cloud of computing resources.
- the embodiments of the present invention can also be defined as a machine that transforms data from one state to another state.
- the transformed data can be saved to storage and then manipulated by a processor.
- the processor thus transforms the data from one thing to another.
- the methods can be processed by one or more machines or processors that can be connected over a network. Each machine can transform data from one state or thing to another, and can also process data, save data to storage, transmit data over a network, display the result, or communicate the result to another machine.
- One or more embodiments of the present invention can also be fabricated as computer readable code on a computer readable medium.
- The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices.
- The computer readable medium can include a computer readable tangible medium distributed over a network-coupled computer system, so that the computer readable code is stored and executed in a distributed fashion.