WO2012122293A1 - Augmented reality mission generators (Générateurs de mission en réalité augmentée) - Google Patents

Augmented reality mission generators

Info

Publication number
WO2012122293A1
Authority
WO
WIPO (PCT)
Prior art keywords
mission
generator
data
template
mobile device
Application number
PCT/US2012/028109
Other languages
English (en)
Inventor
Brian Elan Lee
Michael Sean STEWART
James Stewartson
Original Assignee
Fourth Wall Studios, Inc.
Application filed by Fourth Wall Studios, Inc.
Publication of WO2012122293A1


Classifications

    • A63F13/79 — Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/65 — Generating or modifying game content before or while executing the game program, automatically by game devices or servers from real-world data, e.g. measurement in live racing competition
    • A63F13/216 — Input arrangements using geographical information, e.g. location of the game device or player using GPS
    • A63F13/217 — Input arrangements using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
    • A63F13/213 — Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/332 — Interconnection arrangements between game servers and game devices using wide area network [WAN] connections using wireless networks, e.g. cellular phone networks
    • A63F13/822 — Strategy games; Role-playing games
    • A63F13/92 — Video game devices specially adapted to be hand-held while playing
    • A63F2300/209 — Game platform characterized by low-level software layer, relating to hardware management, e.g. Operating System, Application Programming Interface
    • A63F2300/406 — Transmission via wireless network, e.g. pager or GSM
    • A63F2300/69 — Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A63F2300/807 — Role playing or strategy games
    • A63F2300/8082 — Virtual reality

Definitions

  • U.S. pat. publ. no. 2006/0223635 to Rosenberg takes simulated gaming a step further by combining simulated gaming objects and events with the real world, presenting simulated objects on a display.
  • Rosenberg fails to appreciate the dynamic nature of the real world and that each game player can have their own game play experience.
  • U.S. pat. publ. no. 2007/0281765 to Mullen discusses systems and methods for location based games. Although Mullen contemplates using the physical location of the user to correspond to a virtual location of a virtual character, Mullen fails to contemplate the use of ambient environmental information apart from location information when generating the game.
  • U.S. pat. publ. no. 2011/0081973 to Hall discusses a different location based game, but also fails to contemplate the use of ambient environmental information apart from location information when generating the game.
  • an augmented reality platform can be constructed to generate augmented reality missions for users.
  • a mission can be generated, possibly from a template, based on a user's environment or data collected about the user's environment.
  • Mission objects can have their attributes populated based on the environment data. For example, all red cars local to the user can become mission objects. As the missions are based on a user's environment, two users could experience quite different missions even though the missions are generated from the same template.
  • the inventive subject matter provides apparatus, systems and methods in which one can provide augmented or mixed reality experiences to users.
  • One of the many aspects of the inventive subject matter includes an augmented reality (AR) gaming system capable of generating one or more AR missions.
  • An AR mission can be presented to a user via a mobile device (e.g., portable computer, media player, cell phone, vehicle, game system, sensor, etc.) where the user can interact with the mission via the mobile device, or other interactive devices.
  • AR missions can be generated via an AR mission generator that includes a mission database storing one or more AR mission templates and an AR mission engine coupled with the database.
  • the AR mission engine can obtain environmental data apart from location information (e.g., GPS coordinates) from one or more remote sensing devices, including the user's mobile device, where the environmental data comprises a digital representation of a scene.
  • the AR mission engine can combine information derived from the digital representation of the scene with an AR mission template to construct a quest (i.e., an instantiated mission) for the user.
  • the AR mission engine can select a mission template from the database based on the environmental data and the location of the user's mobile device, and then populate the mission template with AR objects (e.g., objectives, rewards, goals, etc.) to flesh out the mission.
  • the attributes of the AR objects can also be populated based on the environmental data.
  • FIG. 1 is a schematic of an augmented reality system having an augmented reality mission generator.

Detailed Description
  • computing devices comprise a processor configured to execute software instructions stored on a tangible, non-transitory computer readable storage medium (e.g., hard drive, solid state drive, RAM, flash, ROM, etc.).
  • the software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus.
  • the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on SMS, MMS, HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods.
  • Data exchanges preferably are conducted over a packet-switched network, the Internet, LAN, WAN, VPN, PAN, or other type of packet switched network.
  • the disclosed techniques provide many advantageous technical effects including providing an augmented reality infrastructure capable of configuring one or more mobile devices to present a mixed reality interactive environment to users.
  • the mixed reality environment, and accompanying missions, can be constructed from external data obtained from sensors that are external to the infrastructure.
  • a mission can be populated with information obtained from satellites, Google® StreetView™, third-party mapping information, security cameras, kiosks, televisions or television stations, set top boxes, weather stations, radios or radio stations, web sites, cellular towers, or other data sources.
  • The term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously.
  • The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
  • FIG. 1 presents an overview of one embodiment of an augmented or mixed reality environment 100 where a user can obtain one or more missions from an AR mission generator 110.
  • each user can utilize a mobile device 102 to obtain sensor data from one or more sensor(s) 104 related to a scene 120 or the user's environment that is separate from a user's location information.
  • a user's mobile device 102 can exchange the collected environmental data or a digital representation of the scene 120 with the AR mission generator 110.
  • Data exchanges preferably are conducted over a network 130, which could include, for example, cell networks, mesh networks, Internet, LANs, WANs, VPNs, PANs, or other types of networks or combinations thereof.
  • the AR mission generator 110 can generate one or more missions for the user, at least in part based on the obtained environment data.
  • the mobile device 102 could also transmit location information such as GPS coordinates and/or cellular triangulation information to the AR mission generator 110.
  • the mobile device 102 is presented as a smart phone, which represents one of many different types of devices that can integrate into the overall AR environment 100.
  • Mobile devices can include, for example, smart phones and other wireless telephones, laptops, netbooks, tablet PCs and other mobile computers, vehicles, sensors (e.g., a camera), personal digital assistants, MP3 or other media players, watches, and gaming platforms.
  • Other types of devices can include electronic picture frames, desktop computers, appliances (e.g., STBs, kitchen appliances, etc.), kiosks, non-mobile sensors, media players, game consoles, televisions, or other types of devices.
  • Preferred devices have a communication link and offer a presentation system (e.g., display, speakers, vibrators, etc.) for presenting AR data to the user.
  • Environmental data or a digital representation of the scene 120 can include data from multiple sources or sensors.
  • For example, data can be captured by a sensor 122 (e.g., a camera).
  • Contemplated sensors can include, for example, microphones, magnetometers, accelerometers, biosensors, still and video cameras, weather sensors, optical sensors, or other types of sensors.
  • the types of data used to form a digital representation of the scene can cover a wide range of modalities including image data, audio data, haptic data, or other modalities.
  • additional data can include weather data, location data, orientation data, movement data, biometrics data, or other types of data.
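A digital representation along these lines might be modeled as a simple multi-modal record. The following sketch is illustrative only; the class and field names are assumptions for discussion and do not appear in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class SceneRepresentation:
    """Illustrative multi-modal digital representation of a scene 120."""
    image_data: bytes = b""                      # still frames or video
    audio_data: bytes = b""                      # captured ambient audio
    weather: dict = field(default_factory=dict)  # e.g., {"condition": "rain"}
    orientation: dict = field(default_factory=dict)
    biometrics: dict = field(default_factory=dict)
    location: tuple = (0.0, 0.0)                 # kept distinct from ambient data

scene = SceneRepresentation(weather={"condition": "rain"},
                            location=(34.05, -118.25))
print(scene.weather["condition"])  # rain
```

Keeping location as a separate field mirrors the document's emphasis that ambient environmental data is treated apart from location information.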
  • the AR mission generator 110 can include one or more modules or components configured to support the roles or responsibilities of the AR mission generator 110.
  • the AR mission generator 110 can include an AR mission template database 112 and an AR mission engine 114.
  • Although the AR mission template database 112 and AR mission engine 114 are shown as local to the AR mission generator 110, it is contemplated that one or both of the AR mission template database 112 and AR mission engine 114 can be separate from, and located locally or remotely with respect to, the AR mission generator 110.
  • the AR mission template database 112 can store a plurality of AR mission template objects where each mission template object comprises attributes or metadata describing characteristics of a mission.
  • the mission template objects can be stored as an XML file or other serialized format.
  • a mission template object can include a wide spectrum of information including, for example, a name/ID of the mission, a type of mission (e.g., dynamic, chain, etc.), goals, supporting objects, rewards, narratives, digital assets (e.g., video, audio, etc.), mission requirements (e.g., required weapons, achievements, user level, number of players, etc.), location requirements (e.g., indoors or outdoors), conditions, programmatic instructions, links to other missions, or other information that can be used to instantiate a mission.
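Since the document notes that template objects can be stored as XML or another serialized format, a hypothetical serialized template might look like the sketch below. The element and attribute names are illustrative assumptions, not drawn from the patent:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML serialization of a mission template object.
template_xml = """
<mission_template id="mall-collect-01" type="dynamic">
  <name>Collection Run</name>
  <goal verb="collect" count="5"/>
  <reward points="100"/>
  <requirements min_user_level="2" location="indoors"/>
</mission_template>
"""

root = ET.fromstring(template_xml)
print(root.get("type"))               # dynamic
print(root.find("goal").get("verb"))  # collect
```

A serialized form like this would let the mission database store templates generically while the engine fills in concrete AR objects at instantiation time.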
  • the AR mission generator 110 is illustrated as being remote relative to the scene 120 or mobile device 102. However, it is specifically contemplated that some or all of the features of the mission generator 110, AR mission engine 114, and/or AR mission template database 112, for example, can be integrated into the mobile device 102. In such embodiments, information can be exchanged through an application program interface (API) or other suitable interface.
  • the AR mission engine 114 or other components can comprise a distal computing server, a distributed computing platform, or even an AR computing platform.
  • the AR mission engine 114 is preferably configured to obtain environmental data from the user's mobile device 102 about the scene 120 proximate to the mobile device 102. Based on the environmental data, the AR mission engine 114 can determine characteristics of the scene 120.
  • Scene characteristics can include user identification and capabilities of the mobile device 102 including, for example, available sensors 104, screen size, processor speed, available memory, presence of a camera or other imaging sensor. Scene characteristics can also include weather conditions, visual images, location information, orientation, captured audio, presence and type of real-world objects, or other types of characteristics.
  • the AR mission engine 114 can compare the characteristics to the requirements, attributes, or conditions associated with the stored AR mission template objects to select a mission template. Once selected or otherwise obtained, the AR mission engine 114 can instantiate a mission for the user from the selected mission template object. It is contemplated that the AR mission generator 110 can configure the mobile device 102 to present the generated mission.
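One way to picture this characteristics-to-requirements comparison is a simple matching rule. The sketch below is a minimal illustration under assumed data shapes; the patent does not prescribe a particular matching algorithm:

```python
def select_template(templates, characteristics):
    """Return the first template whose requirements are all satisfied
    by the observed scene characteristics (illustrative matching rule)."""
    for t in templates:
        if all(characteristics.get(k) == v for k, v in t["requirements"].items()):
            return t
    return None

templates = [
    {"name": "night patrol", "requirements": {"time": "night", "location": "outdoors"}},
    {"name": "mannequin hunt", "requirements": {"location": "indoors", "venue": "mall"}},
]
characteristics = {"location": "indoors", "venue": "mall", "has_camera": True}
print(select_template(templates, characteristics)["name"])  # mannequin hunt
```

Extra observed characteristics (such as `has_camera` above) are simply ignored unless a template requires them, which keeps template authoring independent of sensor availability.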
  • a mission template object includes a defined grammar having verbs that define user actions with respect to one or more AR objects associated with a mission.
  • an AR mission template object might have several verbs that define a mission with respect to the user's actions.
  • Contemplated verbs include, for example, read, view, deliver, fire (e.g., a weapon, etc.), upgrade, collect, converse, travel, or other actions.
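The verb set above comes from the document; the step structure in this sketch (each mission step pairing a verb with an AR object) is an assumption for illustration:

```python
# Verbs named in the document; the (verb, ar_object) step shape is hypothetical.
VERBS = {"read", "view", "deliver", "fire", "upgrade",
         "collect", "converse", "travel"}

def validate_mission(steps):
    """Return True when every (verb, ar_object) step uses a defined verb."""
    return all(verb in VERBS for verb, _ar_object in steps)

steps = [("travel", "food court"),
         ("collect", "mannequin photo"),
         ("deliver", "field report")]
print(validate_mission(steps))                    # True
print(validate_mission([("teleport", "base")]))   # False
```

A closed verb grammar like this would let a template author compose missions from a fixed action vocabulary that the mobile device already knows how to render and track.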
  • the AR objects associated with a mission template can also be stored as a template, or rather as AR object templates.
  • the selected AR mission template object can be populated based on the environmental data.
  • a user could be in a shopping mall and log in to the AR mission generator 110 via their mobile phone to obtain a mission.
  • the AR mission engine 114 recognizes from the user's location (e.g., based on GPS coordinates) that the user is in a mall, and selects a mission that requires the user to collect objects.
  • the AR mission engine 114 instantiates AR objects as mannequins, and the mission requires that the user travel around the mall photographing mannequins (e.g., collecting the AR objects) to complete the mission.
  • the mobile device 102 could be configured to identify the mannequins, or other objects of interest, by their associated features, such as by using image recognition software.
  • Populating attributes or features of a mission or associated AR objects can also be achieved through object recognition.
  • the AR mission engine 114, perhaps operating within the mobile device 102, can recognize real-world objects in the scene 120 and use the objects' attributes to populate attributes of the one or more AR objects 124.
  • the attributes can be simply observed or looked-up from a database based on object recognition algorithms (e.g., SIFT, vSLAM, Viper, etc.).
  • a user may capture a picture of a scene having a plurality of trees.
  • the trees can be recognized by the AR mission engine, and AR objects can be generated based upon the trees' attributes (e.g., size, leaf color, distance from mobile device, etc.).
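The tree example can be sketched as a simple attribute-copying step. This is a simplified stand-in for the output of a recognition pipeline such as SIFT or vSLAM; the dictionary shapes and field names are assumptions for illustration:

```python
def populate_ar_objects(recognized):
    """Derive AR objects from recognized real-world objects, copying
    attributes such as size and color into the AR objects' attributes."""
    ar_objects = []
    for obj in recognized:
        ar_objects.append({
            "kind": "collectible",
            "anchor": obj["type"],                     # real-world anchor object
            "scale": obj["size_m"],                    # inherit observed size
            "tint": obj.get("leaf_color", "green"),    # default when unobserved
        })
    return ar_objects

trees = [{"type": "tree", "size_m": 4.2, "leaf_color": "red"},
         {"type": "tree", "size_m": 2.8}]
print(len(populate_ar_objects(trees)))  # 2
```

Deriving AR attributes from observed objects is what makes two users running the same template see different missions: each instantiation inherits its attributes from that user's scene.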
  • the AR objects associated with a mission can range across a full spectrum of objects from completely real-world objects through completely virtual objects.
  • Exemplary AR objects can include, for example, a mission objective, a reward, an award point, a currency, a relationship, a virtual object, a real-world object, a promotion, a coupon, or other types of objects.
  • the AR objects can be integrated into the real-world via mobile device 102. For example, as the user pans and tilts their mobile device 102, the AR objects associated with the mission could be superimposed (overlaid) on the captured scene 120 while also maintaining their proper location and orientation with respect to real-world objects within the scene 120. Superimposing images of AR objects on a real-world image can be accomplished by many techniques.
  • a mission can be customized for a specific user based on the user's specific environment. Still, the missions can be efficiently based on just a few types of mission templates.
  • One especially interesting type of mission is a dynamic mission that can be fully customized for the user. Dynamic missions can be single one-off missions constructed in real-time, if desired, based on the obtained environmental data. While completion of a dynamic mission may not advance a story, users may obtain rewards for completing the mission including, for example, points, levels, currency, weapons, and experience. Examples of dynamic missions include shooting ten boars, collecting five coins, going on a night patrol, finding a treasure, and so forth.
  • Another interesting type of mission is a chain mission that can be linked with preceding or succeeding missions to form a story arc. Chain missions can be constructed with more thought to create a greater level of immersion for the user.
  • missions have been presented as a single player platform. However, one should appreciate that missions can also comprise multi-player missions requiring two or more users. When multiple users are involved, new types of interactions can occur. Some multi-player missions might require cooperative objectives, while other multi-player missions might comprise counter objectives for the players where the players oppose or compete against each other. Because of the AR nature of the missions, it is contemplated that players could be in a variety of disparate locations while interacting with one another. An exemplary mission having counter objectives could be to infiltrate an enemy's base or to defend a fort.
  • missions are associated with game play. Still, missions can bridge across many markets beyond game play. Other types of missions can be constructed as an exercise program, an advertising campaign, or even following an alternative navigation route home. By constructing various types of missions for a user, the user can be enticed to discover new businesses or opportunities, possibly commercial opportunities.
  • Contemplated AR systems can include an analysis engine that correlates player attributes against mission objectives. Collecting and tracking of such information can be advantageous to businesses when targeting promotions or missions to players or other individuals.
  • ambient environmental data separate from the mobile device's location can be received.
  • An AR mission generator can select an AR mission template from a mission database coupled to the AR mission generator. It is contemplated that the AR mission template can be selected based at least in part upon the ambient environmental data.
  • a mission can be generated using the AR mission generator and the selected AR mission template, where the mission is based on at least a portion of the ambient environmental data.
  • a mobile device can be configured via the AR mission generator to present the generated mission to a user.
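The method steps above (receive ambient data, select a template based on it, generate a mission, configure the device to present it) can be sketched end to end. Everything in this snippet is an illustrative assumption, from the tag-based selection rule to the returned payload shape:

```python
def generate_mission(ambient_data, templates):
    """End-to-end sketch: receive ambient environmental data, select a
    matching template, and instantiate a mission payload that a mobile
    device could be configured to present."""
    template = next(
        (t for t in templates if t["requires"] in ambient_data["tags"]), None)
    if template is None:
        return None  # no template matches the current environment
    return {"title": template["title"],
            "objective": template["objective"].format(**ambient_data)}

templates = [{"title": "Rain Runner", "requires": "rain",
              "objective": "Find shelter within {radius_m} meters"}]
ambient = {"tags": {"rain", "urban"}, "radius_m": 50}

mission = generate_mission(ambient, templates)
print(mission["objective"])  # Find shelter within 50 meters
```

Note that nothing in the pipeline consumes GPS coordinates: selection keys off ambient tags (here, weather), matching the claim that the ambient data is separate from the device's location.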

Abstract

Augmented reality (AR) mission generators are presented that generate missions based on environmental data unrelated to a user's location. The environmental data can be obtained from the user's mobile device or from other sensors or third-party information. Missions can be generated from an AR mission template stored in a mission database and presented to the user on their mobile device.
PCT/US2012/028109 2011-03-07 2012-03-07 Augmented reality mission generators WO2012122293A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161450052P 2011-03-07 2011-03-07
US61/450,052 2011-03-07

Publications (1)

Publication Number Publication Date
WO2012122293A1 (fr) 2012-09-13

Family

ID=45976510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/028109 WO2012122293A1 (fr) 2011-03-07 2012-03-07 Augmented reality mission generators

Country Status (2)

Country Link
US (1) US20120231887A1 (fr)
WO (1) WO2012122293A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104235B2 (en) 2013-08-22 2015-08-11 International Business Machines Corporation Modifying information presented by an augmented reality device

Families Citing this family (43)

Publication number Priority date Publication date Assignee Title
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US7899243B2 (en) 2000-11-06 2011-03-01 Evryx Technologies, Inc. Image capture and identification system and process
CA2694200C (fr) 2007-07-27 2015-06-16 Intertrust Technologies Corporation Systemes et procedes de publication de contenu
EP2193825B1 (fr) * 2008-12-03 2017-03-22 Alcatel Lucent Dispositif mobile pour applications de réalité augmentée
US9573064B2 (en) * 2010-06-24 2017-02-21 Microsoft Technology Licensing, Llc Virtual and location-based multiplayer gaming
WO2013078345A1 (fr) * 2011-11-21 2013-05-30 Nant Holdings Ip, Llc Service de facturation d'abonnement, systèmes et procédés associés
US20130281202A1 (en) * 2012-04-18 2013-10-24 Zynga, Inc. Method and apparatus for providing game elements in a social gaming environment
US9174128B2 (en) * 2012-04-26 2015-11-03 Zynga Inc. Dynamic quests in game
US9539498B1 (en) 2012-07-31 2017-01-10 Niantic, Inc. Mapping real world actions to a virtual world associated with a location-based game
US9604131B1 (en) 2012-07-31 2017-03-28 Niantic, Inc. Systems and methods for verifying player proximity within a location-based game
US9669293B1 (en) 2012-07-31 2017-06-06 Niantic, Inc. Game data validation
US9128789B1 (en) 2012-07-31 2015-09-08 Google Inc. Executing cross-cutting concerns for client-server remote procedure calls
US9621635B1 (en) 2012-07-31 2017-04-11 Niantic, Inc. Using side channels in remote procedure calls to return information in an interactive environment
US9669296B1 (en) 2012-07-31 2017-06-06 Niantic, Inc. Linking real world activities with a parallel reality game
US9226106B1 (en) 2012-07-31 2015-12-29 Niantic, Inc. Systems and methods for filtering communication within a location-based game
US9782668B1 (en) 2012-07-31 2017-10-10 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US9338622B2 (en) 2012-10-04 2016-05-10 Bernt Erik Bjontegard Contextually intelligent communication systems and processes
US8968099B1 (en) 2012-11-01 2015-03-03 Google Inc. System and method for transporting virtual objects in a parallel reality game
US20140128161A1 (en) * 2012-11-06 2014-05-08 Stephen Latta Cross-platform augmented reality experience
US20140201205A1 (en) * 2013-01-14 2014-07-17 Disney Enterprises, Inc. Customized Content from User Data
WO2014181892A1 (fr) * 2013-05-08 2014-11-13 Square Enix Holdings Co., Ltd. Information processing apparatus, control method, and program
US10463953B1 (en) 2013-07-22 2019-11-05 Niantic, Inc. Detecting and preventing cheating in a location-based game
US9545565B1 (en) 2013-10-31 2017-01-17 Niantic, Inc. Regulating and scoring player interactions within a virtual world associated with a location-based parallel reality game
WO2015167549A1 (fr) * 2014-04-30 2015-11-05 Longsand Limited Augmented gaming platform
US9861894B2 (en) * 2015-09-29 2018-01-09 International Business Machines Corporation Dynamic personalized location and contact-aware games
US10115234B2 (en) 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
EP3465331A4 (fr) * 2016-06-06 2020-01-29 Warner Bros. Entertainment Inc. Mixed reality system
US10384130B2 (en) * 2016-08-05 2019-08-20 AR Sports LLC Fantasy sport platform with augmented reality player acquisition
AU2017401485A1 (en) * 2017-03-02 2019-09-19 Motorola Solutions, Inc. Method and apparatus for gathering visual data using an augmented-reality application
WO2018160081A1 (fr) * 2017-03-02 2018-09-07 Motorola Solutions, Inc. Method and apparatus for gathering visual data using an augmented-reality application
WO2018186178A1 (fr) * 2017-04-04 2018-10-11 ソニー株式会社 Information processing device, information processing method, and program
US10717005B2 (en) * 2017-07-22 2020-07-21 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US10741088B1 (en) 2017-09-29 2020-08-11 DroneUp, LLC Multiplexed communications for coordination of piloted aerial drones enlisted to a common mission
CN108245881A (zh) * 2017-12-29 2018-07-06 武汉市马里欧网络有限公司 AR-based three-dimensional jigsaw model building system
US20210263484A1 (en) * 2018-07-05 2021-08-26 Themissionzone, Inc. Systems and methods for manipulating the shape and behavior of a physical space
US11410488B2 (en) * 2019-05-03 2022-08-09 Igt Augmented reality virtual object collection based on symbol combinations
US10873951B1 (en) 2019-06-04 2020-12-22 Motorola Solutions, Inc. Method and device to minimize interference in a converged LMR/LTE communication device
US11210857B2 (en) 2019-09-26 2021-12-28 The Toronto-Dominion Bank Systems and methods for providing an augmented-reality virtual treasure hunt
US11574423B2 (en) 2021-01-29 2023-02-07 Boomanity Corp. A Delaware Corporation Augmented reality (AR) object communication and interaction system and method
US11941558B2 (en) * 2021-04-08 2024-03-26 Raytheon Company Intelligence preparation of the battlefield (IPB) collaborative time machine with real-time options

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2385238A (en) * 2002-02-07 2003-08-13 Hewlett Packard Co Using virtual environments in wireless communication systems
US20040041788A1 (en) 2002-08-28 2004-03-04 Lockheed Martin Corporation Interactive virtual portal
US6771294B1 (en) 1999-12-29 2004-08-03 Petri Pulli User interface
US6951515B2 (en) 1999-06-11 2005-10-04 Canon Kabushiki Kaisha Game apparatus for mixed reality space, image processing method thereof, and program storage medium
US6972734B1 (en) 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US20060223635 (en) 2005-04-04 2006-10-05 Outland Research Method and apparatus for an on-screen/off-screen first person gaming experience
US20070104348A1 (en) 2000-11-06 2007-05-10 Evryx Technologies, Inc. Interactivity via mobile image recognition
US20070281765 (en) 2003-09-02 2007-12-06 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
WO2009016186A2 (fr) * 2007-07-31 2009-02-05 Jochen Hummel Method for the computer-assisted production of an interactive three-dimensional virtual reality
US7564469B2 (en) 2005-08-29 2009-07-21 Evryx Technologies, Inc. Interactivity with a mixed reality
EP2098271A2 (fr) * 2008-02-21 2009-09-09 Palo Alto Research Center Incorporated Mixed reality gaming platform with location information
US20110081973A1 (en) 2005-11-30 2011-04-07 Hall Robert J Geogame for mobile device
US20110319148A1 (en) 2010-06-24 2011-12-29 Microsoft Corporation Virtual and location-based multiplayer gaming

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003095050A2 (fr) * 2002-05-13 2003-11-20 Consolidated Global Fun Unlimited, Llc Method and system for interacting with simulated phenomena
US20050009608A1 (en) * 2002-05-13 2005-01-13 Consolidated Global Fun Unlimited Commerce-enabled environment for interacting with simulated phenomena
US20030232649A1 (en) * 2002-06-18 2003-12-18 Gizis Alexander C.M. Gaming system and method
US8585476B2 (en) * 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
US20060223637A1 (en) * 2005-03-31 2006-10-05 Outland Research, Llc Video game system combining gaming simulation with remote robot control and remote robot feedback
US8355410B2 (en) * 2007-08-17 2013-01-15 At&T Intellectual Property I, L.P. Location-based mobile gaming application and method for implementing the same using a scalable tiered geocast protocol
US8506404B2 (en) * 2007-05-07 2013-08-13 Samsung Electronics Co., Ltd. Wireless gaming method and wireless gaming-enabled mobile terminal
GB2449694B (en) * 2007-05-31 2010-05-26 Sony Comp Entertainment Europe Entertainment system and method
US9901828B2 (en) * 2010-03-30 2018-02-27 Sony Interactive Entertainment America Llc Method for an augmented reality character to maintain and exhibit awareness of an observer
US8251819B2 (en) * 2010-07-19 2012-08-28 XMG Studio Sensor error reduction in mobile device based interactive multiplayer augmented reality gaming through use of one or more game conventions
US8267793B2 (en) * 2010-08-17 2012-09-18 Samsung Electronics Co., Ltd. Multiplatform gaming system
US8425295B2 (en) * 2010-08-17 2013-04-23 Paul Angelos BALLAS System and method for rating intensity of video games
US20120122570A1 (en) * 2010-11-16 2012-05-17 David Michael Baronoff Augmented reality gaming experience

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104235B2 (en) 2013-08-22 2015-08-11 International Business Machines Corporation Modifying information presented by an augmented reality device
US9104236B2 (en) 2013-08-22 2015-08-11 International Business Machines Corporation Modifying information presented by an augmented reality device

Also Published As

Publication number Publication date
US20120231887A1 (en) 2012-09-13

Similar Documents

Publication Publication Date Title
US20120231887A1 (en) Augmented Reality Mission Generators
JP6905154B2 (ja) Verifying a player's real-world location using activity within a parallel reality game
US9573064B2 (en) Virtual and location-based multiplayer gaming
CN109445662B (zh) Operation control method and apparatus for a virtual object, electronic device, and storage medium
CA2621191C (fr) Interactivity via mobile image recognition
KR101736477B1 (ko) Local sensor augmentation of stored content and AR communication
JP7145976B2 (ja) Information display method for a virtual object, and application program, apparatus, terminal and server therefor
CN109529356B (zh) Battle result determination method and apparatus, and storage medium
JP7239668B2 (ja) Verifying a player's real-world location using image data of landmarks corresponding to a verification route
WO2012007764A1 (fr) Augmented reality system
CN110448908B (zh) Method, apparatus, device and storage medium for using a scope in a virtual environment
JP2021535806A (ja) Method, device and storage medium for observing a virtual environment
CN113058264A (zh) Virtual scene display method, virtual scene processing method, apparatus and device
US20230206268A1 (en) Spectator and participant system and method for displaying different views of an event
CN112569596A (zh) Video picture display method and apparatus, computer device, and storage medium
CN112569607A (zh) Display method, apparatus, device and medium for pre-purchased items
TW202300201A Repeatability prediction of points of interest
CN114130012A (zh) User interface display method, apparatus, device, medium and program product
CN111679879B (zh) Account rank information display method, apparatus, terminal and readable storage medium
CN110585708B (zh) Method and apparatus for landing from an aircraft in a virtual environment, and readable storage medium
CN112169321B (zh) Mode determination method, apparatus, device and readable storage medium
CN111589113B (zh) Virtual marker display method, apparatus, device and storage medium
CN113633970A (zh) Action effect display method, apparatus, device and medium
CN113041613A (zh) Match replay method, apparatus, terminal and storage medium
CN113144595A (zh) Virtual road generation method, apparatus, terminal and storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12715249

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (PCT application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 12715249

Country of ref document: EP

Kind code of ref document: A1