WO2022147526A1 - User-specific interactive object systems and methods - Google Patents
User-specific interactive object systems and methods
- Publication number
- WO2022147526A1 (PCT Application No. PCT/US2022/011104)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- interactive object
- interactive
- user
- special effect
- environment
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/23—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/98—Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63G—MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
- A63G31/00—Amusement arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1025—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection
- A63F2300/1031—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection using a wireless connection, e.g. Bluetooth, infrared connections
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/105—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1056—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving pressure sensitive buttons
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1062—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to a type of game, e.g. steering wheel
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02J—CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
- H02J50/00—Circuit arrangements or systems for wireless supply or distribution of electric power
- H02J50/001—Energy harvesting or scavenging
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02J—CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
- H02J50/00—Circuit arrangements or systems for wireless supply or distribution of electric power
- H02J50/20—Circuit arrangements or systems for wireless supply or distribution of electric power using microwaves or radio frequency waves
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02J—CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
- H02J50/00—Circuit arrangements or systems for wireless supply or distribution of electric power
- H02J50/40—Circuit arrangements or systems for wireless supply or distribution of electric power using two or more transmitting or receiving devices
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02J—CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
- H02J7/00—Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
- H02J7/0013—Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries acting upon several batteries simultaneously or sequentially
Definitions
- the present disclosure relates generally to objects for use in interactive environments, such as a game environment or an amusement park. More specifically, embodiments of the present disclosure relate to an addressable interactive object that facilitates interactive effects, such as one or more special effects.
- an interactive object comprising a housing, a special effects system disposed in or on the housing, a controller disposed in or on the housing that controls operation of the special effects system, and a plurality of environmental sensors configured to generate sensor data.
- the embodiment also includes a central controller that operates to receive a plurality of user profiles from a plurality of users and to receive sensor data from the plurality of environmental sensors of an interactive environment. The central controller identifies a user of the plurality of users based on the sensor data, wherein the identified user is associated with the interactive object.
- the central controller then characterizes a movement or action of the interactive object based on the environmental sensor data and communicates instructions to the controller of the interactive object to activate a special effect of the special effect system, wherein the instructions are based on the user profile of the plurality of user profiles, the user profile being associated with the identified user, and the characterized movement or action of the interactive object.
- a method comprising receiving sensor data from a plurality of sensors in an interactive environment; identifying a plurality of interactive objects and a plurality of users in the interactive environment based on the sensor data; associating an identified interactive object with an identified user; tracking the movement of the identified interactive object using the sensor data; and communicating instructions to the identified interactive object to activate an on-board special effect based on the tracked movement and a user profile of the identified user.
- an interactive object comprising a housing and a detectable marker disposed on or in the housing that operates to reflect a first portion of electromagnetic radiation from an environment.
- communication circuitry on or in the housing operates to receive a second portion of the electromagnetic radiation from the environment, transmit interactive object identification information of the interactive object responsive to receiving the second portion of electromagnetic radiation, and receive special effect instructions.
- a controller on or in the housing receives the special effect instructions and generates a special effect command.
- the interactive object also comprises a special effects system that receives the special effect command and activates a special effect based on the special effect command.
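The claimed flow — link an object to a user, characterize a movement from sensor data, then instruct the object to activate a profile-specific effect — can be sketched in code. This is an illustrative assumption, not the patent's implementation: all class and method names (`CentralController`, `InteractiveObject`, `handle_sensor_data`) and data shapes are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InteractiveObject:
    """Hypothetical stand-in for the handheld/wearable object 20."""
    object_id: str
    linked_user: Optional[str] = None
    activated_effects: list = field(default_factory=list)

    def activate_effect(self, effect: str) -> None:
        # On-board special effect activated on command from the controller.
        self.activated_effects.append(effect)

class CentralController:
    """Hypothetical stand-in for the central controller 18."""
    def __init__(self):
        self.user_profiles = {}  # user_id -> profile dict
        self.objects = {}        # object_id -> InteractiveObject

    def register_profile(self, user_id, profile):
        self.user_profiles[user_id] = profile

    def register_object(self, obj):
        self.objects[obj.object_id] = obj

    def associate(self, object_id, user_id):
        # Link an identified interactive object to an identified user.
        self.objects[object_id].linked_user = user_id

    def handle_sensor_data(self, object_id, movement):
        # Select an effect from the linked user's profile for the
        # characterized movement; a dict lookup stands in for the
        # real characterization and selection logic.
        obj = self.objects[object_id]
        profile = self.user_profiles.get(obj.linked_user, {})
        effect = profile.get("effects", {}).get(movement, "default_glow")
        obj.activate_effect(effect)
        return effect
```

A registered user whose profile maps a "wave" movement to a "sparkle" effect would receive that effect on the object, while unmapped movements fall back to a default.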
- FIG. 1 is a schematic illustration of an embodiment of an interactive object system, in accordance with present techniques;
- FIG. 2 is a schematic illustration of features of an embodiment of the interactive object system, in accordance with present techniques;
- FIG. 3 is a flow diagram of the interactive object system, in accordance with present techniques;
- FIG. 4 is a schematic illustration of the communication system of the interactive object system, in accordance with present techniques;
- FIG. 5 is a flow diagram of a method of assigning a user profile, in accordance with present techniques; and
- FIG. 6 is a flow diagram of a method for detecting an interactive object and facilitating effect emission of the interactive object, in accordance with present techniques.
- Users in an interactive environment or participating in an immersive experience may enjoy carrying a handheld object or wearing a costume element that aligns with a theme, such as a sword, stuffed animal, hat, wand, jewelry, or other prop.
- interactions are often mediated by external systems that recognize the object (e.g., via object recognition or wireless communication) and activate external actions based on the recognition.
- Such an arrangement permits the objects to be implemented as relatively inexpensive objects, with the more complex and costly elements of the interactions being off-board or external to the passive device.
- while feedback systems can be situated as components of the environment, the ability to generate feedback in or on a handheld or worn device can facilitate a deeper level of immersion in the environment.
- the disclosed interactive object techniques permit user and/or interactive object identification and targeting to mediate elements of an interactive environment.
- the system locates a particular user within a crowd and activates or directs special effects to an interactive object carried or worn by the user without necessarily activating other interactive objects in the area.
- the disclosed techniques operate to identify relatively simple interactive objects that do not necessarily include any communication packages. Accordingly, the identification is based on environmental data that is collected within an interactive environment and that is used to identify and locate both user and interactive object. Further, the disclosed techniques account for situations in which a particular user is using an interactive object that is not pre-registered to the system or otherwise linked to the particular user.
- users may switch or share interactive objects within a family group. While wireless communication of object identification information to the system permits identification of the interactive object itself, identification information alone would not reveal that the user carrying the object has changed throughout the day.
- the present techniques permit dynamic updating and addressing of interactive objects in a manner that is specific to the identity of the particular user interacting with the object at a particular time.
- the interactive object system collects data from multiple sensing modalities, which may be pooled and/or arbitrated, to permit the user to be identified with greater accuracy while utilizing minimal or reduced processing power.
- the data can include radio frequency data, optical sensing, 3D time-of-flight sensing, and other sensing modalities. This facilitates identification of a specific user in a crowd and leads to personalizing the effect to the user in a manner that may be linked to the identity of the interactive object carried by the user. This positive user identification also facilitates the linking of an interactive object to a specific user, which further personalizes the user experience in a crowded environment.
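Pooling and arbitrating multiple sensing modalities could be sketched as a weighted combination of per-modality identification reports. Weighted-sum arbitration is an assumption for illustration; the text states only that modality data may be pooled and/or arbitrated.

```python
from collections import defaultdict

def arbitrate_identifications(modality_reports):
    """Pool (user_id, confidence) reports from several sensing modalities
    (e.g., RF, optical, 3D time-of-flight) and return the user with the
    highest combined confidence. Summing confidences is an illustrative
    assumption, not the patent's stated arbitration rule."""
    scores = defaultdict(float)
    for reports in modality_reports.values():
        for user_id, confidence in reports:
            scores[user_id] += confidence
    return max(scores, key=scores.get)
```

For example, a user seen weakly by the camera but strongly by an RF reader can still out-score a user seen by only one modality.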
- Such interactive objects may be, in an embodiment, a prop or toy used within an interactive environment to permit greater variability in special effect control by using individualized user interactives.
- the user interactive enables a user profile to be associated with the interactive. This profile is used to select special effects that are based upon the user profile.
- while the interactive object may be a toy, prop, or handheld object, such as a sword, wand, token, book, ball, or figurine, the disclosed embodiments may be used with other types of objects.
- Such objects may include wearable objects, such as clothing, jewelry, bracelets, headgear, medallions, or glasses.
- the object may be a prop or scenery item within an interactive environment.
- the interactive environment may be part of an immersive area, such as an amusement park, an entertainment complex, a retail establishment, etc.
- the disclosed systems and methods may include at least one interactive environment of a themed area having a common theme and may additionally include different interactive environments within the single themed area. Further, the disclosed systems and methods may include additional or other interactive environments having different themes but that are contained within an immersive area, such as a theme park or entertainment venue.
- the interactive environment may be a live show, wherein users in the audience may participate using their objects.
- the interactive environment may include a certain area within which a user can interact with interactive elements within the area.
- an interactive environment may also include different locations that are geographically separated from one another or that are dispersed throughout an amusement park.
- the interactive environment may also be in a remote location separate from an immersive area.
- the user may be able to establish an interactive environment at their home or any other location via a user electronic device that may be configured to interact with the object.
- FIG. 1 generally illustrates the manner in which an interactive object control system 10 may be integrated within an interactive environment in accordance with present embodiments.
- a plurality of users 12 (e.g., guests) may be present in the interactive environment 14.
- each user 12 may have a handheld or wearable interactive object 20 that moves with the user 12 throughout the interactive environment 14.
- the user’s interactive object 20 facilitates output of user-specific special effects based on the interactions of the user 12 via the interactive object 20 within the interactive environment 14.
- the effects may provide feedback to the user 12 that an interaction is occurring or has successfully occurred.
- the interactive object control system 10 includes a plurality of environmental sensors 16 dispersed throughout the interactive environment 14 and, in certain embodiments, between the various areas of an interactive environment 14 or in different interactive environments.
- the environmental sensors 16 generate sensor data.
- the sensor data may be user data, such as data from images or camera feeds of one or more users in the interactive environment 14. Additionally or alternatively, the sensor data may be interactive object data, such as data indicative of a location or motion of the interactive object 20 in the interactive environment 14.
- the acquired sensor data may be used to track the user’s location within and between different interactive environments 14.
- the environmental sensors 16 send the sensor data to a central controller 18, and the data can be processed to identify the user 12 during the user’s interactions within the interactive environment 14.
- the environmental sensors 16 may also track user movement throughout the interactive environment 14.
- the user data of all the users 12 in the interactive environment can be pooled and/or arbitrated to identify one or more users 12 within the interactive environment 14.
- the sensor data may be used to identify one or more interactive objects 20 within the interactive environment 14.
- the interactive object control system 10 may target user-specific special effects to a particular interactive object 20 and/or a particular location in the interactive environment 14 based on the acquired data.
- the user 12 may experience the interactive object 20 as having the appearance of a passive or low technology device.
- the user’s own actions using the interactive object 20 may trigger an on-board special effect on the interactive object 20 and/or a location-specific special effect in the interactive environment 14.
- effects can be selected by the central controller 18 based on user actions, user profile information, or historical or other data associated with the user 12 and/or the interactive object 20.
- the effects that are experienced by the user 12, and that may also be visible to other nearby users 12, are variable or unpredictable within the interactive environments 14, leading to increased enjoyment.
- the interactive object control system 10 is capable of tracking real-time user location.
- the user 12 may leave one interactive environment 14a and enter a second interactive environment 14b.
- the user’s location is tracked via the environmental sensors 16, so that the user’s location can be transmitted to the central controller 18, and user linkage to the interactive object 20 can occur more efficiently throughout the plurality of interactive environments 14.
- the user’s interactive object 20, upon entering the interactive environment 14, is triggered to transmit interactive object identification data to the environmental sensors 16 that then transmit the interactive object identification data to the central controller 18.
- the central controller 18 receives the object identification data and utilizes the data to link the specific interactive object 20 to a specific user 12 of the plurality of users in the interactive environment 14.
- the user identification occurs based on received sensor data.
- the sensor data is assessed to identify one or more users 12 in an area of an interactive environment.
- Characteristics of the users extracted from the sensor data are then used to identify individual users 12 from a set of identified users 12 known to be generally within the area.
- the users 12 generally within the interactive environment 14 may form a subset of a total set of all users within an attraction or a theme park based on the collected sensor data, and user identification may be expedited by dynamic updating of the relevant subset as a likely candidate pool.
- the identified user 12 is then linked to the interactive object 20 by the central controller 18 so that a user profile can be updated based on the user interactions via the linked interactive object within the interactive environment 14. Further, specific effects or actions can be targeted to the identified user 12 and/or linked interactive object 20.
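Identifying an individual from a dynamically updated candidate subset could be sketched as matching observed characteristics against the users known to be in the area. The feature-overlap scoring below is a hypothetical stand-in for the real recognition pipeline; the candidate-subset narrowing follows the text above.

```python
def identify_user(candidates, observed_features, threshold=2):
    """Match features extracted from environmental sensor data (e.g.,
    gait, apparel) against a candidate subset of users known to be in
    the area. Count-of-matching-features scoring and the threshold are
    illustrative assumptions."""
    best_user, best_score = None, 0
    for user_id, known_features in candidates.items():
        score = len(observed_features & known_features)
        if score > best_score:
            best_user, best_score = user_id, score
    # Require a minimum match quality before positively identifying.
    return best_user if best_score >= threshold else None
```

Restricting `candidates` to the users currently in the area (rather than all park guests) is what expedites identification, per the description.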
- the user 12 can perform a specific motion or gesture with the interactive object 20, which is tracked via environmental sensors 16 and communicated to the central controller 18.
- the central controller 18 processes the motion data, in some instances in combination with the user profile data or object profile data, to generate a special effect communication, e.g., special effect instruction, that is personalized based on one or more of the motion data, user profile data, and object profile data.
- the interactive object 20 then receives this communication and activates a personalized special effect based on the specific user 12, the specific object 20, and/or the tracked motion.
- the user 12 is then able to perceive distinct special effects (e.g., visual, auditory, haptic) throughout the interactive environment 14, including in some instances on or through the object 20.
- the user’s profile, and in some instances the object’s profile, stored in the central controller 18, is updated to store the user’s interaction data within the interactive environment 14.
- the special effect instructions may be personalized according to the user profile associated with the user’s interactive object 20.
- the user profile may comprise user skill level (e.g., length of time of use with the interactive object, accuracy of gestures with the interactive object over time) and user identity (e.g., pre-selected theme or other user preferences).
- the special effect command corresponding to a user’s gesture with the interactive object 20 may be personalized to the user 12 according to the user profile by varying audio, haptic, or visual aspects of the special effect command sent to the user’s interactive object 20.
- characteristics such as intensity (e.g., light intensity, audio intensity, haptic effect intensity) of the special effect can be adjusted in a rules-based manner.
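A rules-based intensity adjustment might look like the sketch below. The specific base intensities and the scale-by-quality rule are hypothetical; the text says only that intensity can be adjusted in a rules-based manner.

```python
def effect_intensity(skill_level, gesture_quality):
    """Rules-based special effect intensity (0.0-1.0). The base values
    per skill level and the linear scaling by gesture quality are
    illustrative assumptions, not values from the patent."""
    base = {"beginner": 0.4, "intermediate": 0.7, "advanced": 1.0}[skill_level]
    # Clamp gesture quality to [0, 1] and scale the base intensity by it.
    return round(base * max(0.0, min(1.0, gesture_quality)), 2)
```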
- the central controller 18 may identify the user 12 and receive the user profile comprising user skill level and user identity.
- the central controller 18 can set special effect commands based on the user skill level and user identity.
- the special effect command causes activation of a light on the interactive object 20 having a specific color and/or intensity corresponding to the user skill level and user identity, in combination with gesture performance data derived from the sensor data received by the central controller 18.
- the user 12 may possess an intermediate skill level and the central controller may determine, based on the sensor data, that the gesture was performed accurately. This information will be utilized by the central controller 18 to generate or adjust a specific special effect based on skill level and correct performance of the gesture.
- completing a figure eight gesture in combination with an intermediate skill level may be determined by the central controller 18 to correspond to a special effect command associated with illuminating the user’s interactive object 20 green.
- Real-time special effect adjustment may occur over the course of the special effect (e.g., with changes in color intensity corresponding to higher or lower quality changes in the user actions).
- Another user who possesses a beginner skill level but performed the same figure eight gesture may cause the central controller 18 to generate a special effect command associated with an audio clapping or a sparking effect.
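The two worked examples above — an accurate figure-eight at intermediate skill illuminating the object green, and the same gesture at beginner skill producing an audio clapping effect — suggest a rules table keyed on gesture, skill, and performance. The table format and command names are assumptions drawn directly from those examples.

```python
def select_effect(gesture, skill_level, performed_accurately):
    """Map (gesture, skill level, accuracy) to a special effect command.
    The two entries mirror the examples in the description; the fallback
    command name is an illustrative assumption."""
    rules = {
        ("figure_eight", "intermediate", True): "illuminate_green",
        ("figure_eight", "beginner", True): "audio_clapping",
    }
    return rules.get((gesture, skill_level, performed_accurately),
                     "default_feedback")
```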
- the central controller may detect that the individual is not linked to the interactive object 20 and/or that the individual is linked to a different interactive object 20.
- the central controller 18 may generate an effect to convey to the individual that they are the incorrect user of the interactive object 20.
- the central controller 18 may send an audio special effect command to the interactive object 20 to output an audio error message. This will indicate to the individual that they are the incorrect user of the interactive object 20.
- every good faith interaction with the interactive object 20, regardless of the user identity or the skill level can generate some sort of perceivable feedback at the interactive object 20.
- the user 12 may enter the interactive environment 14, and a special effect command may be transmitted before any motion is made via the interactive object 20 based on the user profile data or object profile data.
- as the user enters the interactive environment 14, the user’s interactive object 20 is triggered to transmit object identification data into the area.
- the interactive object data is then transmitted to the central controller 18.
- the user 12 may have a specified characteristic within their user profile that triggers a special effect based on the specific characteristic.
- the user could choose a specific identification color for their profile; the central controller 18 could communicate this specific color output to the interactive object 20, enabling the interactive object 20 to emit light from a specific color LED housed within the interactive object 20 and corresponding to the user profile characteristic.
- the central controller 18 may send a general special effect command to all user interactive objects 20 currently in a bounded area or within a specific interactive environment 14.
- the central controller 18 may send a general special effect command to illuminate every interactive object 20 in the bounded area or within a specific interactive environment 14 a specific color.
- the bounded area may be a specified distance or radius of the interactive environment 14, so that the interactive objects 20 in the bounded area receive the same special effect command.
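The bounded-area broadcast described above might be sketched as a distance filter: every interactive object whose tracked position falls within a radius of the interactive environment receives the same general special effect command. The coordinates, radius, and object IDs below are made up for illustration.

```python
import math

def objects_in_bounded_area(objects, center, radius):
    """Return IDs of objects whose tracked (x, y) position lies within `radius` of `center`."""
    cx, cy = center
    return [
        oid for oid, (x, y) in objects.items()
        if math.hypot(x - cx, y - cy) <= radius
    ]

# Hypothetical tracked positions, in meters, relative to the environment origin.
tracked = {"wand-1": (1.0, 1.0), "wand-2": (8.0, 0.0), "wand-3": (0.5, -0.5)}
recipients = objects_in_bounded_area(tracked, center=(0.0, 0.0), radius=2.0)
# "wand-1" and "wand-3" fall inside the 2 m radius; "wand-2" does not.
```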
- FIG. 2 illustrates a schematic diagram of interactions of the interactive object control system 10.
- the interactive object control system 10 receives or detects interactive object information (e.g., unique device identification number or code) from one or more interactive objects 20 in an area of an interactive environment 14 which includes an area in range of emitters 28 and sensors 16 of the interactive object control system 10.
- the information is based on a detectable marker 21 (e.g., a barcode, a quick response code, a patterned retroreflective marker, a glyph) on a housing 22 of the interactive object 20 and is detected by the interactive object control system 10 such that the interactive objects 20 provide the information passively.
- the interactive objects 20 may include a mix of passive and active elements that may permit different modes of communication and activation depending on the environment.
- the users 12 interact with the interactive object control system 10 that may include one or more emitters 28 (which may be all or a part of an emission subsystem having one or more emission devices and associated control circuitry) that emit one or more wavelengths of electromagnetic radiation (e.g., light such as infrared, ultraviolet, or visible light, or radio waves and so forth).
- the interactive object control system 10 may also include one or more environmental sensors 16 (which may be all or a part of a detection subsystem having one or more sensors, cameras, or the like, and associated control circuitry) that detect one or more of signals transmitted from the interactive object 20, a detectable marker 21 on the interactive object 20, and the users 12 in the interactive environment 14 as described above in FIG. 1.
- the interactive object control system 10 also includes the central controller 18 communicatively coupled to the emitters 28 and the environmental sensors 16.
- the interactive object control system 10 may include the interactive object 20 (illustrated as a handheld object) that includes a housing 22 having an exterior surface 24 that, in an embodiment, includes a grip sensor, and an interior that houses communication circuitry 26.
- the housing 22 may also include a detectable marker 21.
- the communication circuitry 26 may actively communicate a device identification of the interactive object 20 to the environmental sensors 16 in the interactive environment 14.
- the communication circuitry 26 may include a radio-frequency identification (RFID) tag.
- the communication circuitry 26 can communicate device identification of the interactive object to the environmental sensors 16 (implemented as receivers) of the interactive environment 14, which in turn pass the information to the central controller 18 of the interactive object control system 10.
- the communication circuitry 26 enables wireless communication of device identification information between the hardware of the interactive object 20 and the hardware of the interactive object control system 10 so that interactive object information that relates to one or both of a user profile or an object profile can be dynamically updated and used to generate personalized commands sent to the interactive object 20 and/or the interactive environment 14 from the central controller 18.
- the emitter 28 is external to (e.g., spaced apart from) the interactive object 20.
- the emitter 28 operates to emit electromagnetic radiation, which is represented by an expanding electromagnetic radiation beam for illustrative purposes, to selectively illuminate, bathe, or flood the interactive environment 14 in the electromagnetic radiation.
- the electromagnetic radiation beam in certain embodiments, may be representative of multiple light beams (beams of electromagnetic radiation) being emitted from different sources of the emitter or emitters 28 (all part of an emission subsystem that includes one or more emitters 28).
- the source may be a visible light source, an infrared light source, etc., to emit the desired wavelength of electromagnetic radiation.
- the emitter 28 may include one or more sources of different types, such as light emitting diodes, laser diodes, or other sources.
- the electromagnetic radiation beam is intended to generally represent any form of electromagnetic radiation that may be used in accordance with present embodiments, such as forms of light (e.g., infrared, visible, UV) and/or other bands of the electromagnetic spectrum (e.g., radio waves and so forth).
- it may be desirable to use certain bands of the electromagnetic spectrum depending on various factors. For example, in one embodiment, it may be desirable to use forms of electromagnetic radiation that are not visible to the human eye or within an audible range of human hearing, so that the electromagnetic radiation used does not distract guests from their experience.
- the detectable marker 21 may be a retroreflector, e.g., operating to reflect light in a particular range (an 800-1100 nm range in an embodiment), that reflects the emitted light from the emitter 28.
- the reflected light is detected at one or more environmental sensors 16 to generate sensor data indicative of a presence or motion of the interactive object 20.
- the interactive environment 14 may correspond to all or a part of an amusement park attraction area or interactive environment, including a stage show, a ride vehicle loading area, a waiting area outside of an entrance to a ride or show, interactive features dispersed within an amusement park, and so forth.
- the interactive environment 14 may also be movable or transitory, such as incorporated within a parade or a street performance.
- the interactive environment 14 may be interacted with by user 12 individually, such as part of a game, scavenger hunt, or nonlinear narrative experience.
- the emitter 28 is fixed in position within the environment while the interactive object 20 moves within the area of an interactive environment 14 and receives the electromagnetic radiation signal.
- the interactive object 20 may be detected (e.g., located within the interactive environment 14), tracked via the environmental sensors 16 in the area, and communicated with to activate one or more on board special effects of the interactive object 20 via emitted and detected electromagnetic radiation of the interactive object control system 10.
- the detection of the interactive object 20 is controlled by the central controller 18, which drives the emitter 28.
- the activation may be indiscriminate, such that the emitter 28 continuously emits electromagnetic radiation of the appropriate wavelength or frequency that corresponds to the communication circuitry 26 and the device information that is communicated, and any interactive object positioned within the interactive environment 14 and oriented towards the emitter 28 is activated to emit a signal of device identification to the environmental sensors 16, dispersed throughout the interactive environment 14.
- the sensors may include radio frequency sensors, optical sensors, 3D time of flight sensors, facial recognition sensors and other sensing systems to aid in the user 12 and the interactive object 20 identification.
- the activation may be selective, such that the central controller 18 operates to locate and process the transmitted object identification data via the communication circuitry 26 of the interactive object 20 and, upon locating and detecting, drive the emitter 28 to direct a signal given by the central controller 18 to the interactive object 20 such that the activation of the special effect of the interactive object 20 may be turned on or off depending on a desired narrative or user actions.
- the user 12 may enter the interactive environment 14 with their respective interactive object 20.
- the interactive object may wirelessly transmit interactive object information or may interact with (reflect) emitted light in the interactive environment 14 to provide interactive object data to the environmental sensors 16 in the interactive environment 14.
- the environmental sensors 16 may also obtain interactive object data from a detectable marker 21 on the interactive object 20.
- the object identification data is then transmitted to the central controller 18 for processing.
- the environmental sensors 16 (e.g., facial recognition sensors, 3D time of flight sensors, optical sensors) may collect user data in the interactive environment 14.
- the central controller 18 may identify users 12 via the sensor data to narrow down the user pool in the interactive environment 14 such that object identification data can be linked to the specific user 12 more efficiently.
- the user 12 may then perform a motion or gesture with their interactive object 20.
- the motion data of the interactive object 20 is collected by the environmental sensors 16 in the interactive environment 14 and transmitted to the central controller 18.
- the central controller 18 then utilizes the user data in combination with the motion data to send a personalized effect response to the communication circuitry 26 of the interactive object 20 that has been previously linked to the user 12 by the central controller 18. If the user 12 has previously visited the interactive environment 14, the personalized effect response can be differentiated by the central controller 18 to be different than previous effect commands sent to the user’s interactive object 20 in the interactive environment 14.
- the specific gesture or motion performed with the interactive object 20 can also cause effect differentiation.
- the specific motions performed with the interactive object 20 by the user 12 can trigger motion specific effects.
- the motion data may be compared to a stored set of motions and assessed for accuracy based on preset quality metrics.
- the effect based on accuracy and/or performed motion can be designated to correspond to a certain color emission of light from the interactive object 20, or other effect emitted from the special effects system of the interactive object 20.
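One way the stored-motion comparison above could be sketched: score a captured motion trace by its mean point-to-point distance from a stored template, then map the resulting quality metric to an effect color. The equal-length traces, threshold, and color mapping are assumptions for illustration; a real system might resample traces or use a more robust comparison.

```python
import math

def motion_quality(trace, template):
    """Return a 0..1 quality metric; 1.0 means a perfect match to the stored template.

    Assumes `trace` and `template` are equal-length sequences of (x, y) points.
    """
    err = sum(math.dist(p, q) for p, q in zip(trace, template)) / len(template)
    return max(0.0, 1.0 - err)

def effect_color(quality, threshold=0.8):
    """Designate an effect color based on whether the quality metric clears a threshold."""
    return "green" if quality >= threshold else "red"
```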
- the central controller 18, based on the user’s interactions with their interactive object 20 in the interactive environment 14, may trigger special effects that vary throughout the course of the effect.
- a power boosting device is controlled to increase the luminance of the special effect output via the interactive object 20.
- the signal boosting may be implemented by controllable radio frequency energy emitters, or through additional infrared emitting power sources in an optical power harvesting example.
- the radio frequency emitters may direct and/or focus the radio frequency energy transmission beam from a radio frequency emitter to a particular interactive object 20 based on the interactive object’s 20 detected location.
- the direction and focus of the beam to the location of the interactive object 20 facilitates an increase in the interactive object’s 20 available power and allows the device to output a special effect using the extra available power with a higher intensity and/or luminance relative to other interactive objects 20 in close proximity.
- a single interactive object 20 in a group can be singled out to, for example, form a high intensity beam of light.
- the radio frequency energy transmission may include ultra-high frequency (UHF) energy transmissions to power the interactive object’s 20 special effects.
- the change in luminance may be dynamic and tied to user actions with the interactive object 20, so that when the user is improving, getting closer to a goal (e.g., getting “warmer” to finding an object), or performing a movement pattern with a higher quality metric, the luminance increases, and when the user is doing relatively less well (e.g., getting “cooler”), performing a less accurate (lower quality metric) motion pattern, the luminance decreases.
- a nature of the special effect may be based on a quality metric being above or below a threshold.
- the quality metric may be based on accuracy of motion patterns of the interactive object 20, distance of the interactive object 20 from a goal (within a certain distance being above a quality threshold), or interaction of the interactive object 20 within the interactive environment 14.
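The "warmer/cooler" behavior described above might be sketched as a luminance that scales with closeness to a goal: full brightness at the goal, falling toward a floor as the interactive object moves away. The maximum distance, floor value, and linear scaling are illustrative assumptions.

```python
def luminance_for_distance(distance, max_distance=10.0, floor=0.1):
    """Scale luminance from `floor` (far from the goal) up to 1.0 (at the goal)."""
    closeness = max(0.0, 1.0 - min(distance, max_distance) / max_distance)
    return floor + (1.0 - floor) * closeness
```

A threshold check on the same metric (e.g., `luminance_for_distance(d) >= 0.5`) could then select the nature of the special effect, per the quality-threshold behavior described above.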
- the intensity of the luminance could vary depending on the phase of the activation.
- discrete color illuminations could be tied to particular interactives, the completion of a specific gesture using the interactive object 20, or a series of gestures.
- the special effect variance discussed above, and the use of illumination and color discussed in connection with various embodiments herein may be expressed by other sensory effects including haptic, such as a vibration of the interactive object 20, or sound, such as a tone emitted by the interactive object 20.
- the power boosting effect may be implemented by utilizing an external device, e.g., a mobile device, associated with the user 12.
- the interactive object 20 may include a Near-Field Communication (NFC) coil located in the interior of the interactive object 20.
- the NFC coil may facilitate charging and/or power boosting for the interactive object 20 by gaining charge via transmission of energy from an external device associated with the user (e.g., a mobile phone or NFC charger, which may be implemented as a toy or wearable device).
- the external device may include a holster and/or holder for the interactive object 20 so that the interactive object 20 may be continuously charged as the user 12 moves about the interactive environments 14.
- the interactive object 20 may also include a rechargeable energy source (e.g., a battery).
- the NFC coil may enable pairing of the interactive object 20 to the user’s mobile device to allow for interactivity between the user’s mobile device and the user’s interactive object 20.
- the mobile phone may pair with the user’s interactive object 20 and allow transmission of interactive object performance data to the mobile device.
- the interactive object 20 performance data may be processed via an application of the mobile device and displayed to the user 12 so that the user can view their performance statistics in real time.
- the interactive objects 20 may be recharged throughout the day if on display and/or not in use by the user 12.
- the interactive object 20 may be recharged by utilizing the optical power harvesting method described above.
- the interactive object 20 storage area may include a radio frequency emitter that may continuously emit energy towards the interactive object storage area to recharge the interactive objects 20 when not in use, so that they are fully charged when the user 12 obtains the interactive object 20.
- the interactive objects 20 in the storage area may also be charged via a near field device that may be incorporated into a shelving unit or other storage space. This near field charging method may serve as a 1:1 top-off (i.e., charging) method.
- the optical power harvesting method may be used in combination with other charging methods for the interactive object 20, such as mid-range to long-range charging methods via charging over Ultra high frequency (UHF) radio frequencies and charging using near-field communication (NFC) methods (e.g., NFC coil located within the interactive object 20, near field device).
- any of the above charging methods may be implemented individually or in combination throughout user 12 interactions to power or charge the interactive object 20.
- the discussed power harvesting techniques may be used to directly power on-board special effects of the interactive object 20 and/or may be used to charge a battery or power storage of the interactive object 20 that is in turn used to power the special effects.
- the central controller 18 may detect and store historical data associated with past interactions between the user’s interactive object 20 and other interactive objects. For example, the user’s interactive object 20 may have interacted with an opponent’s interactive object 20 during a battle scenario. The central controller 18 may update the user’s profile to include historical information pertaining to the user’s 12 interaction with the opponent’s interactive object 20 during the battle scenario. The central controller 18 may then detect at a later time that the user’s interactive object 20 is attempting to battle with the same opponent’s interactive object 20. The central controller 18 may then receive the historical data comprising the past battle scenario data and differentiate special effect commands sent to the user’s interactive object 20 to activate new special effects based upon previous battle interactions.
- the user may enter the interactive environment 14, and an initial effect command may be sent to the interactive object 20 based on the interactive object identification (via wireless transmission from the onboard communication circuitry 26 of the interactive object 20 to the central controller 18) and user identification via the environmental sensors 16.
- This initial identification may enable the central controller 18 to send an initial command based on the object identification and the user 12 identification.
- the interactive object 20 could receive an initial command to project a certain color light from an LED housed within the special effects system of the interactive object 20. This projection of LED light color could be based on user’s preference or user’s level of experience with the corresponding interactive object 20. The user could then perform a motion or gesture with the interactive object 20.
- the environmental sensors 16 disposed throughout the environment collect the motion data of the interactive object, and the motion data is then transmitted to the central controller 18.
- the central controller 18 based on the motion data can then send another special effect command to the interactive object 20.
- the communication circuitry 26 receives the command sent from the central controller 18, and outputs a different color LED based on the motion or gesture performed. This enables the user to observe a constant output of effects from the interactive object 20 during the user’s entire experience in the interactive environment 14.
- the interactive object 20 may be sent commands to perform discrete illumination sequences throughout the user’s experience in a specific interactive environment 14. For instance, the interactive object 20 may illuminate a certain color LED based on initial identification and linkage to the user profile by the central controller 18. The user 12 may then perform a gesture or series of gestures, and based on the accuracy of these gestures the interactive object 20 may be sent commands to illuminate a certain color LED, or one or more LEDs of different colors in a specified sequence or in conjunction, depending on the accuracy of the performed gesture.
- the accurate performance of a gesture determined via the central controller 18 triggers the central controller 18 to send a second command to the interactive object 20 to illuminate an alternate color LED from the initial identification or a sequence of alternate colors of LEDs, which may be based on whether the gesture was performed accurately.
- the illumination of one or more specific color LEDs may correspond to a themed aspect of the interactive experience.
- the color may correspond to a group or house affiliation stored in the user profile that corresponds to a pre-selected color option, to connect the user 12 to their user profile throughout the user experience.
- a mobile device of the user 12 may be used to identify the interactive object 20 and link the user 12 to the interactive object 20.
- the interactive object 20 may have high level symbolic representations (e.g. runes and/or a sequence of runes) etched on the exterior of the interactive object 20.
- the runes may also be any other symbol or etching system used to represent a unique pattern on the exterior casing of the interactive object 20.
- the order of the runes may correspond to a unique identifier for the interactive object 20. For example, rune A and rune B may appear in order AB on a first interactive object and order BA on a second interactive object.
- the runes may be order specific so that order AB corresponds to a different unique identifier than order BA. This order specific identification of the runes enables a greater number of unique identifiers to be available while using a smaller number of runes.
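The combinatorial advantage of order-specific rune readings can be made concrete: with n distinct runes and identifier sequences of length k, ordered readings yield P(n, k) = n!/(n-k)! unique identifiers, versus only C(n, k) if order were ignored. The rune count and sequence length below are illustrative, not from the application.

```python
from math import perm, comb

n_runes, seq_len = 10, 2        # hypothetical: 10 distinct runes, 2 per object
ordered = perm(n_runes, seq_len)     # AB and BA count separately
unordered = comb(n_runes, seq_len)   # identifiers available if order were ignored
# With these values, ordered readings double the identifier space (90 vs. 45).
```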
- the user 12 may utilize their mobile device to scan or take a picture of the interactive object 20 utilizing the camera of the mobile device.
- the user 12 may also download an application on their mobile device that is able to detect the runes from the picture obtained via the camera of the mobile device.
- the application may have access to a database that contains all the unique identifiers for each rune combination.
- the mobile device may obtain the user 12 information via the application and link the user’s interactive object 20 via the unique identifier obtained from the runes with the user information.
- the mobile device may be configured via the application to transmit the user information and associated interactive object information to the central controller 18.
- the central controller may utilize the user 12 and interactive object information to transmit special effect commands based on the user 12 being associated with the interactive object 20 via the mobile device.
- This method can be implemented in combination with the environmental sensor 16 method of identifying users 12 and linking each user 12 to their respective interactive object 20.
- the identification of the interactive object 20 via the user mobile device can aid identification in a crowded environment in combination with the environmental sensors 16, or supplement the environmental sensors 16 for user identification.
- the special effects of the interactive object 20 can be varied or selected based on user action.
- the special effect command may be determined based on a gesture of the interactive object 20, a verbal command by the user 12 of the interactive object 20, the user profile comprising a level of the user 12, or any combination thereof.
- Certain gesture and verbal action combinations may be associated with higher intensity or rarer generated special effects relative to a gesture or verbal command alone.
- the user 12 may perform a first gesture with the interactive object 20 without the user 12 reciting a verbal command.
- the central controller 18 may receive sensor data related to the gesture performed with the interactive object 20 and link the interactive object 20 to the user 12 and the user profile corresponding to the user 12.
- the central controller 18 may then transmit a special effect command to the interactive object 20 based on the gesture performed by the user with the interactive object 20 and the user profile.
- the user 12 may alternatively perform the first gesture in combination with a verbal command.
- the central controller 18 may be sent sensor data comprising data related to the gesture performed and the verbal command.
- the central controller 18 may generate a special effect command based on the gesture and the verbal command different than the command for the gesture-only case. This enables special effect command generation to be differentiated based on multiple combinations of the gesture and verbal commands.
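The gesture-plus-verbal differentiation above could be sketched as a lookup keyed on the (gesture, verbal command) pair, with the combination mapping to a rarer or higher-intensity effect than the gesture alone. The command names and table entries are hypothetical.

```python
# Hypothetical command table: (gesture, verbal_command) -> effect name.
# `None` represents a gesture performed without any verbal command.
COMBO_EFFECTS = {
    ("first_gesture", None): "standard_glow",
    ("first_gesture", "incantation"): "rare_burst",  # combination yields a rarer effect
}

def resolve_effect(gesture, verbal=None):
    """Select a special effect command based on the gesture/verbal combination."""
    return COMBO_EFFECTS.get((gesture, verbal), "default_feedback")
```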
- the special effect may also be differentiated depending on the skill level of the user associated with the user profile as discussed previously.
- the user 12 may then be able to receive more personalized feedback and attempt more combinations of gestures and verbal commands.
- the special effect command may specify the intensity level of illumination to be emitted from the interactive object 20.
- the intensity level of the illumination may be tied to a performance of a motion or gesture, an experience level of the user 12, or the user’s previous experiences in the interactive environment 14, or any combination thereof.
- the color of the illumination may also be specified via the special effect command, and may be associated with a particular interactive object 20, or be dependent upon the correct completion of a specific gesture or series of gestures with the interactive object 20.
- a user may initially enter an area of an interactive environment 14 of the plurality of interactive environments 14.
- the user’s interactive object 20, via wireless transmission, sends object identification information to the central controller 18.
- the central controller 18 may then link the interactive object information to a respective user 12 in the interactive environment 14.
- the central controller may then send an initial special effect command specifying a specific intensity level and color of illumination based on the interactive object information.
- the user 12 may then complete a series of gestures with their interactive object 20.
- the environmental sensors 16 transmit the motion data of the interactive object 20 to the central controller 18, which assesses the data for accuracy and sends a special effect command specifying a color and/or intensity level for the illumination effect that may be different from the initial command, based on the accuracy of gestures performed with the interactive object 20.
- the interactive object 20 may illuminate a green color LED at a high intensity for the correct performance of a gesture, and for the incorrect performance of a gesture illuminate a red color LED at a low intensity.
- the interval of the illumination may also be specified via the special effect command received by the interactive object 20, and may specify a longer interval (time period of illumination) of illumination or a different intensity level based on the performed action or object identification information.
- the environmental sensors 16 may be unable to identify a user to link to a specific interactive object of the plurality of interactive objects 20.
- the central controller 18 utilizes the interactive object data received, and recognizes that no best match of the interactive object 20 to the user 12 can be made.
- the central controller sends a default effect command, retrieved from a plurality of default effects stored in the central controller 18, to the interactive object 20 that was not matched. This enables the user 12 of the unmatched interactive object 20 to observe a special effect.
- FIG. 3 illustrates a process flow diagram for a method 29 that permits association of the user 12 to their respective interactive object 20 in the interactive environment 14 and enables updating the respective user’s profile based on the user’s 12 interaction within the interactive environment 14.
- the method 29 may efficiently select a user 12 of a pool of pre-identified users 12 without conducting de novo user recognition using more computationally intensive techniques.
- a plurality of the users 12 move about freely in the area of an interactive environment of a plurality of interactive environments 14.
- the interactive object control system 10 acquires user data and interactive object data collected via the environmental sensors 16 dispersed throughout the interactive environment 14. The data is received by the interactive object control system 10, such as at the central controller 18 of the interactive object control system 10 (block 30).
- the system 10 receives unique identification information from a tag on the interactive object 20 or any interactive objects in range of the environmental sensors 16 of the interactive environment 14.
- the system 10 also receives location information associated with the interactive objects 20.
- the location information may be based on radio frequency triangulation from the tag, such that the interactive object 20 is linked/identified to particular identification information based on an estimation of location via the sensor signals of multiple sensors 16.
- the system 10 can identify a particular interactive object 20 via wireless communication and link the interactive object 20 to a unique identification number.
- the location information is additionally or alternatively determined via sensing of detectable markers on the interactive object 20. The detectable markers are located in space or tracked via the environmental sensors 16.
- the location information of a sensed detectable marker can be associated with a particular identification tag by determining if an identified interactive object 20 is co-located with a sensed detectable marker or may be based on an estimated closest distance/likely match between a detected retroreflective marker and a triangulated RFID signal origin location associated with (e.g., that transmitted) the identification information of a particular tag.
- the detectable marker may also encode identification information and/or the interactive object 20 may include a light emitter that emits the identification information and that is tracked in space to provide location/motion information.
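The closest-distance association described above, between sensed marker locations and triangulated RFID signal origins, might be sketched as a nearest-neighbor pairing. The positions and tag IDs below are made up for illustration, and a real system would likely add a maximum-distance cutoff and handle contention between markers.

```python
import math

def match_markers_to_tags(markers, tag_origins):
    """Pair each sensed marker position with the tag whose triangulated origin is nearest."""
    matches = {}
    for marker_id, mpos in markers.items():
        tag_id = min(tag_origins, key=lambda t: math.dist(mpos, tag_origins[t]))
        matches[marker_id] = tag_id
    return matches

# Hypothetical sensed marker positions and triangulated tag origins (meters).
markers = {"m1": (0.0, 0.0), "m2": (5.0, 5.0)}
tags = {"tagA": (0.2, 0.1), "tagB": (4.8, 5.1)}
pairs = match_markers_to_tags(markers, tags)
# "m1" is co-located with tagA's estimated origin; "m2" with tagB's.
```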
- User identification may also contribute to interactive object identification.
- Certain interactive objects 20 may be calibrated to or linked to a particular user profile. Thus, the user associated with the user profile is a most-likely candidate to be holding the interactive object 20. Identification of the user within the area of an interactive environment (e.g., via camera data of the sensors 16) can be used to identify the associated interactive object 20. Further, assessment may be based on historical data. Interactive objects 20 may be assumed to be linked to the most-recent user from an adjacent interactive environment 14 until new data is received.
- the system 10 analyzes the plurality of user and interactive object identification data to select a best match of a user of the respective users 12 present in the interactive environment 14 to a respective interactive object 20 in the interactive environment 14 (block 32).
- the matching may be rules-based as provided herein.
- the system matches or associates an interactive object 20 to a single user 12 of the plurality of users 12 for each interaction.
- the interactive environment 14 may include multiple users 12, some of whom do not carry interactive objects 20.
- the rules may permit some users 12 to be unassociated with any interactive object 20.
- each interactive object 20 may be required to be associated with at least one user 12.
- the rules-based matching may use proximity as a factor in matching, with detected interactive objects 20 being likely to be associated with a closest user 12.
- an elongated interactive object 20 held at arm's length may be potentially closer to a head/face of a different user 12. Accordingly, additional factors, such as identification of the object 20 as being hand-held, where appropriate, or being worn in an appropriate manner, may also be considered.
- the acquired data from the environmental sensors 16 may include camera data that is processed and provided to the analysis to assess these factors.
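For illustration only (structure and names are hypothetical, not taken from the disclosure), a rules-based matching in which every object is assigned to exactly one user, with camera-derived "held by" hints overriding raw proximity, could be sketched as:

```python
import math

def match_objects_to_users(object_positions, user_positions, held_hints=None):
    """Rules-based matching sketch: every interactive object is associated
    with exactly one user, the nearest one, unless camera data (held_hints:
    object id -> user id) has already identified the holder.  Users with no
    nearby object remain unassociated, as the rules permit."""
    held_hints = held_hints or {}
    assignments = {}
    for obj_id, obj_pos in object_positions.items():
        if obj_id in held_hints:
            # camera evidence of a hand-held object outweighs proximity
            assignments[obj_id] = held_hints[obj_id]
            continue
        assignments[obj_id] = min(
            user_positions,
            key=lambda uid: math.dist(obj_pos, user_positions[uid]))
    return assignments
```

The hint mechanism captures the arm's-length case above, where the closest head/face may belong to a different user than the one actually holding the object.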
- the rules may identify a set of potential users within a larger area, such as a theme park, as candidates for user identification.
- high quality image recognition and user profile linking to the user image and/or other user characteristics is performed using more computationally intensive sensors and processing at a designated area, such as a main entrance of a theme park.
- the interactive object control system 10 may identify best matches within the set using less computationally intensive user recognition analyses to permit more efficient operation of the interactive object control system 10.
- the data collected by the environmental sensors 16 is processed to narrow down the prospective user pool of the interactive environment 14.
- the system can utilize the user pool to more efficiently match users 12 to their respective interactive objects 20.
- the ability to narrow down the possible user pool for a specific interactive environment 14 facilitates identification of users 12 within the crowded interactive environment 14. Through utilization of multiple forms of sensing to identify both users and interactive objects, users 12 can be matched to their respective interactive objects more efficiently. Further, the interactive object control system 10 may be able to identify cases in which the interactive object 20 is shared between different users.
- the interactive object control system 10 may generate different special effect instructions (e.g., on-board special effects activated on the interactive object 20 and/or of the interactive environment 14) for a first user 12 relative to those generated for a second user 12 using the same interactive object 20.
- the interactive object 20 is perceived to respond differently to different users 12.
- the user profile associated with the selected best match user 12 and the identified interactive object 20 is then updated by the system to include the association.
- the user profile is also updated to include user location information corresponding to the specific interactive environment 14 and interactive object data relating to the interactive object 20 interactions within the interactive environment 14 (block 34).
- the interactive object 20 is sent personalized special effect commands based on the user's previous experiences in the interactive environments 14 (block 36). The corresponding user profile is updated when the user 12 enters a new interactive environment 14, such that the special effect command sent to the user's interactive object 20 can be differentiated or varied on repeat visits based on previous user information in the profile, such as the locations of the user 12 and the experiences of the user 12 and the user's interactive object 20 in previously visited interactive environments 14.
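As a purely illustrative sketch (the effect names and visit thresholds are invented here, not drawn from the disclosure), differentiating the special effect command by visit history stored in the profile might look like:

```python
def select_effect(profile):
    """Choose a special effect command from the stored user profile.
    First-time visitors get a baseline effect; repeat visitors get
    escalating effects, so the object appears to respond to history."""
    visits = profile.get("visits", 0)
    if visits == 0:
        return {"light": "white", "pattern": "steady"}
    if visits < 3:
        return {"light": "blue", "pattern": "pulse"}
    return {"light": "gold", "pattern": "sparkle"}
```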
- the method 29 may be implemented to build a system-generated user profile that may be coordinated with a user-generated user profile stored on the interactive object control system 10 or that, for users who do not register a profile, may be used independently.
- the user profile information provided by the user may include user age, preferences, attraction visit history, park visit history, family group information, payment information, etc.
- the interactive object control system 10, as provided herein, may also add interactive object data to the user profile. This may be added in a manner that is invisible to the user, but that is accessed by the interactive object control system 10 to guide interactive experiences within interactive environments 14.
- FIG. 4 is a schematic diagram of the interactive object control system 10 demonstrating the communication between the interactive object 20 and various components of the interactive object control system 10 external to the interactive object 20. Additionally or alternatively, the disclosed detection or locating of the interactive object 20 as provided herein may involve environmental sensors 16 (e.g., proximity sensors, optical sensors, image sensors) of the system that provide location or movement data of the interactive object 20.
- the environmental sensors 16 sense the interactive object 20 and/or the user 12 through image recognition (e.g., interactive object recognition, facial recognition), detection of a retroreflective marker on the interactive object 20, 3D time-of-flight systems, radio frequency sensing, and optical sensing, in addition to other sensing methods that detect that the user 12 and/or the user's interactive object 20 is present in the interactive environment 14.
- the interactive object 20 can also include communication circuitry 26 that may include a radio-frequency identification (RFID) tag that can be activated through transmission of electromagnetic radiation to output object identification data to the environmental sensors 16 in the interactive environment 14. This data can then be utilized by the processor 40 disposed in the central controller 18 to link the interactive object 20 to a specific user 12 in the interactive environment 14.
- the linkage of the user 12 to the user’s interactive object 20 enables a personalized special effect signal to be sent to the communication circuitry 26 of the interactive object 20, and enables the user profile to be updated based on the interactions of the user 12 within the interactive environment 14 via the central controller 18.
- This special effect signal sent by the central controller 18 is then processed by an object controller 39 housed in the interactive object 20, and activates the special effect system 52, which is powered either passively, e.g., via power harvesting (such as optical power harvesting), or actively by a power source, to emit a special effect that is personalized to the user’s profile.
- the interactive object 20 may include an active or passive RFID tag that communicates device identification information.
- the RFID tag may be a controllable backscatter RFID tag.
- the communication circuitry transmits interactive object device information to the central controller 18.
- one or more sensors 46 of the interactive object 20 detect electromagnetic radiation that is projected into the interactive environment 14.
- the communication circuitry 26 emits a wireless signal with interactive object device data via a radio frequency identification (RFID) tag or an infrared light signal.
- the environmental sensor 16 receives the interactive object device data and transmits this data to the central controller 18.
- the interactive object data is utilized by the processor 40 in combination with the user identification data from the environmental sensors 16 and/or memory 42.
- a personalized special effect signal based on the device and/or user identification, is then transmitted back to communication circuitry 26.
- the communication circuitry 26 passes the command to the object controller 39 of the interactive object 20.
- the object controller 39 is able to send the command to the special effects system 52 of the interactive object 20.
- a processor 48 and memory 50 enable special effect instructions to be stored and enable special effect activation and control corresponding to the command sent.
- the environmental sensors 16 detect the user’s presence in the interactive environment 14 and collect user data in addition to tracking of the interactive object 20 based on a performed gesture.
- the environmental sensors 16 may include camera facial recognition sensors, 3D time of flight sensors, optical sensors, and radio frequency sensors. These environmental sensors 16 are dispersed throughout the interactive environment 14 so that the users 12 can be tracked and located efficiently, and a personalized effect command can be sent to the communication circuitry 26 of the user’s associated interactive object 20.
- the environmental sensors 16 can be used to identify the user 12 so that the user information and device information provided by the central controller 18 enables a dynamic user profile to be created and updated as the user 12 moves about the plurality of interactive environments 14.
- the identification of the user 12 corresponding to the interactive object 20 may be accomplished using grip recognition and/or vision recognition via facial recognition cameras dispersed throughout the interactive environments 14.
- the memory 42 of the central controller 18 may store user profiles of the plurality of users 12 who have previously been matched to the plurality of interactive objects 20 within the interactive environment.
- the user profiles can then be updated as user experiences with the user’s interactive object 20 take place throughout the plurality of interactive environments 14.
- the central controller 18 is able to update the user profile based on the user’s experiences with the interactive object 20 within the area of an interactive environment of the plurality of interactive environments 14. This enables special effects to be differentiated based on the user profile throughout the interactive environments 14, and within multiple visits to the same interactive environment 14.
- the user profile can also include information that is associated with the user, which may comprise user-specific characteristics that are predetermined before first use of the object or determined after first use of the object. These characteristics can enable further differentiation of special effect commands based on the specific user 12.
- the user profile can be updated to display this information.
- the central controller 18 may then send a special effect signal based in part on the user profile. This may comprise the output of a specific color LED, a sound effect, a haptic effect, a visual projection, or any combination thereof.
- the central controller 18 may be able to link only a threshold or preset number of the users 12 to the interactive object 20.
- the number of the users 12 that can be linked to the interactive object 20 may be limited to a specific threshold to maintain device security of the interactive objects 20. For example, if a specific interactive object 20 has been linked to two users, the central controller 18 may recognize that the threshold number of users for the specific interactive object 20 is two and may not identify a third user that is trying to utilize the interactive object 20.
- the central controller 18 may send a signal (e.g., an effect) to the third user’s interactive object 20 to communicate to the third user that the interactive object 20 is not able to be linked to the third user and that the third user may need to obtain another interactive object 20. This may be accomplished through a visual effect command that directs the interactive object 20 to illuminate a specific color, a special effect command that directs the interactive object 20 to output a sound effect that communicates that the interactive object 20 is not able to be linked, or any other effect method.
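The threshold-linking behavior could be sketched as follows (an illustrative sketch only; class and effect names are assumptions, and a threshold of two matches the example above):

```python
class LinkRegistry:
    """Sketch of per-object user linking with a security threshold: each
    interactive object accepts links up to its threshold, after which
    further link attempts return a 'rejected' result carrying a feedback
    effect for the extra user."""

    def __init__(self, threshold=2):
        self.threshold = threshold
        self.links = {}  # object id -> list of linked user ids

    def link(self, object_id, user_id):
        users = self.links.setdefault(object_id, [])
        if user_id in users:
            return {"status": "linked"}  # already-linked users stay linked
        if len(users) >= self.threshold:
            # object is full: signal the user to obtain another object
            return {"status": "rejected", "effect": "red_flash"}
        users.append(user_id)
        return {"status": "linked"}
```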
- a particular detected motion pattern of the interactive object 20 may be assessed by the central controller 18. Certain types of motion patterns may be associated with activating a red light on the interactive object 20 while other types of motion patterns may be associated with activating a blue light. Based on the detected pattern, the instructions for activation of the light color are transmitted to the interactive object 20.
- the special effect instructions may include instructions to set an intensity, hue, or interval pattern of light activation. One or more of these may be varied based on characteristics of the sensed motion pattern and/or user profile characteristics.
- the activation of the on-board special effect provides feedback to the user that a successful interactive experience has occurred, and lack of the special effect or a muted special effect (e.g., dim light activation) is indicative that the interaction should be improved or altered.
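A sketch of the pattern-to-effect mapping described above (pattern names, the quality threshold, and the dim-feedback rule are illustrative assumptions, not details from the disclosure):

```python
def effect_for_motion(pattern, quality):
    """Map a detected motion pattern to an on-board light command.
    A recognized pattern selects the color; a low quality score yields a
    dim activation as 'try again' feedback, and an unrecognized pattern
    yields no effect at all."""
    colors = {"circle": "red", "zigzag": "blue"}
    color = colors.get(pattern)
    if color is None:
        return None  # unrecognized gesture: lack of effect is the feedback
    intensity = "full" if quality >= 0.7 else "dim"
    return {"color": color, "intensity": intensity}
```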
- the central controller 18 that drives the emitter 28 and that receives and processes data from the environmental sensors 16 may include the one or more processors 40 and the memory 42.
- the processors 40, 48 and the memory 42, 50 may be generally referred to herein as “processing circuitry.”
- the one or more processors 40, 48 may include one or more application specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more general purpose processors, or any combination thereof.
- the memory 42, 50 may include volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM), optical drives, hard disc drives, or solid-state drives.
- the central controller 18 may form at least a portion of a control system configured to coordinate operations of various amusement park features, such as an amusement park attraction and control system. It should be understood that the subsystems of the interactive object control system 10 may also include similar features.
- the special effect system 52 may include processing capability via the processor 48 and the memory 50. Further, the object controller 39, when present, may also include integral processing and memory components.
- the central controller 18 may be part of a distributed decentralized network of one or more central controllers 18.
- the decentralized network of the one or more central controllers 18 may communicate with a park central controller and park central server.
- the decentralized network of the one or more central controllers 18 facilitates reduction in processing time and processing power required for the one or more central controllers 18 dispersed throughout the one or more interactive environments 14.
- the decentralized network of the one or more central controllers 18 may be configured to obtain user profiles by requesting the profile from a profile feed stored in the park central server.
- the user profile feed may comprise user accomplishments associated with the interactive object, user experience level, past user locations, and other user information.
- the one or more central controllers 18 may act as edge controllers that subscribe to a profile feed comprising a plurality of user profiles stored in a park central server and cache the feed to receive one or more user profiles contained in the feed.
- the interactive environment 14 may include one or more central controllers 18.
- the one or more central controllers 18 within the interactive environment 14 may communicate with each other through the use of a wireless mesh network (WMN) or other wireless and/or wired communication methods.
- the special effect commands may be generated by the central controller 18, a distributed node of the central controller 18, or by a dedicated local controller associated with the interactive environment 14, and communicated to the interactive object 20.
- the sensor 46 of the interactive object 20 may include an array of individual pressure or grip sensors that provide pressure information to the object controller 39.
- the array may be a capacitive or force sensitive resistor array of at least 16 or at least 256 individual sensors.
- the object controller 39, under passive power, can use the signals from the array to calibrate based on sensor data indicative of a characteristic grip biometric for a particular user.
- the calibration process may activate a feedback via the special effect system 52 (e.g., activation of one or more light sources 53 in a pattern associated with matching the interactive object to a particular user, activating a speaker, or another special effect).
- the calibration process may be limited to one user or a threshold number of users, so that only a preset number of users may be linked to the interactive object 20 to maintain device security of the interactive object 20.
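One way the grip-biometric calibration could be sketched (an illustrative sketch only; the per-cell averaging and tolerance check are assumptions about how a grip signature might be formed from the pressure array):

```python
def grip_signature(samples):
    """Derive a grip biometric from repeated pressure-array readings taken
    during calibration: average each cell of the array across samples to
    form a per-user signature.  samples is a list of equal-length lists of
    per-cell pressure values."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def matches_signature(signature, reading, tolerance=0.15):
    """Accept a live reading as the calibrated user's grip if every cell
    is within the (assumed) tolerance of the stored signature."""
    return all(abs(a - b) <= tolerance for a, b in zip(signature, reading))
```

In practice the array would have 16 or 256 cells as noted above; two cells are used here only to keep the example short.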
- the interactive object 20 may include a power source 56, which may be a battery or a power-harvester, such as a radio frequency based power-harvesting antenna or an optical harvester.
- the power source 56, such as the harvested power, is used to power one or more functions of the interactive object 20, such as the special effect system 52.
- the power source 56 may power multiple light emitting diodes with red, green, blue and white (RGBW) emitters.
- the interactive object 20 may provide object identification information via optical emissions that are detected by the environmental sensor 16.
- the light source 53 of the special effect system 52 may be used to transmit the optical information, or another light source may be used. Identification may be achieved through the use of radio frequency, infrared, and/or an RGBW-based, visible light method of identity transmission. In the case of an infrared or visible method of identity transmission, the illuminated output of the light source 53 can be modulated to encode an identity signal while being indiscernible to the eye. When using an RGBW light emitter as a method of output and identification, a second emitter in the infrared range can be utilized to transmit supplemental identifier information.
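As an illustrative sketch of such modulation (Manchester coding is chosen here as an assumption; the disclosure does not specify a coding scheme), the identity bits can be emitted so the average brightness stays constant and the flicker remains indiscernible at a high chip rate:

```python
def encode_identity(device_id, bits=16):
    """Encode a device identity into an on/off modulation sequence using
    Manchester coding: each bit becomes an on-off (1) or off-on (0) pair,
    so exactly half the chips are 'on' regardless of the ID and the light
    looks steadily lit to the eye."""
    chips = []
    for i in range(bits - 1, -1, -1):  # most significant bit first
        bit = (device_id >> i) & 1
        chips.extend([1, 0] if bit else [0, 1])
    return chips
```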
- Interactive object identification via a first technique may be combined with interactive object sensing or tracking via a second technique (e.g., detecting a retroreflective marker).
- the identification information may be linked to the tracking information as provided herein (e.g., proximity assessment, matching to a common user).
- the central controller 18 may send a special effect command to an external prop or display that includes an external special effect system 59 in addition to sending a special effect command to the interactive object 20.
- the user 12 could make a gesture or motion with their interactive object 20.
- the environmental sensors 16 collect motion data, and transmit the motion data to the central controller 18.
- the central controller 18 utilizes user profile data and motion data to send a personalized special effect command to the interactive object 20.
- the central controller 18 may send an additional special effect command to an external prop or display in the area of an interactive environment.
- the special effect command sent to the external prop or display can comprise a visual effect, sound effect, or another type of effect command.
- the method includes a process of detecting a first or initial use of an interactive object 20 (block 62) and acquiring user recognition data during the first use (block 64).
- This identification of the user is facilitated by acquired data gathered from multiple environmental sensors 16 including facial recognition sensors, 3D time of flight systems, and other modes of user recognition.
- the sensor data collected from the environmental sensors 16 is pooled together so that the set of users possibly present in the interactive environment 14 can be determined, and the user possibilities are narrowed down.
- a single user 12 is selected from a set of users 12 based on the user recognition data.
- the data pooled from the multiple environmental sensors 16 reduces the processing power needed for identification of users in an area of an interactive environment, expedites user identification, and increases the accuracy of user identification. If a user cannot be identified via facial recognition, another sensing method is able to identify the user, thus increasing the accuracy of user identification.
- a default user profile may be associated with the identified interactive object and used until a user identification is made.
- the pool of possible users 12 for the interactive object 20 can then be narrowed down to a smaller pool of users (block 66) until a single user 12 is selected.
- the selected user is linked to the interactive object 20.
- a first use special effect command is transmitted to the interactive object 20 to activate an on-board special effect that may be specific to characteristics of the initial use (block 68).
- the central controller 18 detects that no profile has been created for the user 12 and the user’s interactive object 20, thus triggering the creation of a new profile to store the user information.
- a user profile is assigned to the user 12 based on identification of the user 12 via environmental sensors 16 and detection of the user’s interactive object 20.
- the profile is stored in the central controller 18, so that it can be updated and utilized to communicate personalized special effect commands.
- FIG. 6 illustrates a process flow diagram for a method 72 for detection of the interactive object 20 in an interactive environment.
- the method 72 may include steps that are stored as instructions in the memory 42 and that are executable by the one or more processors 40 of the central controller 18. It should be noted that in some embodiments, steps of the method 72 may be performed in different orders than those shown, or omitted altogether. In addition, some of the blocks illustrated may be performed in combination with each other.
- the method 72 includes emitting electromagnetic radiation into the area of an interactive environment of the plurality of interactive environments 14 (block 74).
- the communication circuitry 26 of the interactive object 20 is then triggered by the electromagnetic radiation to emit a wireless signal to transmit the interactive object 20 data to the central controller 18 in the interactive environment 14 (block 76).
- This signal emitted by the communication circuitry 26 of the interactive object 20 may be facilitated by use of a radio frequency identification (RFID) tag or optical transmitter.
- RFID radio frequency identification
- the communication circuitry 26 enables communication of the interactive object data to the central controller 18. Concurrently with the transmission of the interactive object data to the environmental sensors 16 dispersed throughout the interactive environment, the sensors collect user information via facial recognition data, 3D time-of-flight system data, and other sensor data (block 78).
- This user information is utilized to facilitate efficiency in user identification within a crowded environment.
- the multiple forms of user identification enable the user pool to be narrowed down, and make identification of the users 12 in the interactive environment 14 more efficient.
- multiple interactive objects 20 and users 12 can be present in the same interactive environment 14.
- an effect sent from the central controller 18 can be personalized to individual users more efficiently.
- the central controller 18 is able to process all the collected sensor data from the plurality of users 12 and interactive objects 20, and utilize the data to determine the user 12 associated with each of the interactive objects 20.
- the device information transmitted from the interactive object 20 can include how long the interactive object 20 has been active in the interactive environment 14.
- the environmental sensor 16 can communicate data of the interactive object 20 and user data to the central controller 18.
- the central controller 18 will then transmit, based on the user profile, a personalized effect signal to the user’s interactive object communication circuitry 26 (block 80).
- the central controller 18 can recognize that a user has previously visited an area of an interactive environment, and is now revisiting the same area of an interactive environment.
- the activation of the special effect is detected by the environmental sensors 16 in the interactive environment 14 to activate or trigger a responsive effect based on the user profile to differentiate the effect from the previous time the user was visiting the area of an interactive environment (block 82).
- the guest experience can be further personalized through the addition of experience levels to the user profiles. These levels may be determined by how much time the user has spent with the interactive object 20, how many visits they have made to the interactive environment 14, and other additional criteria.
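The experience-level determination could be sketched as follows (an illustrative sketch; the scoring weights and level thresholds are assumptions, as the disclosure only names the criteria):

```python
def experience_level(minutes_with_object, visit_count):
    """Combine time spent with the interactive object and the number of
    visits to the interactive environment into a score, then bucket the
    score into an experience level stored on the user profile."""
    score = minutes_with_object + 30 * visit_count  # assumed weighting
    if score >= 300:
        return 3
    if score >= 120:
        return 2
    return 1
```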
- the user profile information can be communicated in conjunction with signals from the central controller 18 to the interactive object 20 so that the effects can be further differentiated based on the level of the user 12.
- This ability of the interactive object 20 to link to user profiles enables a single interactive object 20 to link to multiple users.
- the overall ability of hardware in the interactive environment and hardware in the user’s interactive object 20 to communicate user data enables dynamic user profiles to be established that include user information built up from previous visits and interactions in the environment, which creates a personalized and updated user experience throughout multiple visits.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280009006.8A CN116745009A (en) | 2021-01-04 | 2022-01-04 | User-specific interactive object system and method |
KR1020237026207A KR20230129181A (en) | 2021-01-04 | 2022-01-04 | User-specific interactive object system and method |
EP22701082.4A EP4271490A1 (en) | 2021-01-04 | 2022-01-04 | User-specific interactive object systems and methods |
CA3202117A CA3202117A1 (en) | 2021-01-04 | 2022-01-04 | User-specific interactive object systems and methods |
JP2023540508A JP2024502074A (en) | 2021-01-04 | 2022-01-04 | User-specific interactive object system and method |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163133625P | 2021-01-04 | 2021-01-04 | |
US63/133,625 | 2021-01-04 | ||
US202163172447P | 2021-04-08 | 2021-04-08 | |
US63/172,447 | 2021-04-08 | ||
US17/563,564 US20220214742A1 (en) | 2021-01-04 | 2021-12-28 | User-specific interactive object systems and methods |
US17/563,564 | 2021-12-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022147526A1 true WO2022147526A1 (en) | 2022-07-07 |
Family
ID=80050765
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/011104 WO2022147526A1 (en) | 2021-01-04 | 2022-01-04 | User-specific interactive object systems and methods |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4271490A1 (en) |
JP (1) | JP2024502074A (en) |
KR (1) | KR20230129181A (en) |
CA (1) | CA3202117A1 (en) |
WO (1) | WO2022147526A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024059219A1 (en) * | 2022-09-15 | 2024-03-21 | Universal City Studios Llc | System and method for integrating interactive objects and attraction experiences |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190143204A1 (en) * | 2017-09-28 | 2019-05-16 | James Andrew Aman | Interactive game theater with secret message imaging system |
US10360419B1 (en) * | 2018-01-15 | 2019-07-23 | Universal City Studios Llc | Interactive systems and methods with tracking devices |
US20190318539A1 (en) * | 2018-04-13 | 2019-10-17 | Infinite Kingdoms Llc | Smart tracking system |
Also Published As
Publication number | Publication date |
---|---|
KR20230129181A (en) | 2023-09-06 |
CA3202117A1 (en) | 2022-07-07 |
JP2024502074A (en) | 2024-01-17 |
EP4271490A1 (en) | 2023-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11379679B2 (en) | Interactive systems and methods with tracking devices | |
US20220214742A1 (en) | User-specific interactive object systems and methods | |
US20210027587A1 (en) | Interactive systems and methods with feedback devices | |
KR20180110472A (en) | System and method for controlling a stereoscopic emotion lighting | |
WO2022147526A1 (en) | User-specific interactive object systems and methods | |
US20210325580A1 (en) | Interactive object systems and methods | |
CN116745009A (en) | User-specific interactive object system and method | |
US20240135548A1 (en) | Systems and methods for tracking an interactive object | |
WO2024086229A1 (en) | Systems and methods for tracking an interactive object | |
US11797079B2 (en) | Variable effects activation in an interactive environment | |
US20240012471A1 (en) | Variable effects activation in an interactive environment | |
US20230419726A1 (en) | Interactive imagery systems and methods | |
US20210342616A1 (en) | Identification systems and methods for a user interactive device | |
KR20240050455A (en) | Interactive systems and methods with tracking devices | |
CN116829233A (en) | Variable effect activation in an interactive environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22701082 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 3202117 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023540508 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280009006.8 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 20237026207 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022701082 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022701082 Country of ref document: EP Effective date: 20230804 |