CN116745009A - User-specific interactive object system and method - Google Patents

User-specific interactive object system and method

Info

Publication number
CN116745009A
CN116745009A
Authority
CN
China
Prior art keywords
interactive object
interactive
user
special effect
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280009006.8A
Other languages
Chinese (zh)
Inventor
W. C. Yeh
R. E. Rogers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Universal City Studios LLC
Original Assignee
Universal City Studios LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 17/563,564 (published as US 2022/0214742 A1)
Application filed by Universal City Studios LLC filed Critical Universal City Studios LLC
Priority claimed from PCT/US2022/011104 (published as WO 2022/147526 A1)
Publication of CN116745009A
Legal status: Pending

Landscapes

  • Toys (AREA)

Abstract

A system, comprising: an interactive object, the interactive object comprising a special effect system; a controller that controls operation of the special effect system; and an environmental sensor configured to generate sensor data. The system includes a central controller operative to receive a user profile and to receive sensor data from the environmental sensors of an interactive environment. The central controller can then identify a user based on the collected sensor data, wherein the identified user is associated with the interactive object. The central controller can then characterize the movement of the interactive object based on the sensor data and communicate instructions to the controller of the interactive object to activate special effects of the special effect system, wherein the instructions are based on a user profile associated with the identified user and the characterized movement or action of the interactive object.

Description

User-specific interactive object system and method
Cross reference to related applications
The present application claims priority from U.S. provisional application serial No. 63/133,625 (titled "USER-SPECIFIC INTERACTIVE OBJECT SYSTEMS AND METHODS" and filed January 4, 2021) and U.S. provisional application serial No. 63/172,447 (titled "USER-SPECIFIC INTERACTIVE OBJECT SYSTEMS AND METHODS" and filed April 8, 2021), which are hereby incorporated by reference in their entirety for all purposes.
Background
The present disclosure relates generally to objects for use in an interactive environment, such as a gaming environment or amusement park. More particularly, embodiments of the present disclosure relate to addressable interactive objects that facilitate interactive effects, such as one or more special effects.
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be noted that these statements are to be read in this light, and not as admissions of prior art.
In recent years, it has become more common to create interactive environments containing props, scenery, audiovisual and other media elements, as well as special effects, that improve the customer experience and support a particular environment narrative. In some interactive environments, customers enjoy having their own objects, such as props or toys, that interact with the environment in various ways. In one example, a customer may wish to interact with an interactive environment using a handheld device to generate a particular effect that simulates an effect from a movie or game. In general, such interactive environments are crowded, and conventional techniques for wireless communication can be cumbersome when multiple customers each carry their own handheld objects.
Disclosure of Invention
The following summarizes certain embodiments commensurate in scope with the originally claimed subject matter. These embodiments are not intended to limit the scope of the present disclosure, but rather are intended to provide a brief summary of certain disclosed embodiments. Indeed, this disclosure may encompass a wide variety of forms that may be similar to or different from the embodiments set forth below.
According to an embodiment, a system includes an interactive object having a special effect system disposed in or on a housing and a controller disposed in or on the housing that controls operation of the special effect system, as well as a plurality of environmental sensors configured to generate sensor data. The system also includes a central controller operative to receive a plurality of user profiles from a plurality of users and to receive sensor data from the plurality of environmental sensors of an interactive environment. The central controller identifies a user of the plurality of users based on the sensor data, wherein the identified user is associated with the interactive object. The central controller then characterizes a movement or action of the interactive object based on the sensor data and communicates instructions to the controller of the interactive object to activate a special effect of the special effect system, wherein the instructions are based on a user profile of the plurality of user profiles associated with the identified user and on the characterized movement or action of the interactive object.
According to another embodiment, a method comprises: receiving sensor data from a plurality of sensors in an interactive environment; identifying a plurality of users and a plurality of interactive objects in the interactive environment based on the sensor data; associating an identified interactive object with an identified user; tracking movement of the identified interactive object using the sensor data; and delivering instructions to the identified interactive object to activate an on-board (on-board) special effect based on the tracked movements and a user profile of the identified user.
According to another embodiment, an interactive object includes a housing and a detectable label disposed on or in the housing that is operative to reflect a first portion of electromagnetic radiation from the environment. The interactive object includes communication circuitry on or in the housing operative to receive a second portion of the electromagnetic radiation from the environment, to transmit interactive object identification information for the interactive object in response to receiving the second portion of the electromagnetic radiation, and to receive special effect instructions. The interactive object includes a controller on or in the housing that receives the special effect instructions and generates a special effect command. The interactive object also includes a special effect system that receives the special effect command and activates a special effect based on the special effect command.
Drawings
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 is a schematic diagram of an embodiment of an interactive object system in accordance with the present technique;
FIG. 2 is a schematic diagram of features of an embodiment of an interactive object system in accordance with the present technique;
FIG. 3 is a flow diagram of an interactive object system in accordance with the present technique;
FIG. 4 is a schematic diagram of a communication system of an interactive object system in accordance with the present technique;
FIG. 5 is a flow chart of a method of assigning a user profile in accordance with the present technique; and
FIG. 6 is a flow chart of a method of detecting an interactive object and facilitating effect emission of the interactive object in accordance with the present technique.
Detailed Description
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles "a," "an," "the," and "said" are intended to mean that there are one or more of the elements. The terms "comprising," "including," and "having" are intended to be inclusive and mean that there may be additional elements other than the listed elements.
Users in an interactive environment or participating in an immersive experience may prefer to carry handheld objects or wear clothing elements consistent with a theme, such as swords, stuffed animal toys, hats, wands, jewelry, or other props. In one example, interaction is mediated by identifying the object (e.g., via object identification or wireless communication) and activating an external object or action based on the identification. Such an arrangement permits the object to be implemented as a relatively inexpensive item, with the more complex and expensive elements of the interaction located off-board, outside the passive device. While the feedback system can be positioned as a component of the environment, the ability to generate feedback in or on the handheld or wearable device can facilitate a deeper level of immersion into the environment.
Furthermore, individually addressing a user and/or an interactive object within a crowded environment containing many users and many interactive objects is challenging. A crowd, object, or scenery may obstruct the view of a particular type of sensor (e.g., camera, beam sensor). Further, some sensors may provide degraded feedback during severe weather or low-light conditions. These environmental factors may impair the ability of the interactive system to identify a particular user and direct interactive effects to the particular interactive object associated with that user. Furthermore, users and their interactive objects tend to move within the interactive environment, and accurately locating users and objects in crowded environments is likewise challenging. While facial recognition techniques may permit identification of a single user, facial recognition is computationally intensive and slow. Further, users in an interactive environment may wear hats, masks, sunglasses, or other costume items, which makes identification even more difficult.
The disclosed interactive object techniques permit a user and/or an interactive object to be identified and selected as a target to mediate an element of an interactive environment. In embodiments of the present disclosure, the system locates a particular user within a crowd and activates or directs a special effect to an interactive object carried or worn by that user without necessarily activating other interactive objects in the area. In contrast to communications mediated by mobile devices or other portable electronic devices, the disclosed techniques operate to identify relatively simple interactive objects that do not necessarily contain any communication package. Accordingly, the identification is based on environmental data collected within the interactive environment and used to identify and locate both the user and the interactive object. Further, the disclosed techniques account for situations in which a particular user is using an interactive object that is not preregistered to the system or otherwise linked to that particular user. In one example, users may switch or share interactive objects within a family or group. While wireless transfer of object identification information to the system permits identification of the interactive object itself, the identification information alone does not account for situations in which the user carrying the object changes throughout the day. The present technology permits interactive objects to be dynamically updated and addressed in a manner specific to the identity of the particular user interacting with the object at a particular time.
The interactive object system collects data from multiple sensing modalities (which may be aggregated and/or arbitrated) to permit users to be identified with greater accuracy while utilizing minimal or reduced processing power. The data can include radio frequency data, optical sensing data, 3D time-of-flight data, and data from other sensing systems. This facilitates identification of specific users in a crowd and allows a personalized effect to be directed to a user in a manner that can be linked to the identity of the interactive object carried by the user. This positive user identification also facilitates linking of the interactive object with a particular user, which further personalizes the user experience in a crowded environment.
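By way of illustration only, the aggregation and arbitration of multiple sensing modalities described above might be sketched as in the example below. The modality names, weights, scoring functions, and threshold are hypothetical assumptions for this sketch and are not part of the original disclosure.

```python
from dataclasses import dataclass

# Hypothetical per-modality confidence scores for a candidate user (0.0 - 1.0).
@dataclass
class ModalityReadings:
    rf_triangulation: float | None   # RF/RFID signal match (None if unavailable)
    optical: float | None            # camera/skeletal feature match
    time_of_flight: float | None     # 3D time-of-flight position match

# Assumed weights; a real system would tune or learn these.
WEIGHTS = {"rf_triangulation": 0.4, "optical": 0.4, "time_of_flight": 0.2}

def fused_score(readings: ModalityReadings) -> float:
    """Aggregate available modality scores, renormalizing over missing sensors."""
    total, weight_sum = 0.0, 0.0
    for name, weight in WEIGHTS.items():
        value = getattr(readings, name)
        if value is not None:          # arbitration: ignore obstructed/unavailable sensors
            total += weight * value
            weight_sum += weight
    return total / weight_sum if weight_sum else 0.0

def identify_user(candidates: dict[str, ModalityReadings], threshold: float = 0.6) -> str | None:
    """Pick the best-scoring candidate user, or None if no candidate is confident enough."""
    best = max(candidates, key=lambda uid: fused_score(candidates[uid]), default=None)
    return best if best and fused_score(candidates[best]) >= threshold else None
```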
Such interactive objects may, in one embodiment, be props or toys used within an interactive environment to permit greater variability in special effect control through personalized user interactions. A user interaction enables a user profile to be associated with that interaction, and the profile is then used to select special effects. Further, while embodiments of the present disclosure are discussed in the context of toys, props, or handheld objects, such as swords, wands, tokens, books, balls, or figurines, it should be understood that the disclosed embodiments may be used with other types of objects. Such objects may include wearable objects such as clothing, jewelry, bracelets, headwear, medallions, or glasses.
Further, the object may be a prop or a scenery item within the interactive environment. The interactive environment may be part of an immersive area, such as an amusement park, an amusement complex, a retail establishment, or the like. The disclosed systems and methods may include at least one interactive environment within a themed zone having a common theme, and may additionally include different interactive environments within a single themed zone. Further, the disclosed systems and methods may include additional or other interactive environments having different themes but contained within an immersive area, such as a theme park or entertainment venue. The interactive environment may be a live performance in which users in the audience are able to participate using their objects. When referring to an interactive environment, the interactive environment may include an area within which a user can interact with interactive elements within the area. Further, the interactive environment may also include different locations that are geographically separated from each other or dispersed throughout the amusement park. The interactive environment may also be at a remote location separate from the immersive area. For example, a user may be able to establish an interactive environment at home or at any other location via a user electronic device configurable to interact with an object.
Certain aspects of the present disclosure may be better appreciated with reference to FIG. 1, which generally illustrates the manner in which an interactive object control system 10 may be integrated within an interactive environment in accordance with the present embodiment. As shown, a plurality of users 12 (e.g., customers) move around one or more interactive environments 14 (e.g., which may be in an amusement park or entertainment venue). A user 12 may have a handheld or wearable interactive object 20 that moves throughout the interactive environment 14 with the user 12. The user's interactive object 20 facilitates output of user-specific special effects based on the interactions of the user 12, via the interactive object 20, within the interactive environment 14. The effects may provide feedback to user 12 that an interaction is occurring or has successfully occurred.
The interactive object control system 10 includes a plurality of environmental sensors 16 that are interspersed throughout the interactive environment 14 and, in some embodiments, among various areas of the interactive environment 14 or among different interactive environments. The environmental sensors 16 generate sensor data. The sensor data may be user data, such as image or camera feed data of one or more users in the interactive environment 14. Additionally or alternatively, the sensor data may be interactive object data, such as data indicative of the position or movement of an interactive object 20 in the interactive environment 14. In one embodiment, the acquired sensor data may be used to track the location of a user within and between different interactive environments 14. The environmental sensors 16 send sensor data to the central controller 18, and the data can be processed to identify the user 12 during user interaction within the interactive environment 14. The environmental sensors 16 may also track user movements throughout the interactive environment 14. In an embodiment, user data for all users 12 in the interactive environment can be aggregated and/or arbitrated to identify one or more users 12 within the interactive environment 14. Further, the sensor data may be used to identify one or more interactive objects 20 within the interactive environment 14. Still further, the interactive object control system 10 may target user-specific special effects to specific locations and/or specific interactive objects 20 in the interactive environment 14 based on the acquired data. Thus, in an embodiment, user 12 may experience interactive object 20 as having the appearance of a passive or low-technology device. However, the user's own actions with interactive object 20 may trigger on-board special effects on interactive object 20 and/or location-specific special effects in interactive environment 14. These effects can be selected by central controller 18 based on user actions, user profile information, or historical or other data associated with user 12 and/or interactive object 20. Thus, the effects experienced by the user 12, which may also be visible to other nearby users 12, are variable or unpredictable within the interactive environment 14, thereby increasing enjoyment.
In one example, the interactive object control system 10 is capable of tracking real-time user locations. For example, the user 12 may leave one interactive environment 14a and enter a second interactive environment 14b. The user's location is tracked via the environmental sensors 16 such that the location can be communicated to the central controller 18, and communication with the user's interactive object 20 can occur more efficiently throughout the plurality of interactive environments 14.
In one embodiment, the user's interactive object 20 is triggered upon entering the interactive environment 14 to transmit interactive object identification data to the environment sensor 16, which then transmits the interactive object identification data to the central controller 18. The central controller 18 receives the object identification data and uses the data to link a particular interactive object 20 to a particular user 12 of the plurality of users in the interactive environment 14. In one embodiment, the user identification occurs based on the received sensor data. The sensor data is evaluated to identify one or more users 12 in the area of the interactive environment. Characteristics of the user extracted from the sensor data, such as extracted facial features, skeletal features, gait, limb characteristics or movements, matches with previously identified articles of clothing, or detectable biometric characteristics, are then used to identify individual users 12 from a set of identified users 12 known to be generally within the area. The users 12 generally within the interactive environment 14 may form, based on the collected sensor data, a subset of the total set of all users within the attraction or theme park, and the user identification may be accelerated by dynamically updating this relevant subset as the pool of possible candidates.
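As a non-limiting illustration of narrowing identification to a dynamically updated candidate pool, the following sketch compares extracted user features only against users believed to be in the area. The feature vector contents, distance metric, and threshold are hypothetical and not taken from the disclosure.

```python
import numpy as np

# Hypothetical feature vectors (e.g., skeletal proportions, gait statistics) keyed by user ID.
known_profiles: dict[str, np.ndarray] = {}

def update_candidate_pool(all_users: set[str], users_detected_in_area: set[str]) -> set[str]:
    """Restrict matching to users believed to be inside this interactive environment."""
    return all_users & users_detected_in_area

def match_user(extracted_features: np.ndarray, candidates: set[str], max_distance: float = 0.5) -> str | None:
    """Match extracted features against the candidate subset rather than the full population."""
    best_id, best_dist = None, float("inf")
    for user_id in candidates:
        profile = known_profiles.get(user_id)
        if profile is None:
            continue
        dist = float(np.linalg.norm(extracted_features - profile))
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id if best_dist <= max_distance else None
```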
The identified users 12 are then linked to the interactive objects 20 by the central controller 18 so that the user profile can be updated based on user interactions via the linked interactive objects within the interactive environment 14. Further, the particular effect or action can be directed to the identified user 12 and/or linked interactive object 20.
In one embodiment, user 12 is able to employ interactive object 20 to perform a particular motion or gesture that is tracked via the environmental sensors 16 and communicated to the central controller 18. The central controller 18 then processes the motion data (in some instances in combination with the user profile data or the object profile data) to generate a special effect communication, such as a special effect instruction, that is personalized based on one or more of the motion data, the user profile data, and the object profile data. The interactive object 20 then receives this communication and activates a personalized special effect based on the particular user 12, the particular object 20, and/or the tracked motion. The user 12 is then able to perceive different special effects (e.g., visual, auditory, tactile) throughout the interactive environment 14, including, in some instances, on or through the object 20. The user's profile and, in some instances, the object's profile stored in the central controller 18 are updated to record the user's interaction data within the interactive environment 14.
The special effect instructions may be personalized according to a user profile associated with the user's interactive object 20. The user profile may include a user skill level (e.g., length of time using the interactive object, accuracy of gestures with the interactive object over time) and a user identity (e.g., a pre-selected theme or other user preference). For example, special effect commands corresponding to gestures performed by a user employing interactive object 20 may be personalized for user 12 according to the user profile by altering the audible, tactile, or visual aspects of the special effect commands sent to the user's interactive object 20. For example, characteristics such as the intensity of a particular effect (e.g., light intensity, audio intensity, haptic effect intensity) can be adjusted in a rule-based manner. The central controller 18 may identify the user 12 and receive a user profile including the user skill level and the user identity. The central controller 18 can set special effect commands based on the user skill level and the user identity. In one example, the special effect command, in combination with sensor data associated with gesture execution data received by the central controller 18, causes activation of a light on the interactive object 20 having a particular color and/or intensity corresponding to the user skill level and the user identity. User 12 may possess an intermediate skill level, and the central controller may determine from the sensor data that the gesture was performed accurately. This information is used by the central controller 18 to generate or adjust a specific special effect based on the skill level and the correct execution of the gesture. For example, completing a figure-eight gesture in combination with an intermediate skill level may be determined by the central controller 18 to correspond to a special effect command associated with illuminating the user's interactive object 20 green. Real-time special effect adjustments may occur during the course of a special effect (e.g., where a change in color intensity corresponds to a higher- or lower-quality change in a user action). Another user having a beginner skill level but performing the same figure-eight gesture may cause central controller 18 to generate a special effect command associated with an audible clap or spark effect.
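A minimal, rule-based sketch of this kind of effect selection is shown below. The skill levels, gesture names, effect descriptors, and the "preferred_color" preference are hypothetical placeholders rather than values from the disclosure.

```python
# Hypothetical rule table mapping (gesture, skill level, performed accurately) to an effect.
EFFECT_RULES = {
    ("figure_eight", "intermediate", True): {"type": "light", "color": "green", "intensity": 0.8},
    ("figure_eight", "beginner", True):     {"type": "audio", "clip": "clap_and_sparks"},
}

DEFAULT_EFFECT = {"type": "light", "color": "white", "intensity": 0.3}

def select_effect(gesture: str, profile: dict, accurate: bool) -> dict:
    """Choose a personalized special effect command from the user profile and gesture result."""
    rule_key = (gesture, profile.get("skill_level", "beginner"), accurate)
    effect = dict(EFFECT_RULES.get(rule_key, DEFAULT_EFFECT))
    # Personalize with an identity preference, e.g. a pre-selected theme color.
    if effect.get("type") == "light" and "preferred_color" in profile:
        effect["color"] = profile["preferred_color"]
    return effect

# Example: an intermediate user accurately completes a figure-eight gesture.
command = select_effect("figure_eight", {"skill_level": "intermediate"}, accurate=True)
```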
If an individual not associated with the interactive object 20 attempts to perform a gesture or otherwise uses the interactive object 20, the central controller may detect that the individual is not linked to the interactive object 20 and/or that the individual is linked to a different interactive object 20. The central controller 18 may generate an effect that conveys to the individual that they are not the correct user of the interactive object 20. For example, central controller 18 may send an audio special effect command to interactive object 20 to output an audio error message, indicating to the individual that they are not the correct user of the interactive object 20. In an embodiment, it is contemplated that each interaction with the interactive object 20 (regardless of user identity or skill level) can generate some perceivable feedback at the interactive object 20.
In another embodiment, user 12 may enter interactive environment 14, and special effect commands may be transmitted based on user profile data or object profile data before any movement is made with interactive object 20. For example, when user 12 enters interactive environment 14, the user's interactive object 20 is triggered to transmit object identification data into the area. The interactive object data is then transferred to the central controller 18. Users 12 may have specified characteristics within their user profiles that trigger special effects based on those particular characteristics. For example, a user can pick a particular identification color for their profile, and the central controller 18 can communicate this particular color output to the interactive object 20 and cause the interactive object 20 to emit light from LEDs of that particular color contained within the interactive object 20, corresponding to the user profile characteristic.
In one embodiment, the central controller 18 may send a general special effect command to all user interactive objects 20 currently within a defined area or within a particular interactive environment 14. For example, the central controller 18 may send a general special effect command that causes each interactive object 20 in the defined area or within the particular interactive environment 14 to emit a particular color of light. The defined area may be a prescribed distance or radius within the interactive environment 14 such that the interactive objects 20 in the defined area receive the same special effect command.
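One way to realize such an area-limited broadcast is a simple radius filter over tracked object positions, as in the hypothetical sketch below; the coordinate handling, radius, and command payload are illustrative assumptions.

```python
import math

def objects_in_area(object_positions: dict[str, tuple[float, float]],
                    center: tuple[float, float],
                    radius_m: float) -> list[str]:
    """Return IDs of interactive objects within a prescribed radius of an area center."""
    cx, cy = center
    return [obj_id for obj_id, (x, y) in object_positions.items()
            if math.hypot(x - cx, y - cy) <= radius_m]

def broadcast_effect(object_positions, center, radius_m, send_command) -> None:
    """Send the same general special effect command to every object in the defined area."""
    command = {"type": "light", "color": "blue"}  # hypothetical general command
    for obj_id in objects_in_area(object_positions, center, radius_m):
        send_command(obj_id, command)
```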
FIG. 2 shows a schematic diagram of the interaction of the interactive object control system 10. In one embodiment, the interactive object control system 10 receives or detects interactive object information (e.g., a unique device identification number or code) from one or more interactive objects 20 in an area of the interactive environment 14 that is within range of the transmitter 28 and the sensors 16 of the interactive object control system 10. In one embodiment, the information is based on a detectable mark 21 (e.g., a bar code, quick response code, patterned retroreflective mark, or pictograph) on the housing 22 of the interactive object 20 and is detected by the interactive object control system 10 such that the interactive object 20 provides the information passively. Interactive object 20 may include a mix of passive and active elements of different types that may permit activation and communication depending on the environment.
As shown, user 12 interacts with interactive object control system 10, which may include one or more emitters 28 (which may be all or a portion of an emission subsystem having one or more emission devices and associated control circuitry) that emit electromagnetic radiation (e.g., light, such as infrared, ultraviolet, or visible light, or radio waves) at one or more wavelengths. The interactive object control system 10 may also include one or more environmental sensors 16 (which may be all or part of a detection subsystem having one or more sensors, cameras, or the like, and associated control circuitry) that detect one or more of the signals transmitted from the interactive object 20, the detectable marker 21 on the interactive object 20, and the user 12 in the interactive environment 14, as described above with respect to FIG. 1. To control the operation of the emitter 28 and the environmental sensors 16 (the emission subsystem and the sensor subsystem) and to perform various signal processing routines resulting from the emission and detection processes, the interactive object control system 10 also includes a central controller 18 communicatively coupled to the emitter 28 and the environmental sensors 16. As shown, the interactive object control system 10 may include an interactive object 20 (shown as a handheld object) that includes a housing 22 having an outer surface 24, which in one embodiment includes a grip sensor, and an interior containing the communication circuitry 26. The housing 22 may also carry the detectable label 21.
As discussed, communication circuitry 26 may actively communicate a device identification of interactive object 20 to environmental sensor 16 in interactive environment 14. Communication circuitry 26 may include Radio Frequency Identification (RFID) tags. The communication circuitry 26 is capable of communicating the device identification of the interactive object to an environment sensor 16 (implemented as a receiver) of the interactive environment 14, which in turn communicates information to the central controller 18 of the interactive object control system 10. The communication circuitry 26 enables wireless communication of device identification information between the hardware of the interactive object 20 and the hardware of the interactive object control system 10 such that interactive object information relating to one or both of the user profile or the object profile can be dynamically updated and used to generate personalized commands that are sent from the central controller 18 to the interactive object 20 and/or the interactive environment 14.
In an embodiment, the transmitter 28 is external to the interactive object 20 (e.g., spaced apart from the interactive object 20). The transmitter 28 is operative to emit electromagnetic radiation, represented for illustrative purposes by an expanded beam of electromagnetic radiation, to selectively illuminate or flood the interactive environment 14 with electromagnetic radiation. The electromagnetic radiation beam may, in certain embodiments, represent a plurality of light beams (electromagnetic radiation beams) emitted from different sources of one or more emitters 28, which together form all or part of the emission subsystem. For example, the source may be a visible light source, an infrared light source, or the like, emitting electromagnetic radiation of a desired wavelength. Further, the emitter 28 may include one or more sources of different types, such as light emitting diodes, laser diodes, or other sources. The electromagnetic radiation beam is intended to generally represent any form of electromagnetic radiation that may be used in accordance with the present embodiments, such as forms of light (e.g., infrared, visible, UV) and/or other bands of the electromagnetic spectrum (e.g., radio waves). It is also presently recognized that, in certain embodiments, it may be desirable to use certain bands of the electromagnetic spectrum depending on various factors. For example, in one embodiment, it may be desirable to use a form of electromagnetic radiation that is not visible to the human eye or not within the audible range of human hearing, so that the electromagnetic radiation used does not distract customers from their experience. Further, it is presently recognized that certain forms of electromagnetic radiation, such as certain wavelengths of light (e.g., infrared), may be more desirable than others depending on the particular setting (e.g., whether the setting is "dark" or whether a person is expected to traverse the path of the beam). The detectable marker 21 may be, for example, a retroreflector operative to reflect light in a particular range (in one embodiment, the range of 800-1100 nm), reflecting the emitted light from the emitter 28. The reflected light is detected at one or more environmental sensors 16 to generate sensor data indicative of the presence or movement of the interactive object 20.
The interactive environment 14 may correspond to all or a portion of an amusement park attraction area or interactive environment, including stage shows, ride vehicle loading areas, waiting areas outside the entrance to a ride or show, interactive features interspersed within an amusement park, and so forth. The interactive environment 14 may also be mobile or transient, such as being incorporated within a parade or street show. The user 12 may interact with the interactive environment 14 individually, such as in a game, a scavenger hunt, or a portion of a non-linear narrative experience. In one embodiment, emitter 28 is fixed in position within the environment while interactive object 20 moves within the area of interactive environment 14 and receives electromagnetic radiation signals. Accordingly, interactive object 20 may be detected (e.g., located within interactive environment 14) and tracked via environmental sensors 16 in the area, and may be communicated with to activate one or more on-board special effects of interactive object 20 via the emitted and detected electromagnetic radiation of interactive object control system 10.
As generally disclosed herein, detection of the interactive object 20 is controlled by the central controller 18, which drives the transmitter 28. The activation may be indiscriminate, such that transmitter 28 continuously emits electromagnetic radiation at the appropriate wavelength or frequency corresponding to communication circuitry 26 and the communicated device information, and any interactive object positioned within interactive environment 14 and oriented toward transmitter 28 is activated to transmit a device identification signal to the environmental sensors 16 dispersed throughout interactive environment 14. The sensors may include radio frequency sensors, optical sensors, 3D time-of-flight sensors, facial recognition sensors, and other sensing systems to aid in the identification of the user 12 and interactive object 20. In an embodiment, as disclosed in greater detail herein, the activation may be selective, such that the central controller 18 operates to locate and process the object identification data transmitted via the communication circuitry 26 of the interactive object 20 and, upon locating and detecting the interactive object, drives the transmitter 28 to direct signals from the central controller 18 to the interactive object 20 such that activation of special effects of the interactive object 20 may be turned on or off depending on the desired narrative or user action.
For example, users 12 may enter interactive environment 14 along with their respective interactive objects 20. The interactive objects may wirelessly communicate interactive object information in the interactive environment 14 or may interact with (e.g., reflect) the emitted light to provide interactive object data to an environment sensor 16 in the interactive environment 14. The environmental sensor 16 may also obtain interactive object data from the detectable marks 21 on the interactive objects 20. The object identification data is then transferred to the central controller 18 for processing. The environmental sensors 16 (e.g., facial recognition sensors, 3D time-of-flight sensors, optical sensors) may also detect user-related information, such as image information. The central controller 18 may identify the user 12 via the sensor data to narrow the user pool in the interactive environment 14 so that the object identification data can be linked to a particular user 12 more efficiently.
The user 12 may then employ their interactive object 20 to perform a motion or gesture. The motion data of the interactive object 20 is collected by the environmental sensors 16 in the interactive environment 14 and transmitted to the central controller 18. The central controller 18 then utilizes the user data in conjunction with the motion data to send a personalized effect response to the communication circuitry 26 of the interactive object 20 that has previously been linked to the user 12 by the central controller 18. If the user 12 has previously visited the interactive environment 14, the personalized effect response can be differentiated by the central controller 18 from previous effect commands sent to the user's interactive object 20 in the interactive environment 14. Specific gestures or movements performed with the interactive object 20 can also cause effect differentiation, and a particular motion performed by user 12 with interactive object 20 can trigger a motion-specific effect. In one embodiment, the motion data may be compared to a stored set of motions and evaluated for accuracy based on a preset quality metric. The effect, based on the accuracy and/or the performed motion, can be specified to correspond to a certain color of light emitted from the interactive object 20 or to another effect emitted from the special effect system of the interactive object 20.
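The comparison of tracked motion against a stored set of motions could, for example, be approximated by a template-matching quality metric such as the hypothetical sketch below; the resampling, normalization, and threshold choices are illustrative assumptions rather than the disclosed algorithm.

```python
import numpy as np

# Hypothetical stored gesture templates: each is an (N, 2) array of x/y samples.
GESTURE_TEMPLATES: dict[str, np.ndarray] = {}

def resample(path: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample a tracked motion path to a fixed number of points for comparison."""
    idx = np.linspace(0, len(path) - 1, n)
    return np.stack([np.interp(idx, np.arange(len(path)), path[:, d])
                     for d in range(path.shape[1])], axis=1)

def quality_metric(tracked_path: np.ndarray, template: np.ndarray) -> float:
    """Return a 0-1 quality score: 1.0 means the tracked path matches the stored motion exactly."""
    a, b = resample(tracked_path), resample(template)
    a = (a - a.mean(axis=0)) / (a.std() + 1e-9)   # normalize position and scale
    b = (b - b.mean(axis=0)) / (b.std() + 1e-9)
    error = float(np.mean(np.linalg.norm(a - b, axis=1)))
    return max(0.0, 1.0 - error)

def classify_motion(tracked_path: np.ndarray, threshold: float = 0.7) -> tuple[str | None, float]:
    """Find the best-matching stored gesture and its quality score."""
    scored = [(name, quality_metric(tracked_path, tmpl)) for name, tmpl in GESTURE_TEMPLATES.items()]
    if not scored:
        return None, 0.0
    name, score = max(scored, key=lambda item: item[1])
    return (name, score) if score >= threshold else (None, score)
```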
The central controller 18 may trigger special effects that change over the course of the effect based on user interactions with their interactive objects 20 in the interactive environment 14. In one example, a power boost device is controlled to increase the brightness of the special effect output via interactive object 20. Signal boosting may be achieved by a controllable radio frequency energy emitter or, in the example of optical power harvesting, by an additional infrared-emitting power source. The radio frequency transmitter may steer and/or focus a radio frequency energy delivery beam from the radio frequency transmitter to a particular interactive object 20 based on the detected position of the interactive object 20. Steering and focusing the beam to the location of the interactive object 20 increases the power available to the interactive object 20 and allows the device to use the additional available power to output special effects at a higher intensity and/or brightness relative to other nearby interactive objects 20. In this way, individual interactive objects 20 in a group can be singled out, for example, to form a high-intensity beam. The radio frequency energy delivery may comprise Ultra High Frequency (UHF) energy delivery to power special effects of the interactive object 20. The change in brightness may be dynamic and tied to a user action employing interactive object 20, such that brightness increases when the user is improving, moving closer to a target (e.g., getting "closer" to finding an object), or performing a movement pattern with a higher quality metric, and brightness decreases when the user performs a relatively poorer (e.g., getting "farther"), less accurate (lower quality metric) movement pattern. These user actions are tracked in real time by the environmental sensors 16 so that the system 10 can provide feedback in substantially real time via output on the interactive object 20. Accordingly, the nature of the special effect may be based on a quality metric being above or below a threshold. The quality metric may be based on the accuracy of the motion pattern of the interactive object 20, the distance of the interactive object 20 from a target (being within a certain distance corresponding to exceeding the quality threshold), or the interactions of the interactive object 20 within the interactive environment 14. In another embodiment, the intensity of the brightness may vary depending on the stage of activation. Alternatively, discrete color illumination can be tied to a specific interaction, a specific gesture using interactive object 20, or completion of a series of gestures. In another embodiment, the special effect alterations discussed above, as well as the use of illumination and color discussed in connection with the various embodiments herein, may be expressed through other sensory effects, including haptics (such as vibration of the interactive object 20) or sound (such as tones emitted by the interactive object 20).
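A simple way to tie output brightness to a real-time quality metric and threshold, as described above, is sketched below; the scaling constants and command format are hypothetical.

```python
def brightness_from_quality(quality: float, threshold: float = 0.5,
                            base: float = 0.2, span: float = 0.8) -> float:
    """Map a 0-1 motion quality metric to an LED brightness level.

    Below the threshold the object dims toward a base level; above it the
    brightness rises toward full output, giving "closer/farther" style feedback.
    """
    quality = min(max(quality, 0.0), 1.0)
    if quality < threshold:
        return base * (quality / threshold)                              # dimming feedback
    return base + span * (quality - threshold) / (1.0 - threshold)       # boosted feedback

def update_effect(object_id: str, quality: float, send_command) -> None:
    """Push a real-time brightness update to a tracked interactive object."""
    send_command(object_id, {"type": "light", "brightness": brightness_from_quality(quality)})
```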
In another example, the power boost effect may be achieved by utilizing an external device (e.g., a mobile device) associated with user 12. The interactive object 20 may include a Near Field Communication (NFC) coil located in an interior of the interactive object 20. The NFC coil may facilitate charging and/or power boosting of the interactive object 20 by obtaining a charge via transfer of energy from an external device associated with the user (e.g., a mobile phone, or an NFC charger that may be implemented as a toy or a wearable device). The external device may include a sheath and/or a cradle for the interactive object 20 such that the interactive object 20 may be continuously charged as the user 12 moves around the interactive environment 14. The interactive object 20 may also include a rechargeable energy source (e.g., a battery or supercapacitor) that can buffer and store energy from a radio frequency transmitter, a user's mobile device, an accessory of the interactive object, or any combination thereof. The rechargeable energy source may be used to perform the power boost effect at any point in time and ensure that the interactive object 20 has stored energy for effect output regardless of the location of the interactive object 20. In another embodiment, the NFC coil may be capable of pairing the interactive object 20 with the user's mobile device to allow interactivity between the user's mobile device and the user's interactive object 20. For example, the mobile phone may be paired with the user's interactive object 20, allowing the interactive object to transfer data to the mobile device. Interactive object 20 performance data may be processed and displayed to user 12 via an application on the mobile device so that the user can view their performance statistics in real time.
In another embodiment, the interactive object 20 may be recharged throughout the day when on display and/or not being used by the user 12. The interactive object 20 may be recharged using the optical power harvesting method described above. The interactive object storage area may include a radio frequency transmitter that continuously transmits energy toward the storage area to recharge the interactive objects 20 when not in use, so that they are fully charged when the user 12 retrieves them. The interactive objects 20 in the storage area may also be charged via a near-field device, which may be incorporated into a shelf unit or other storage space. Such a near-field charging method may be used as a one-to-one top-off (e.g., charging) method. The optical power harvesting method may be used with the interactive object 20 in combination with other charging methods, such as medium-to-long-range charging via Ultra High Frequency (UHF) radio frequency and charging using Near Field Communication (NFC) methods (e.g., NFC coils or near-field devices located within the interactive object 20). It should be appreciated that any of the above charging methods may be implemented separately or in combination throughout the user 12 interaction to power or charge the interactive object 20. Further, the discussed power harvesting techniques may be used to directly power the on-board special effects of the interactive object 20 and/or may be used to charge a battery or power storage device of the interactive object 20 that is in turn used to power the special effects.
The central controller 18 may detect and store historical data associated with past interactions between the user's interactive object 20 and other interactive objects. For example, the user's interactive object 20 may have interacted with an opponent's interactive object 20 during a combat scene. The central controller 18 may update the user's profile to include historical information regarding the interactions of the user 12 with the opponent's interactive object 20 during the combat scene. The central controller 18 may then detect at a later time that the user's interactive object 20 is attempting to combat the same opponent's interactive object 20. The central controller 18 may then retrieve the historical data, including the past combat scene data, and differentiate the special effect command sent to the user's interactive object 20 to activate a new special effect (based on the previous combat interactions).
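One hypothetical way to record such history and differentiate a repeat encounter is sketched below; the profile structure and effect names are assumptions made for illustration.

```python
from collections import defaultdict

# Hypothetical per-user interaction history: user_id -> list of opponent object IDs encountered.
interaction_history: dict[str, list[str]] = defaultdict(list)

def record_duel(user_id: str, opponent_object_id: str) -> None:
    """Update the user profile with a past interaction against another interactive object."""
    interaction_history[user_id].append(opponent_object_id)

def effect_for_duel(user_id: str, opponent_object_id: str) -> dict:
    """Return a different special effect when the same opponent has been faced before."""
    rematch = opponent_object_id in interaction_history[user_id]
    record_duel(user_id, opponent_object_id)
    if rematch:
        return {"type": "light", "pattern": "rematch_pulse"}   # differentiated effect
    return {"type": "light", "pattern": "first_encounter"}
```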
In another example, a user may enter the interactive environment 14, and an initial effect command may be sent to the interactive object 20 based on the interactive object identification, obtained via wireless transmission from the on-board communication circuitry 26 of the interactive object 20 to the central controller 18, and based on the user identification obtained via the environment sensor 16. This initial identification may enable central controller 18 to send an initial command based on the identification of the object 20 and the identification of the user 12. For example, interactive object 20 can receive an initial command to project a color of light from an LED housed within the special effect system of interactive object 20. This projection of the LED light color can be based on a user preference or the user's experience level with the corresponding interactive object 20. The user can then employ the interactive object 20 to perform a motion or gesture. Environmental sensors 16 disposed throughout the environment collect motion data of the interactive object, and the motion data is then transmitted to the central controller 18. The central controller 18, based on the motion data, can then send another special effect command to the interactive object 20. The communication circuitry receives the command sent from the central controller 18, and the interactive object outputs a different LED color based on the motion or gesture performed. This enables the user to observe a sustained output of effects from the interactive object 20 during the user's entire experience in the interactive environment 14.
For example, the interactive object 20 may be sent commands to execute a discrete lighting sequence throughout the user's entire experience in a particular interactive environment 14. For example, the interactive object 20 may illuminate a color LED based on an initial identification by the central controller 18 and a link to the user profile. User 12 may then perform a gesture or a series of gestures and, based on the accuracy of these gestures, the interactive object 20 may be sent a command to illuminate one or more LEDs of different colors, or a single color LED, in a prescribed sequence or together, depending on the accuracy of the performed gesture. For example, accurate execution of a gesture, as determined via the central controller 18, triggers the central controller 18 to send a second command to the interactive object 20 to illuminate an alternate color LED, or a sequence of alternate color LEDs, relative to the initial identification, which may be based on whether the gesture was accurately performed. The illumination of one or more specific color LEDs may correspond to a thematic aspect of the interactive experience. The colors may correspond to community or family affiliations stored in the user profile, corresponding to pre-selected color options, to connect the users 12 to their user profiles throughout the user experience.
In another embodiment, the mobile device of user 12 may be used to identify interactive object 20 and link user 12 to interactive object 20. The interactive object 20 may have high-level symbology (e.g., a rune and/or a series of runes) etched on the exterior of the interactive object 20. The runes may also be any other symbol or etching system used to represent a unique pattern on the housing of the interactive object 20. The order of the runes may correspond to a unique identifier of the interactive object 20. For example, rune A and rune B can appear on a first interactive object in the order AB and on a second interactive object in the order BA. The runes are order-specific, such that the order AB corresponds to a unique identifier that is different from the order BA. This order-specific reading of the runes enables a greater number of unique identifiers to be available using a smaller number of runes.
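As a purely illustrative sketch, an order-sensitive mapping from an etched rune sequence to a unique identifier could look like the following; the rune alphabet and lookup table are hypothetical.

```python
from itertools import permutations

# Hypothetical rune alphabet etched on object housings.
RUNES = ["A", "B", "C", "D"]

# Because order matters, sequences of length k yield P(n, k) identifiers,
# e.g. 4 runes taken 2 at a time give 12 distinct IDs ("AB" != "BA").
IDENTIFIER_TABLE = {"".join(seq): index for index, seq in enumerate(permutations(RUNES, 2))}

def object_id_from_runes(rune_sequence: list[str]) -> int | None:
    """Resolve an etched, order-specific rune sequence to a unique object identifier."""
    return IDENTIFIER_TABLE.get("".join(rune_sequence))

# Example: "AB" and "BA" map to different identifiers.
first_id = object_id_from_runes(["A", "B"])
second_id = object_id_from_runes(["B", "A"])
```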
To link the user 12 to the interactive object 20, the user 12 may utilize their mobile device to scan the interactive object 20 or take a picture of the interactive object 20 with a camera of the mobile device. The user 12 may also download an application on their mobile device that can detect the runes from photographs obtained via the camera of the mobile device. The application may have access to a database containing the unique identifiers for each combination of runes. The mobile device may obtain the user 12 information via the application and link the user's interactive object 20 with the user information via the unique identifier obtained from the runes. The mobile device may be configured via the application to communicate the user information and associated interactive object information to the central controller 18. The central controller may utilize the user 12 and interactive object information to communicate special effect commands (based on the user 12 being associated with the interactive object 20 via the mobile device). Such a method can be implemented in conjunction with the environmental sensor 16 approach of identifying users 12 and linking each user 12 to their respective interactive object 20. Identification of interactive objects 20 via the user's mobile device can be combined with the environmental sensors 16 to aid identification in crowded environments or to supplement user identification by the environmental sensors 16.
The special effects of the interactive object 20 can be altered or selected based on user actions. In another embodiment, the special effect command may be determined based on a gesture of interactive object 20, a verbal command by the user 12 of interactive object 20, a user profile that includes a level of the user 12, or any combination thereof. Certain gesture and verbal command combinations may be associated with special effects that are more intense or less frequently generated than those for individual gestures or verbal commands alone. For example, user 12 may employ interactive object 20 to perform a first gesture without reciting a verbal command. The central controller 18 may receive sensor data related to the gesture performed with the interactive object 20 and link the interactive object 20 to the user 12 and the user profile corresponding to the user 12. Central controller 18 may then transmit a special effect command to interactive object 20 based on the gesture performed by the user with interactive object 20 and the user profile. The user 12 may alternatively perform the first gesture in combination with a verbal command. The central controller 18 may be sent sensor data including data related to the performed gesture and the verbal command. Central controller 18 may generate a special effect command based on the gesture and verbal command that is different from the command for the gesture-only case. This enables special effect command generation to be differentiated based on multiple combinations of gestures and verbal commands. Special effects may also be differentiated depending on the skill level of the user associated with the user profile, as previously discussed. The user 12 may then be able to receive more personalized feedback and attempt more combinations of gestures and verbal commands.
Further, the special effect command may specify an intensity level of illumination to be emitted from interactive object 20. The intensity level of the illumination may be tied to the performance of the motion or gesture, the experience level of the user 12, the user's previous experience in the interactive environment 14, or any combination thereof. The color of the illumination may also be specified via the special effect command and may be associated with a particular interactive object 20 or dependent on the proper completion of a particular gesture or series of gestures with the interactive object 20. For example, a user may initially enter an area of an interactive environment 14 of the plurality of interactive environments 14. The user's interactive object 20 sends the object identification information to the central controller 18 via a wireless transmission. The central controller 18 may then link the interactive object information to the corresponding user 12 in the interactive environment 14. The central controller may then send an initial special effect command specifying a particular intensity level and color of illumination based on the interactive object information. User 12 may then employ their interactive object 20 to complete a series of gestures. The environmental sensors 16 communicate motion data of the interactive object 20 to the central controller 18, which assesses the data for accuracy and sends a special effect command specifying a color and/or intensity level of the lighting effect, which may differ from the initial command depending on the accuracy of the gestures performed with the interactive object 20. For example, the interactive object 20 may illuminate a green LED with high intensity for proper execution of a gesture and a red LED with low intensity for improper execution of a gesture. The interval of illumination may also be specified via the special effect command received by interactive object 20, which may specify a longer period of illumination or different intensity levels based on the performed actions or the object identification information.
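The kind of command payload described here, carrying color, intensity, and illumination interval, might be structured as in the hypothetical sketch below; the field names and values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SpecialEffectCommand:
    """Hypothetical payload sent from the central controller to an interactive object."""
    object_id: str
    color: str            # e.g. "green" for an accurate gesture, "red" otherwise
    intensity: float      # 0.0 - 1.0 LED drive level
    duration_s: float     # illumination interval

def command_for_gesture(object_id: str, accurate: bool) -> SpecialEffectCommand:
    """Pick color and intensity from gesture accuracy, per the example in the text."""
    if accurate:
        return SpecialEffectCommand(object_id, color="green", intensity=0.9, duration_s=2.0)
    return SpecialEffectCommand(object_id, color="red", intensity=0.3, duration_s=1.0)
```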
In another embodiment, the environmental sensors 16 may not be able to identify a user to link to a particular interactive object of the plurality of interactive objects 20. The central controller 18 utilizes the received interactive object data and determines that a best match of the interactive object 20 with a user 12 is not possible. In this embodiment, the central controller sends a default effect command to the unmatched interactive object 20, selected from a plurality of default effects stored in the central controller 18. This enables a special effect to still be observed by the user 12 of the unmatched interactive object 20.
Fig. 3 shows a process flow diagram of a method 29 that permits association of users 12 with their respective interactive objects 20 in an interactive environment 14 and enables updating of a profile of the respective user based on interactions of the user 12 within the interactive environment 14. For example, the method 29 may efficiently select a user 12 from a pool of pre-identified users 12 without resorting to more computationally intensive de novo user identification techniques.
In this embodiment, the plurality of users 12 are free to move around in the area of the interactive environment of the plurality of interactive environments 14. As the user 12 moves around the interactive environment 14, the interactive object control system 10 obtains interactive object data and user data collected via the environment sensors 16 dispersed throughout the interactive environment 14. The data is received by the interactive object control system 10 (block 30), such as at the central controller 18 of the interactive object control system 10.
In one example, the system 10 receives unique identification information from the interactive object 20, or from a tag on any interactive object within range of the environmental sensors 16 of the interactive environment 14. The system 10 also receives location information associated with the interactive object 20. The location information may be based on radio frequency triangulation from the tag, such that the interactive object 20 is linked to specific identification information based on an estimate of the origin of the tag signal as sensed via the plurality of sensors 16. Thus, the system 10 is able to identify a particular interactive object 20 via wireless communication and link the interactive object 20 to a unique identification number. In another example, the location information is additionally or alternatively determined via sensing of a detectable marker on the interactive object 20. The detectable marker is located in space or tracked via the environmental sensors 16. The sensed location of the detectable marker can be associated with a particular identification tag by determining whether the identified interactive object 20 is co-located with the sensed detectable marker, or based on an estimated closest distance/most likely match between the detected retroreflective marker and the location of the origin of the triangulated RFID signal associated with (e.g., conveying) the identification information of the particular tag. Further, in some embodiments, the detectable marker may also encode identification information, and/or the interactive object 20 may contain a light emitter that emits identification information and that is tracked in space to provide position/motion information.
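An illustrative sketch, not taken from the patent, of associating each triangulated RFID signal origin with the closest detected retroreflective marker so that an identification number can be attached to a tracked position. The coordinates are assumed 2D floor positions in meters, and the function and tag names are assumptions.

```python
import math

def associate_tags_with_markers(tag_origins: dict, marker_positions: list) -> dict:
    """tag_origins: {tag_id: (x, y)}; marker_positions: [(x, y), ...]."""
    associations = {}
    for tag_id, origin in tag_origins.items():
        # Pick the marker whose sensed position is nearest to the tag's
        # estimated signal origin (closest-distance / most likely match).
        nearest = min(marker_positions, key=lambda m: math.dist(m, origin))
        associations[tag_id] = nearest
    return associations

tags = {"OBJ-017": (2.1, 4.8), "OBJ-042": (6.9, 1.2)}
markers = [(2.0, 5.0), (7.1, 1.0), (4.4, 3.3)]
print(associate_tags_with_markers(tags, markers))
```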
User identification may also facilitate interactive object identification. Some interactive objects 20 may be calibrated to, or linked to, a particular user profile. Thus, the user associated with the user profile is the most likely candidate to be holding the interactive object 20. The identity of a user within the area of the interactive environment (e.g., established via camera data from the sensors 16) can therefore be used to identify the associated interactive object 20. Further, the evaluation may be based on historical data. The interactive object 20 may be assumed to be linked to the nearest user in its vicinity within the interactive environment 14 until new data is received.
The system 10 analyzes the plurality of users and the interactive object identification data to select a best match between the respective users 12 present in the interactive environment 14 and the respective interactive objects 20 in the interactive environment 14 (block 32). The matching may be rule-based, as provided herein. In an embodiment, the system matches or associates the interactive object 20 with a single user 12 of the plurality of users 12 for each interaction. For example, the interactive environment 14 may include a plurality of users 12, some of whom do not carry interactive objects 20. Thus, the rules may permit some users 12 to remain unassociated with any interactive object 20. However, each interactive object 20 may be required to be associated with at least one user 12. Rule-based matching may use proximity as a factor, whereby a detected interactive object 20 may be associated with the closest user 12. However, an elongated interactive object 20 held at arm's length may potentially be closer to the head or face of a different user 12. Accordingly, additional factors, such as identifying that the object 20 is being held or worn in a suitable manner, may also be considered. As disclosed herein, the acquired data from the environmental sensors 16 may include camera data that is processed and provided to the matching analysis to evaluate these factors.
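A simplified sketch, under assumed names and weightings, of the rule-based matching described above: every detected interactive object is associated with exactly one user, some users may remain unmatched, proximity is the primary factor, and a camera-derived "held in a suitable manner" hint acts as a bonus.

```python
import math

def match_objects_to_users(objects: dict, users: dict, held_by: dict) -> dict:
    """
    objects: {object_id: (x, y)} sensed object positions
    users:   {user_id: (x, y)} sensed user positions
    held_by: {object_id: user_id} optional camera-based "held properly" hints
    Returns {object_id: user_id}.
    """
    matches = {}
    for obj_id, obj_pos in objects.items():
        def score(user_id):
            distance = math.dist(obj_pos, users[user_id])
            bonus = -0.5 if held_by.get(obj_id) == user_id else 0.0  # assumed weighting
            return distance + bonus
        matches[obj_id] = min(users, key=score)  # closest user wins, bonus breaks ties
    return matches

users = {"guest_a": (1.0, 1.0), "guest_b": (1.6, 1.0)}
objects = {"wand_1": (1.3, 1.0)}
print(match_objects_to_users(objects, users, held_by={"wand_1": "guest_b"}))
# -> {'wand_1': 'guest_b'} even though both guests are equally close in raw distance
```

In a real deployment the proximity term and the held/worn term would presumably be weighted and validated against camera data rather than fixed constants; the constants here are only for illustration.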
Further, the rules may identify a set of potential users within a larger area (such as a theme park) as candidates for user identification. In one embodiment, high quality image recognition and linking of user profiles to user images and/or other user characteristics (gait, clothing, appearance-based metrics, biometrics) is performed using more computationally intensive sensors and processing in a designated area (such as the main entrance of a theme park). Of the set of potential users, only a subset will be within a particular interactive environment 14 or attraction. Thus, rather than performing the recognition analysis de novo, the interactive object control system 10 may identify the best match within the set and use less computationally intensive user recognition analysis to permit more efficient operation of the interactive object control system 10.
The data collected by the environment sensors 16 is processed to narrow the pool of likely users of the interactive environment 14. Using this narrowed user pool, the system is able to match users 12 with their corresponding interactive objects 20 more efficiently. The ability to narrow the pool of possible users of a particular interactive environment 14 facilitates the identification of users 12 within a crowded interactive environment 14. By utilizing multiple forms of sensing to identify both the user and the interactive object, users 12 can be matched with their respective interactive objects more efficiently. Further, the interactive object control system 10 may be capable of identifying situations in which the interactive object 20 is shared among different users. When a first user interacts with the interactive environment 14 using the interactive object 20, the interactive object control system 10 may generate different special effect instructions (e.g., special effects of the interactive environment 14 and/or on-board special effects activated on the interactive object 20) relative to those generated for a second user 12 using the same interactive object 20. Thus, the interactive object 20 appears to respond differently to different users 12.
The user profile associated with the selected best matching user 12 and the identified interactive object 20 is then updated by the system to include the association. The user profile is also updated to include user location information corresponding to the particular interactive environment 14 and interactive object data related to the interactions of the interactive object 20 within the interactive environment 14 (block 34). Personalized special effect commands are sent to the interactive object 20 based on the user's previous experience in the interactive environment 14 (block 36). This enables the corresponding user profile to be updated as the user 12 enters a new interactive environment 14, so that the special effect commands sent to the user's interactive object 20 can be differentiated or altered on repeat visits based on a user profile that contains previous information about the locations the user 12 has visited, the user's experience in previously visited interactive environments 14, and the user's interactive object 20.
It should be appreciated that method 29 may be implemented to build a system-generated user profile that may be coordinated with a user-generated user profile stored on the interactive object control system 10, or that may be used independently for users without registered profiles. The user-provided profile information may include user age, preferences, attraction visit history, park visit history, home community information, payment information, and the like. The interactive object control system 10 (as provided herein) may also add interactive object data to the user profile. This data may be added in a manner that is not visible to the user but is accessed by the interactive object control system 10 to direct the interactive experience within the interactive environment 14.
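A hypothetical sketch of the kind of user profile record described above, combining user-provided fields with system-generated interaction data that the user never sees directly. The field names are assumptions made only for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UserProfile:
    # User-provided (registered) information; may be absent for unregistered users.
    user_id: str
    age: Optional[int] = None
    preferences: List[str] = field(default_factory=list)
    attraction_visit_history: List[str] = field(default_factory=list)

    # System-generated information, hidden from the user but used to direct effects.
    linked_object_ids: List[str] = field(default_factory=list)
    skill_level: int = 0
    last_known_environment: Optional[str] = None

profile = UserProfile(user_id="guest_123", preferences=["blue team"])
profile.linked_object_ids.append("wand_1")   # system-side update after matching
profile.skill_level += 1                     # system-side update after an interaction
```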
Fig. 4 is a schematic diagram of the interactive object control system 10 demonstrating communication between the interactive object 20 and various components of the interactive object control system 10 external to the interactive object 20. Additionally or alternatively, detection or positioning of the disclosed interactive object 20 as provided herein may involve an environmental sensor 16 (e.g., a proximity sensor, an optical sensor, an image sensor) of a system providing position or movement data of the interactive object 20.
In operation, the environmental sensors 16 sense the interactive object 20 and/or the user 12 by image recognition (e.g., interactive object recognition, facial recognition), detection of retroreflective markers on the interactive object 20, 3D time-of-flight systems, radio frequency sensing, and optical sensing, in addition to other sensing methods that detect the presence of the user 12 and/or the user's interactive object 20 in the interactive environment 14. The interactive object 20 can also include communication circuitry 26, which can include a radio frequency identification (RFID) tag that can be activated by transmitted electromagnetic radiation to output object identification data to the environmental sensors 16 in the interactive environment 14. This data can then be used by a processor 40 disposed in the central controller 18 to link the interactive object 20 to a particular user 12 in the interactive environment 14. Linking the user 12 to the user's interactive object 20 enables personalized special effect signals to be sent to the communication circuitry 26 of the interactive object 20 and enables the user profile to be updated via the central controller 18 based on the interactions of the user 12 within the interactive environment 14. The special effect signal sent by the central controller 18 is then processed by the object controller 39 housed in the interactive object 20 and activates the special effect system 52, which is powered, for example, passively via power harvesting (e.g., optical power harvesting) or actively by a power source, to emit a special effect that is personalized to the user's profile. Further, the interactive object 20 may include an active or passive RFID tag that communicates device identification information. In one embodiment, the RFID tag may be a controlled backscatter RFID tag.
In the illustrated embodiment, the communication circuitry communicates interactive object device information to the central controller 18. In one embodiment, one or more sensors 46 of the interactive object 20 detect electromagnetic radiation projected into the interactive environment 14. The communication circuitry 26 transmits wireless signals or infrared light signals carrying the interactive object device data, for example via a radio frequency identification (RFID) tag. The environmental sensors 16 receive the interactive object device data and transmit this data to the central controller 18. The interactive object data is utilized by the processor 40 in conjunction with user identification data from the environmental sensors 16 and/or the memory 42. A personalized special effect signal is then passed back to the communication circuitry 26 based on the device and/or user identification. The communication circuitry 26 communicates commands to the object controller 39 of the interactive object 20. The object controller 39 can send commands to the special effect system 52 of the interactive object 20. The processor 48 and memory 50 enable special effect instructions to be stored and enable special effect activation and control corresponding to the commands sent.
In the illustrated embodiment, the environmental sensors 16 detect the presence of a user in the interactive environment 14 and collect user data in addition to tracking the interactive object 20 based on performed gestures. The environmental sensors 16 may include camera-based facial recognition sensors, 3D time-of-flight sensors, optical sensors, and radio frequency sensors. These environmental sensors 16 are dispersed throughout the interactive environment 14 so that the user 12 can be efficiently tracked and located, and personalized effect commands can be sent to the communication circuitry 26 of the user's associated interactive object 20. The environment sensors 16 can be used to identify the user 12 such that the user information and device information provided to the central controller 18 enable dynamic user profiles to be created and updated as the user 12 moves around the plurality of interactive environments 14. Identification of the user 12 corresponding to the interactive object 20 may be performed via facial recognition cameras dispersed throughout the interactive environment 14, grip recognition, and/or visual recognition.
The memory 42 of the central controller 18 may store user profiles of the plurality of users 12 that have previously been matched with the plurality of interactive objects 20 within the interactive environment. The user profiles can then be updated as user experiences with the users' interactive objects 20 occur throughout the plurality of interactive environments 14. The central controller 18 is capable of updating a user profile based on the user's experience with their interactive object 20 within the area of the interactive environment of the plurality of interactive environments 14. This enables special effects to be differentiated based on user profiles throughout the interactive environments 14 and within multiple visits to the same interactive environment 14. The user profile can also contain information associated with the user, which can include user-specific characteristics determined before or after the first use of the object. These characteristics enable further differentiation of special effect commands based on a particular user 12. For example, if a user requests a particular affiliation with a community or selects a particular category from a preset selection of categories, the user profile can be updated to reflect this information. The central controller 18 may then send a special effect signal based in part on the user profile. This may include the output of a particular color LED, a sound effect, a haptic effect, a visual projection, or any combination thereof.
In some embodiments, the central controller 18 may be capable of linking only a threshold or preset number of users 12 to the interactive object 20. The number of users 12 that can be linked to the interactive object 20 may be limited to a particular threshold to maintain device security of the interactive object 20. For example, if a particular interactive object 20 has already been linked to two users, the central controller 18 may identify that the threshold number of users for the particular interactive object 20 is two and may decline to link a third user seeking to utilize the interactive object 20. The central controller 18 may send a signal (e.g., an effect) to the interactive object 20 held by the third user to communicate to the third user that the interactive object 20 cannot be linked to the third user and that the third user may need to obtain another interactive object 20. This may be done via a visual effect command that directs the interactive object 20 to illuminate in a particular color, a special effect command that directs the interactive object 20 to output a sound effect conveying that the interactive object cannot be linked, or any other effect method.
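A minimal sketch, under assumed names and an assumed threshold of two, of enforcing the per-object user limit described above: once the limit is reached, the controller returns a "cannot link" effect instead of creating a new association.

```python
MAX_LINKED_USERS = 2  # assumed threshold

def try_link_user(linked_users: set, user_id: str) -> dict:
    """Attempt to link user_id to an interactive object; return an effect command."""
    if user_id in linked_users:
        return {"effect": "welcome_back_glow"}
    if len(linked_users) >= MAX_LINKED_USERS:
        # Threshold reached: signal that the object cannot be linked to this user.
        return {"effect": "cannot_link", "color": "amber", "sound": "decline_tone"}
    linked_users.add(user_id)
    return {"effect": "link_confirmed", "color": "white"}

linked = {"guest_a", "guest_b"}
print(try_link_user(linked, "guest_c"))  # -> cannot_link effect; linked set unchanged
```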
In one example, a particular detected motion pattern of the interactive object 20 (based on interactive object data from the environmental sensors 16) may be evaluated by the central controller 18. Some types of movement patterns may be associated with activating red lights on the interactive object 20, while other types of movement patterns may be associated with activating blue lights. Based on the detected pattern, an instruction to activate the corresponding light color is transmitted to the interactive object 20. Special effect instructions may include instructions to set the intensity, hue, or interval pattern of light activation. One or more of these may be altered based on sensed characteristics of the movement pattern and/or user profile characteristics. In one embodiment, activation of the on-board special effect provides feedback to the user that a successful interactive experience has occurred, while the absence of a special effect or a diminished special effect (e.g., dim light activation) indicates that the interaction should be improved or altered.
The central controller 18, which drives the transmitters 28 and receives and processes data from the environmental sensors 16, may include one or more processors 40 and memory 42. The processors 40, 48 and memories 42, 50 may be referred to generally herein as "processing circuitry." As a specific but non-limiting example, the one or more processors 40, 48 may include one or more Application Specific Integrated Circuits (ASICs), one or more Field Programmable Gate Arrays (FPGAs), one or more general purpose processors, or any combination thereof. Further, the one or more memories 42, 50 may include volatile memory, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM), an optical drive, a hard disk drive, or a solid state drive. In some embodiments, the central controller 18 may form at least a portion of a control system configured to coordinate operation of various amusement park features, such as amusement park attractions and their control systems. It should be appreciated that the subsystems of the interactive object control system 10 may also contain similar features. In one example, the special effect system 52 may include processing capabilities via the processor 48 and memory 50. Further, the object controller 39, when present, may also include its own processing and memory components.
The central controller 18 may be part of a distributed, decentralized network of one or more central controllers 18. The decentralized network of one or more central controllers 18 may be in communication with a park central controller and a park central server. The decentralized network of the one or more central controllers 18 facilitates reducing the processing time and processing power required by the one or more central controllers 18 dispersed throughout the one or more interactive environments 14. The decentralized network of one or more central controllers 18 may be configured to obtain a user profile by requesting a profile from a profile feed stored in a park central server. The user profile feed may include user achievements associated with interactive objects, user experience levels, past user locations, and other user information. The one or more central controllers 18 may act as edge controllers that subscribe to a profile feed that includes a plurality of user profiles stored in a park central server and cache the feed to receive one or more user profiles contained in the feed.
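As a rough sketch of the edge-controller pattern described above, an edge controller might subscribe to the park-wide profile feed and cache the profiles it receives so that lookups during an interaction do not require a round trip to the park central server. The class, feed format, and default profile below are assumptions for illustration, not the patented design.

```python
class EdgeController:
    def __init__(self, environment_id: str):
        self.environment_id = environment_id
        self.profile_cache = {}  # user_id -> profile dict

    def on_feed_update(self, feed_batch: list) -> None:
        """Callback invoked when the subscribed profile feed publishes updates."""
        for profile in feed_batch:
            self.profile_cache[profile["user_id"]] = profile

    def get_profile(self, user_id: str) -> dict:
        # Fall back to a default profile if the user is not yet in the cached feed.
        return self.profile_cache.get(user_id, {"user_id": user_id, "skill_level": 0})

edge = EdgeController("environment_14_area_3")
edge.on_feed_update([{"user_id": "guest_123", "skill_level": 2, "achievements": ["spark"]}])
print(edge.get_profile("guest_123")["skill_level"])  # 2
```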
In some embodiments, the interactive environment 14 may include one or more central controllers 18. The one or more central controllers 18 within the interactive environment 14 may communicate with each other using a wireless mesh network (WMN) or other wireless and/or wired communication methods. Special effect commands may be generated by the central controller 18, by distributed nodes of the central controller 18, or by a dedicated local controller associated with the interactive environment 14, and communicated to the interactive object 20.
In another embodiment, the sensors 46 of the interactive object 20 may include an array of individual pressure or grip sensors that provide pressure information to the object controller 39. The array may be a capacitive or force-sensitive resistor array of at least 16 or at least 256 individual sensors. The object controller 39 can use the signals from the array, under passive power, to perform a calibration based on sensor data indicative of the characteristic grip biometrics of a particular user. The calibration process may activate feedback via the special effect system 52 (e.g., activating one or more light sources 53, activating a speaker, or activating another special effect in a pattern associated with matching the interactive object with a particular user). The calibration process may be limited to one user or a threshold number of users, such that only a preset number of users may be linked to the interactive object 20 to maintain device security of the interactive object 20.
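A hedged sketch, not the patented algorithm, of calibrating a grip signature from a 16-element pressure array and later checking whether a new grip matches the calibrated user. The averaging approach and matching threshold are assumptions chosen only to illustrate the idea.

```python
def calibrate_grip(samples: list) -> list:
    """samples: list of readings, each a list of 16 normalized pressure values."""
    n = len(samples)
    # Average each sensor channel across the calibration samples.
    return [sum(reading[i] for reading in samples) / n for i in range(16)]

def grip_matches(signature: list, reading: list, tolerance: float = 0.15) -> bool:
    """Mean absolute deviation from the calibrated signature, compared to a threshold."""
    deviation = sum(abs(s - r) for s, r in zip(signature, reading)) / len(signature)
    return deviation <= tolerance

calibration_samples = [[0.5] * 16, [0.6] * 16, [0.55] * 16]
signature = calibrate_grip(calibration_samples)
print(grip_matches(signature, [0.58] * 16))  # True: consistent with the calibrated grip
print(grip_matches(signature, [0.1] * 16))   # False: a very different grip pattern
```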
The interactive object 20 may contain a power source 56, which may be a battery or a power harvester, such as a radio frequency-based power harvesting antenna or an optical harvester. Power from the power source 56 (e.g., harvested power) is used to power one or more functions of the interactive object 20, such as the special effect system 52. For example, the power source 56 may power a plurality of light emitting diodes having red, green, blue, and white (RGBW) emitters.
As discussed herein, the interactive object 20 may provide object identification information via optical emissions detected by the environmental sensors 16. The light source 53 of the special effect system 52 may be used to transmit the optical information, or another light source may be used. Identification may be achieved through radio frequency, infrared, and/or RGBW-based visible light methods of identification code transmission. In the case of infrared or visible methods of identification code transmission, the output of the illuminated light source 53 can be modulated to encode the identification code signal while remaining imperceptible to the eye. When an RGBW light emitter is used for both output and identification, a second emitter in the infrared range can be used to transmit supplemental identifier information. Interactive object identification via a first technique (e.g., receiving RFID signals) may be combined with interactive object sensing or tracking via a second technique (e.g., detecting retroreflective markers). The identification information may be linked to tracking information (e.g., proximity assessment, matching with co-located users) as provided herein.
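Purely as an illustrative assumption about how such imperceptible modulation could be arranged (the patent does not specify an encoding), the identification bits could be Manchester-encoded: each bit occupies one "on" and one "off" half-period, so the time-averaged light output stays constant regardless of the identification code being sent, while a fast photodetector can still recover the bit stream.

```python
def manchester_encode(object_id: int, bits: int = 8) -> list:
    """Return a list of 0/1 chips; bit 1 -> (1, 0), bit 0 -> (0, 1)."""
    chips = []
    for i in range(bits - 1, -1, -1):
        bit = (object_id >> i) & 1
        chips.extend((1, 0) if bit else (0, 1))
    return chips

pattern = manchester_encode(0b10110010)
print(pattern)
print(sum(pattern) / len(pattern))  # always 0.5 -> constant average brightness
```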
In another embodiment, the central controller 18 may send special effect commands to external props or displays containing an external special effect system 59 in addition to sending special effect commands to the interactive object 20. For example, the user 12 can make gestures or motions with their interactive object 20. The environmental sensors 16 collect motion data and transmit the motion data to the central controller 18. The central controller 18 uses the user profile data and the motion data to send personalized special effect commands to the interactive object 20. In addition to the special effect commands sent to the interactive object 20, the central controller 18 may also send additional special effect commands to external props or displays in the area of the interactive environment. The special effect command sent to the external prop or display can include a visual effect, a sound effect, or another type of effect command.
Fig. 5 shows a process flow diagram of a method 60 of associating respective users 12 with their interactive objects 20. The method includes detecting a first or initial use of the interactive object 20 (block 62) and acquiring user identification data during the first use (block 64). Identification of the user is facilitated by data aggregated from a plurality of environmental sensors 16 (including facial recognition sensors, 3D time-of-flight systems, and other types of user recognition sensors). The sensor data collected from the environmental sensors 16 is aggregated such that the likelihood of each candidate user being present in the interactive environment 14 can be determined and the set of likely users narrowed. An individual user 12 is thus selected from the set of users 12 based on the user identification data. Aggregating data from the plurality of environment sensors 16 reduces the processing power required to identify users in the area of the interactive environment, accelerates user identification, and improves its accuracy. If a user cannot be identified via the facial recognition method, another sensing method can identify the user, further improving the accuracy of user identification. In the event that the user cannot be identified by any available method, a default user profile may be associated with the identified interactive object and used until user identification is made. The pool of possible users 12 for a given interactive object 20 can then be narrowed to a smaller user pool (block 66) until a single user 12 is selected. The selected user is linked to the interactive object 20. A first-use special effect command is communicated to the interactive object 20 to activate an on-board special effect that may be specific to the characteristics of the initial use (block 68).
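A simplified sketch of the narrowing step described above: each sensing modality proposes candidate users with a confidence score, the scores are combined, and a default profile is used when no candidate stands out. The modalities, scores, and threshold are illustrative assumptions rather than the patented method.

```python
def narrow_candidates(modality_scores: dict, threshold: float = 0.6):
    """modality_scores: {modality_name: {user_id: confidence in [0, 1]}}."""
    combined = {}
    for scores in modality_scores.values():
        for user_id, confidence in scores.items():
            combined[user_id] = combined.get(user_id, 0.0) + confidence
    if not combined:
        return None  # caller falls back to a default profile
    best_user = max(combined, key=combined.get)
    average = combined[best_user] / len(modality_scores)
    return best_user if average >= threshold else None

readings = {
    "facial_recognition": {"guest_a": 0.9, "guest_b": 0.2},
    "time_of_flight_proximity": {"guest_a": 0.7, "guest_b": 0.6},
}
print(narrow_candidates(readings) or "default_profile")  # -> guest_a
```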
In one embodiment, central controller 18 detects that no profile has been created for user 12 and user's interactive object 20, thus triggering the creation of a new profile to store user information. The user profile is assigned to the user 12 based on the identification of the user 12 via the environmental sensor 16 and the detection of the user's interactive object 20. The profile is stored in the central controller 18 so that it can be updated and utilized to deliver personalized special effect commands.
Fig. 6 shows a process flow diagram of a method 72 for detection of an interactive object 20 in an interactive environment. The method 72 may include steps stored as instructions in the memory 42 and executable by the one or more processors 40 of the central controller 18. It should be noted that in some embodiments, the steps of method 72 may be performed in a different order than those shown, or omitted entirely. Furthermore, some of the illustrated blocks may be performed in combination with each other.
In the illustrated embodiment, the method 72 includes emitting electromagnetic radiation into an area of an interactive environment of the plurality of interactive environments 14 (block 74). The communication circuitry 26 of the interactive object 20 is then triggered by the electromagnetic radiation to transmit a wireless signal communicating interactive object data to the central controller 18 in the interactive environment 14 (block 76). This transmission by the communication circuitry 26 of the interactive object 20 may be facilitated through the use of radio frequency identification (RFID) tags or optical transmitters. The communication circuitry 26 enables the transfer of the interactive object data to the central controller 18. Concurrently with the interactive object transmitting the interactive object data to the environmental sensors 16 dispersed throughout the interactive environment, the sensors collect user information via facial recognition data, 3D time-of-flight system data, and other sensor data (block 78). This user information facilitates efficient user identification within a crowded environment. The multiple forms of user identification enable the user pool to be narrowed and the identification of users 12 in the interactive environment 14 to be more efficient. For example, in a crowded environment, multiple interactive objects 20 and users 12 may be present in the same interactive environment 14. By combining the device data sent to the central controller 18 via the interactive object 20 with sensor data that identifies the user and tracks the movement of the interactive object, the effects sent from the central controller 18 can be more efficiently personalized to individual users.
For example, in a crowded environment, the central controller 18 can process all of the collected sensor data from the plurality of users 12 and the interactive objects 20 and utilize the data to determine the user 12 associated with each of the interactive objects 20. The device information transmitted from the interactive object 20 can include how long the interactive object 20 has been active in the interactive environment 14. For example, in the interactive environment 14, the environment sensors 16 can communicate interactive object 20 data and user data to the central controller 18. The central controller 18 then transmits a personalized effect signal to the user's interactive object communication circuitry 26 based on the user profile (block 80). Using the user profile information, the central controller 18 can identify areas of the interactive environment that the user has previously visited and recognize when the user revisits the same areas. In one embodiment, activation of a special effect is detected by the environmental sensors 16 in the interactive environment 14 to activate or trigger a responsive effect that is based on the user profile and differentiated from the effect produced the previous time the user visited the area of the interactive environment (block 82). The customer experience can be further personalized by adding experience levels to the user profile. These levels may be determined by how much time the user has spent with the interactive object 20, how many visits they have made to the interactive environment 14, and other additional criteria. User profile information can be communicated to the interactive object 20 via the signals coupling the interactive object 20 to the central controller 18, so that the effects can be further differentiated based on the level of the user 12. This ability of the interactive object 20 to link to a user profile enables a single interactive object 20 to link to multiple users. The combined ability of the hardware in the interactive environment and the hardware in the user's interactive object 20 to communicate user data enables a dynamic user profile to be built from interactions and previous visits in the environment, creating a personalized, continually updated user experience across multiple visits.
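A hypothetical sketch of computing an experience level from time spent with the interactive object and the number of visits, then differentiating the effect on a revisit. The level formula, field names, and effect names are assumptions used only to illustrate the flow above.

```python
def experience_level(minutes_with_object: float, visit_count: int) -> int:
    # Assumed scoring: one point per 30 minutes of use plus one point per visit.
    return int(minutes_with_object // 30) + visit_count

def effect_for_visit(profile: dict, environment_id: str) -> dict:
    visits = profile.setdefault("visits", {})
    visit_count = visits.get(environment_id, 0)
    level = experience_level(profile.get("minutes_with_object", 0.0), visit_count)
    visits[environment_id] = visit_count + 1  # record this visit in the profile
    if visit_count == 0:
        return {"effect": "first_visit_welcome", "level": level}
    return {"effect": "returning_visitor_upgrade", "level": level}

profile = {"minutes_with_object": 95.0}
print(effect_for_visit(profile, "environment_14"))  # first visit
print(effect_for_visit(profile, "environment_14"))  # differentiated on the revisit
```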
While only certain features of the invention have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the disclosure.
The technology presented and claimed herein is referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible, or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as "means for [perform]ing [a function]…" or "step for [perform]ing [a function]…", it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).

Claims (20)

1. An interactive object system, comprising:
an interactive object, the interactive object comprising:
a special effect system disposed in or on the housing of the interactive object;
a controller disposed in or on the housing of the interactive object, the controller controlling operation of the special effect system;
a plurality of environmental sensors configured to generate sensor data; and
a central controller operative to:
receiving a plurality of user profiles for a plurality of users;
receiving sensor data from a plurality of environmental sensors of an interactive environment;
identifying a user of the plurality of users based on the sensor data, wherein the identified user is associated with the interactive object;
characterizing movement or action of the interactive object based on the collected data from the environmental sensor; and
delivering instructions to the controller of the interactive object to activate a special effect of the special effect system, wherein the instructions are based on a user profile of the plurality of user profiles, the user profile being associated with the identified user and the characterized movement or action of the interactive object.
2. The system of claim 1, wherein the plurality of environmental sensors comprises a facial recognition sensor, a 3D time-of-flight sensor, a radio frequency sensor, an optical sensor, or any combination thereof.
3. The system of claim 1, wherein the sensor data comprises facial recognition data, optical data, radio frequency data, motion data, or any combination thereof.
4. The system of claim 1, wherein the interactive object comprises one or more on-board sensors.
5. The system of claim 1, wherein the interactive object comprises a handheld object, wherein the handheld object is a sword, a stick, a token, a book, a ball, or a statue.
6. The system of claim 1, wherein the interactive object comprises a wearable object, wherein the wearable object is a necklace, medallion, wristband, or hat.
7. The system of claim 1, wherein the interactive object comprises:
a plurality of pressure sensors disposed on an outer surface of the interactive object in a region corresponding to the grip portion; and
wherein the controller is programmed to:
receiving signals from the plurality of pressure sensors;
determining, based on the signals, that a grip is associated with the identified user; and
generating a control signal to the special effect system based on the grip.
8. The system of claim 1, wherein the special effect system further comprises one or more of a haptic feedback device, a light source, or a sound system activated in response to the instruction.
9. The system of claim 1, wherein the central controller is operative to characterize the movement or action by identifying a movement pattern of the interactive object.
10. The system of claim 1, wherein the activated special effect is based on a quality metric of the characterized movement or action.
11. The system of claim 10, wherein a first special effect is activated when the quality metric is above a threshold and a second special effect is activated when the quality metric is below the threshold.
12. The system of claim 10, wherein the activated special effect changes based on a corresponding change to the quality metric.
13. The system of claim 1, wherein the interactive object comprises an optical power harvester that powers the special effect system.
14. A method of activating special effects of an interactive object, comprising:
receiving sensor data from a plurality of sensors in an interactive environment;
identifying a plurality of users and a plurality of interactive objects in the interactive environment based on the sensor data;
associating an identified interactive object with an identified user;
tracking movement of the identified interactive object using the sensor data; and
communicating instructions to the identified interactive object to activate an on-board special effect of the interactive object based on the tracked movement and a user profile of the identified user.
15. The method of claim 14, comprising: emitting electromagnetic radiation into the interactive environment and detecting reflection of the electromagnetic radiation by retroreflective markers of the plurality of interactive objects, wherein tracking the movement of the identified interactive object includes tracking retroreflective markers associated with the interactive object.
16. The method of claim 15, comprising: receiving identification information wirelessly communicated by the plurality of interactive objects to identify the plurality of interactive objects.
17. The method of claim 16, comprising: associating the identified interactive object with the retroreflective marker by identifying, based on the sensor data, a retroreflective marker closest to an origin of a wireless signal associated with identification information of the identified interactive object.
18. An interactive object, comprising:
a housing;
a detectable marker disposed on or in the housing, operative to reflect a first portion of electromagnetic radiation from the environment;
communication circuitry on or in the housing, the communication circuitry operative to:
receiving a second portion of the electromagnetic radiation from the environment;
transmitting interactive object identification information of the interactive object in response to receiving the second portion of the electromagnetic radiation; and
receiving a special effect instruction;
a controller on or in the housing, the controller receiving the special effect instruction and generating a special effect command; and
a special effect system that receives the special effect command and activates a special effect based on the special effect command.
19. The interactive object of claim 18, wherein the detectable marker comprises a retroreflective marker.
20. The interactive object of claim 18, wherein the communication circuitry comprises a Radio Frequency Identification (RFID) tag or an optical communicator.
CN202280009006.8A 2021-01-04 2022-01-04 User-specific interactive object system and method Pending CN116745009A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US63/133625 2021-01-04
US63/172447 2021-04-08
US17/563564 2021-12-28
US17/563,564 US20220214742A1 (en) 2021-01-04 2021-12-28 User-specific interactive object systems and methods
PCT/US2022/011104 WO2022147526A1 (en) 2021-01-04 2022-01-04 User-specific interactive object systems and methods

Publications (1)

Publication Number Publication Date
CN116745009A true CN116745009A (en) 2023-09-12

Family

ID=87917313

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280009006.8A Pending CN116745009A (en) 2021-01-04 2022-01-04 User-specific interactive object system and method

Country Status (1)

Country Link
CN (1) CN116745009A (en)

Similar Documents

Publication Publication Date Title
US20220214742A1 (en) User-specific interactive object systems and methods
US11983596B2 (en) Interactive systems and methods with tracking devices
US20210027587A1 (en) Interactive systems and methods with feedback devices
US20220100282A1 (en) Gesture recognition (gr) device with multiple light sources generating multiple lighting effects
WO2022147526A1 (en) User-specific interactive object systems and methods
CN116745009A (en) User-specific interactive object system and method
JP2020513859A (en) Darts game device and darts game system providing lesson video
US20240135548A1 (en) Systems and methods for tracking an interactive object
US20240233138A9 (en) Systems and methods for tracking an interactive object
US20220308536A1 (en) Information processing device, method, and program
US11797079B2 (en) Variable effects activation in an interactive environment
US12032753B2 (en) Identification systems and methods for a user interactive device
US20210342616A1 (en) Identification systems and methods for a user interactive device
WO2022165203A1 (en) Variable effects activation in an interactive environment
CN116829233A (en) Variable effect activation in an interactive environment
KR20230164158A (en) Interactive experience with portable devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40100295
Country of ref document: HK