US8939840B2 - System and method for playsets using tracked objects and corresponding virtual worlds - Google Patents
- Publication number
- US8939840B2 (application number US 12/462,140)
- Authority
- US
- United States
- Prior art keywords
- connection
- virtual
- physical environment
- processor
- virtual world
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/23—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
- A63F13/235—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
-
- A63F13/12—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/28—Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/216—Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/58—Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/825—Fostering virtual characters
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/36—Details; Accessories
- A63H3/52—Dolls' houses, furniture or other equipment; Dolls' clothing or footwear
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/98—Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/57—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
- A63F2300/575—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player for trading virtual items
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- the present invention relates generally to interactive toys. More particularly, the present invention relates to interactive toys with sensing features and online play.
- Tangible objects are amenable to a pattern of play geared towards nurturing and care, such as for a pet.
- A user can feed, pet, bathe, groom, and perform other interactions to nurture and care for the pet.
- While these same actions might also be simulated in a virtual environment, they may not provide the same sensory feedback and immediacy as a tangible object, as opposed to a merely passive computer display that can only visually show the simulated actions occurring in the virtual environment.
- virtual online worlds and social networks provide compelling features that are not typically available to standalone, non-networked toys. These features may include, for example, communication with friends and colleagues from school or around the world, participation in chat groups or forums, the ability to view other people's virtual collections and environments or to display your own, and exploring fantastic virtual worlds that may be difficult to simulate in a tangible way. As consumers have grown to use and take advantage of these networked experiences, greater integration of these networked services into products provides an attractive selling point.
- FIG. 1 presents a system for providing a playset environment using tracked objects and corresponding virtual worlds, according to one embodiment of the present invention
- FIG. 2 presents a toy object interacting with an accessory object and a virtual object corresponding to the toy object, according to one embodiment of the present invention
- FIG. 3 presents a diagram demonstrating a toy object providing a time contextual interaction, according to one embodiment of the present invention.
- FIG. 4 shows a flowchart describing the steps, according to one embodiment of the present invention, by which a toy object can be used with an environment and connected to a virtual world corresponding to the environment.
- the present application is directed to a system and method for playsets using tracked objects and corresponding virtual worlds.
- the following description contains specific information pertaining to the implementation of the present invention.
- One skilled in the art will recognize that the present invention may be implemented in a manner different from that specifically discussed in the present application. Moreover, some of the specific details of the invention are not discussed in order not to obscure the invention. The specific details not described in the present application are within the knowledge of a person of ordinary skill in the art.
- the drawings in the present application and their accompanying detailed description are directed to merely exemplary embodiments of the invention. To maintain brevity, other embodiments of the invention, which use the principles of the present invention, are not specifically described in the present application and are not specifically illustrated by the present drawings.
- FIG. 1 presents a system for providing a physical playset environment using tracked objects and corresponding virtual worlds, according to one embodiment of the present invention.
- Diagram 100 of FIG. 1 includes physical environment 110 , object 120 , display 131 , host computer 140 , network 145 , server 150 , virtual environment 160 , and virtual object 170 .
- Physical environment 110 includes rooms 111 a - 111 f , with object 120 positioned in room 111 a .
- Server 150 includes virtual environment 160 .
- Virtual environment 160 also includes virtual object 170 in a position corresponding to room 111 a.
- FIG. 1 presents a broad overview of how figural toys can be used and tracked by playset environments and interfaced with virtual worlds.
- the figural toy in FIG. 1 is represented by a pet figure or plush toy, but could comprise other forms such as battle and action figures, dress up and fashion dolls, posing figures, game pieces for board games, robots, and others.
- Object 120 is placed in room 111 a of physical environment 110 , which may comprise a dining area of a playset or dollhouse.
- While physical environment 110 is presented as a house or living space, physical environment 110 may also comprise other forms such as a game board.
- Position tracking can be implemented using any method known in the art, such as infrared tracking, radio frequency identification (RFID) tags, electromagnetic contacts, electrical fields, optical tracking, and others. Additionally, an orientation in space might also be detected to provide even more precise position tracking. This might be used to detect whether object 120 is facing a particular feature in physical environment 110 . For example, if object 120 is facing a mirror within room 111 e , a speaker embedded in object 120 might make a comment about how well groomed his mane is, whereas if object 120 has his back to the mirror, the speaker would remain silent.
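- The facing check described above might be sketched as follows, assuming a flat two-dimensional playset; the helper name is_facing and the 60 degree field-of-view threshold are illustrative, not part of the disclosure:

```python
import math

def is_facing(heading_deg, obj_pos, feature_pos, fov_deg=60.0):
    """Return True if an object at obj_pos with the given heading
    (0 degrees = facing +x) has feature_pos within its field of view."""
    dx = feature_pos[0] - obj_pos[0]
    dy = feature_pos[1] - obj_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angle between the heading and the bearing
    diff = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

# Facing the mirror: comment about the mane; back turned: stay silent.
facing_mirror = is_facing(0.0, (0.0, 0.0), (1.0, 0.0))
back_to_mirror = is_facing(180.0, (0.0, 0.0), (1.0, 0.0))
```

- The speaker logic can then key off the returned boolean, speaking only when the feature falls within the field of view.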
- physical environment 110 can also interface with virtual environment 160 , a virtual world representation of physical environment 110 .
- physical environment 110 can communicate with virtual environment 160 by using host computer 140 to communicate with server 150 over network 145 .
- a connection from physical environment 110 to host computer 140 might be supported through a Universal Serial Bus (USB) connection using a client daemon or middleware program service running on host computer 140 , and network 145 might comprise a public network such as the Internet.
- the middleware program might, for example, be provided to the consumer on disc media or as a download when object 120 is purchased at retail.
- the middleware program may scan USB devices for the presence of physical environment 110 and act as a communications intermediary between physical environment 110 , object 120 , host computer 140 , and server 150 .
- physical environment 110 or object 120 may include embedded WiFi, Bluetooth, 3G mobile communications, WiMax, or other wireless communications technologies to communicate over network 145 directly instead of using the middleware program on host computer 140 as an intermediary to access network 145 .
- a client application or web browser on host computer 140 might access a virtual world or website on server 150 providing access to virtual environment 160 .
- This client application may interface with the middleware program previously described to communicate with physical environment 110 and object 120 , or communicate directly over network 145 if direct wireless communication is supported.
- the client application or web browser of host computer 140 may then present a visual depiction of virtual environment 160 on display 131 , including virtual object 170 corresponding to object 120 .
- Server 150 can also connect to other host computers on network 145 , allowing virtual environment 160 to provide interactivity with other virtual environments and features such as online chat, item trading, collection showcases, downloading or trading of supplemental or user generated content, synchronization of online and offline object states, and other features.
- The position of virtual object 170 in relation to virtual environment 160 can be made to correspond to the position of object 120 in relation to physical environment 110 .
- For example, if object 120 moves to an adjacent room on the right, virtual object 170 may similarly move right to the corresponding adjacent room in virtual environment 160 .
- When a real furniture accessory is placed in a room against the left wall and oriented facing towards the right, a corresponding virtual furniture accessory object may be instantiated and placed in the corresponding room of virtual environment 160 with the same position against the left wall and orientation facing towards the right. In this manner, a consumer can easily add a virtual version of a real object without having to enter a tedious unlocking code or complete another bothersome registration process.
- Conversely, when the real object is removed, the virtual object may also be removed, allowing the real object to provide the proof of ownership.
- Alternatively, inserting the real object into physical environment 110 may provide permanent ownership of the virtual object counterpart, which may then be moved to a virtual inventory of the consumer.
- the correspondence of real and virtual positions may be done in a continuous fashion such that positioning and orientation of objects within physical environment 110 are continuously reflected in virtual environment 160 , providing the appearance of real-time updates for the consumer.
- each object may have an embedded identifier to allow physical environment 110 to track several different objects or multiple objects simultaneously. This may allow, for example, special interactions if particular combinations of objects are present in specific locations. In this manner, various types of virtual online and tangible or physical interactions can be mixed together to provide new experiences.
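- As a minimal sketch of such combination detection (the room codes follow FIG. 1 ; object identifiers and interaction names are hypothetical), a lookup keyed on a room and the set of detected identifiers might be used:

```python
# Hypothetical table of special interactions keyed on a room and the
# set of object identifiers detected there (IDs are illustrative).
COMBO_TRIGGERS = {
    ("111c", frozenset({"pet", "ball"})): "play_fetch",
    ("111a", frozenset({"pet", "steak"})): "shared_meal",
}

def check_combo(room, detected_ids):
    """Return the special interaction for this room and combination of
    embedded identifiers, or None if no combination matches."""
    return COMBO_TRIGGERS.get((room, frozenset(detected_ids)))
```

- Using a frozenset makes the lookup independent of the order in which the identifiers were detected.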
- One example of mixing virtual online and tangible interactions might be virtual triggers affecting the tangible environment, also referred to as traps or hotspots. If object 120 is moved to room 111 c , the playroom, then a trigger within the corresponding area in virtual environment 160 might be initiated. As a result of the trigger, corresponding virtual object 170 might then enact a short drama story on a stage with other virtual objects, for example.
- Interactive elements of physical environment 110 such as switches, levers, doors, cranks, and other mechanisms might also trigger events or other happenings within virtual environment 160 . Similarly in the other direction, elements of virtual environment 160 might also affect physical environment 110 .
- Decorative wallpaper might be selected to decorate virtual environment 160 , causing corresponding wallpaper to be displayed in physical environment 110 using, for example, a scrolling paper roll with preprinted wallpaper, LCD screens, electronic ink, or other methods.
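- One way such virtual triggers, or hotspots, might be wired up is with callbacks registered per room; the HotspotManager class below is an illustrative sketch, not an interface from the disclosure:

```python
class HotspotManager:
    """Fires a virtual-world callback when a tracked object enters a
    room that has a registered trap, or hotspot."""

    def __init__(self):
        self._traps = {}   # room id -> callback
        self.fired = []    # log of (object id, room id) trigger events

    def register(self, room, callback):
        self._traps[room] = callback

    def on_position_update(self, obj_id, room):
        callback = self._traps.get(room)
        if callback is not None:
            self.fired.append((obj_id, room))
            callback(obj_id)

# Moving the pet into the playroom initiates the corresponding trigger.
manager = HotspotManager()
stage_events = []
manager.register("111c", lambda obj: stage_events.append(obj + " enacts a short drama"))
manager.on_position_update("pet", "111a")   # no trap here; nothing fires
manager.on_position_update("pet", "111c")   # playroom trap fires
```

- The same registration mechanism could run in the other direction, with virtual-world events invoking callbacks that actuate mechanisms in the physical playset.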
- FIG. 2 presents a toy object interacting with an accessory object and a virtual object corresponding to the toy object, according to one embodiment of the present invention.
- Diagram 200 of FIG. 2 includes object 220 , network 245 , virtual object 270 , and accessory object 290 .
- Object 220 includes processor 221 , timer 222 , feedback system 223 , memory 224 , and sensors 230 .
- Feedback system 223 includes light emitting diode (LED) array 237 , speaker 238 , and motor 239 .
- Memory 224 includes history 225 and state 227 .
- History 225 includes event 226 a and event 226 b .
- Sensors 230 include position sensor 231 , touch sensor 232 , light sensor 233 , microphone 234 , video camera 235 , and accessory detector 236 .
- Virtual object 270 includes virtual position 271 and state 277 .
- Accessory object 290 includes accessory identifier 296 .
- object 220 corresponds to object 120 from FIG. 1
- network 245 corresponds to network 145
- virtual object 270 corresponds to virtual object 170 .
- FIG. 2 presents a more detailed view of components that may comprise a toy object.
- a rechargeable battery or another power source may provide the power for the components of object 220 .
- Processor 221 may comprise any processor or controller for carrying out the logic of object 220 .
- an embedded low power microcontroller may be suitable for minimizing power consumption and extending battery life.
- Timer 222 may comprise, for example, a real-time clock (RTC) for keeping track of the present date and time.
- a separate battery might provide power specifically for timer 222 so that accurate time can be maintained even if primary battery life is drained.
- Feedback system 223 includes various parts for providing feedback to a consumer, such as emitting lights from LED array 237 , emitting sound through speaker 238 , and moving object 220 using motor 239 .
- Object 220 also includes sensors 230 for detecting and determining inputs and other environmental factors, including position sensor 231 for determining a position of object 220 , touch sensor 232 for detecting a touch interaction with object 220 , light sensor 233 for detecting an ambient light level, microphone 234 for recording sound, video camera 235 for recording video, and accessory detector 236 for detecting a presence of an accessory object.
- sensors may be used to initiate actions that affect the emotional and well-being parameters of object 220 , represented within state 227 .
- touch sensor 232 may be used to detect petting of object 220 , which may cause the happiness parameter of state 227 to increase or decrease depending on how often a consumer pets object 220 .
- Light sensor 233 may detect the amount of ambient light, which may be used to interpret how much time object 220 is given to sleep based on how long object 220 is placed in a dark environment. Sufficient sleep might cause the health parameter of state 227 to rise, whereas insufficient sleep may cause the health parameter to fall.
- Microphone 234 might be used to determine whether the consumer talks to object 220 or plays music for object 220 , with more frequent conversations and music positively affecting the friendship parameter.
- video camera 235 might record the expression of the consumer, where friendly and smiling expressions also positively affect the friendship parameter.
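- The sensor-driven parameter updates above might be modeled as a table of per-event deltas; the event names, delta values, and 0-100 clamping below are assumptions for illustration:

```python
# Assumed per-event parameter deltas mirroring the petting, sleep,
# conversation, and expression examples; values are illustrative.
SENSOR_EFFECTS = {
    "petted":       {"happiness": +5},
    "slept_enough": {"health": +10},
    "slept_little": {"health": -10},
    "heard_music":  {"friendship": +3},
    "saw_smile":    {"friendship": +2},
}

def apply_sensor_event(state, event):
    """Fold one sensor-derived event into the state, clamping each
    parameter to an assumed 0-100 range."""
    for param, delta in SENSOR_EFFECTS.get(event, {}).items():
        state[param] = max(0, min(100, state.get(param, 0) + delta))
    return state

state = {"happiness": 50, "health": 95, "friendship": 0}
apply_sensor_event(state, "petted")        # touch sensor 232
apply_sensor_event(state, "slept_enough")  # light sensor 233
```

- Clamping keeps repeated interactions from pushing a parameter past its range, as with the health parameter here.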
- The particular configuration of sensors, feedback systems, and state parameters shown in object 220 is merely exemplary, and other embodiments may use different configurations depending on the desired application and available budget. For example, an application with fighting action figures might focus more on combat skills and battle training rather than nurturing and well-being parameters.
- For accessory detector 236 , any of several different tracking technologies may be used to detect the presence and proximity of accessory objects, similar to position sensor 231 .
- accessory detector 236 might comprise a RFID reader, and accessory identifier 296 of accessory object 290 may comprise a RFID tag.
- accessory detector 236 can read accessory identifier 296 , which indicates the presence of accessory object 290 .
- This can be used to simulate various actions with object 220 .
- accessory object 290 might comprise a plastic food toy that triggers a feeding action with object 220 .
- Other examples might include a brush for grooming, soap for bathing, trinkets for gifts, and other objects.
- Placing object 220 within room 111 a might trigger feeding,
- within room 111 b might trigger sleep,
- within room 111 c might trigger play activities,
- within room 111 d might trigger a bath,
- within room 111 e might trigger grooming,
- and within room 111 f might trigger listening to music, any of which may then affect the parameters in state 227 as previously discussed.
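- The room-to-activity association above amounts to a simple lookup table; the room codes follow FIG. 1 and the action names are illustrative:

```python
# Room-to-activity table following the layout of FIG. 1; the action
# names are illustrative.
ROOM_ACTIONS = {
    "111a": "feed",
    "111b": "sleep",
    "111c": "play",
    "111d": "bathe",
    "111e": "groom",
    "111f": "listen_to_music",
}

def action_for_room(room):
    """Return the activity triggered by placing the object in a room,
    or None for an unrecognized position."""
    return ROOM_ACTIONS.get(room)
```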
- memory 224 also contains a record of past interactions with object 220 as history 225 .
- Within history 225 , two events 226 a - 226 b are already recorded.
- Event 226 a describes that object 220 was bathed with a timestamp of Monday at 8:00 pm. This event might be registered, for example, if the consumer moves the object 220 to room 111 d of FIG. 1 , or if accessory object 290 representing a piece of soap is brought within close proximity to accessory detector 236 of object 220 .
- Timer 222 may have been used to determine that the action occurred on Monday at 8:00 pm.
- event 226 b describing that object 220 was fed a steak on Tuesday at 6:00 pm may have been registered by moving object 220 to room 111 a of FIG. 1 or bringing accessory object 290 representing a steak within close proximity to accessory detector 236 .
- These events might also be evaluated to influence the parameters in state 227 , as previously described.
- timer 222 is constantly updated to reflect the current time, which may also be used to determine the effect of any events on state 227 .
- object 220 may be preprogrammed with a particular optimal schedule, or how often it prefers to be bathed, fed, petted, and so forth. If, for example, timer 222 progresses to the point where the last feeding event was a week ago, this information might be used by processor 221 to decrease the happiness parameter of state 227 , as the pet may prefer to be fed at least once a day. On the other hand, repeatedly feeding the pet at closely spaced time intervals may decrease the health parameter, as overfeeding the pet is not optimal either. In this manner, time contextual interactivity can be provided to the consumer, enabling richer and more realistic experiences.
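- A sketch of this time contextual evaluation, assuming an illustrative preferred feeding gap of one day and an overfeeding threshold of two hours, might look like:

```python
from datetime import datetime, timedelta

def feeding_effect(now, feed_times,
                   preferred_gap=timedelta(days=1),
                   overfeed_gap=timedelta(hours=2)):
    """Return (happiness_delta, health_delta) from the feeding history.

    Waiting longer than preferred_gap since the last feeding lowers
    happiness; feedings spaced closer than overfeed_gap lower health.
    """
    if not feed_times:
        return (-10, 0)
    happiness = -10 if now - max(feed_times) > preferred_gap else 0
    ordered = sorted(feed_times)
    overfed = any(b - a < overfeed_gap for a, b in zip(ordered, ordered[1:]))
    health = -10 if overfed else 0
    return (happiness, health)

now = datetime(2009, 8, 7, 18, 0)
week_old = feeding_effect(now, [now - timedelta(days=7)])  # hungry pet
```

- The processor would apply the returned deltas to the state parameters, penalizing both neglect and overfeeding.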
- object 220 may interface with virtual object 270
- object 220 may not always have a continuous connection to the virtual world with virtual object 270 , as the consumer might turn off the host computer providing the connection to network 245 , network 245 might have connectivity issues, wireless signals might receive strong interference, or other circumstances may terminate the availability of network communications over network 245 .
- History 225 may contain a record of events that occur offline, so that once network 245 is available again, state 277 of virtual object 270 can be synchronized with state 227 of object 220 .
- state 277 of virtual object 270 corresponding to object 220 is out of date, as the happiness and health parameters are only 30, whereas in object 220 the happiness and health parameters have increased by 20 to 50.
- This may be attributed to events 226 a - 226 b occurring offline, where event 226 a , bathing, raised the health parameter by 20 points, and where event 226 b , feeding steak, raised the happiness parameter by 20 points.
- state 277 might also be updated to synchronize with state 227 , also adding 20 points to the health and happiness parameters.
- virtual position 271 representing the position of virtual object 270 in a corresponding virtual world, may also be updated to correspond to a position based on input readings from position sensor 231 of object 220 , as previously described.
- This synchronization could also occur in the other direction as well.
- an online friend might send a virtual gift to virtual object 270 , causing the happiness parameter of state 277 to rise by 10 points.
- Once object 220 reestablishes a connection to virtual object 270 over the previously down but now available network 245 , those additional 10 points of happiness might also be added to state 227 of object 220 .
- In this manner, both online and offline play are supported, and actions within an offline or online context can be credited to the other context, thereby increasing consumer satisfaction and encouraging further play.
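- This two-way crediting might be sketched as follows, where each copy of the state receives the event deltas it has not yet seen; the parameter values mirror the 20-point and 10-point examples above:

```python
def synchronize(local_state, remote_state, local_events, remote_events):
    """Apply to each copy of the state the event deltas it has not yet
    seen, so the tangible and virtual states converge after a sync."""
    for param, delta in local_events:    # offline events -> virtual copy
        remote_state[param] = remote_state.get(param, 0) + delta
    for param, delta in remote_events:   # online events -> tangible copy
        local_state[param] = local_state.get(param, 0) + delta
    return local_state, remote_state

# Tangible state after offline bathing (+20 health) and feeding (+20 happiness)
toy = {"happiness": 50, "health": 50}
# Virtual state after an online friend's gift (+10 happiness)
virtual = {"happiness": 40, "health": 30}

toy, virtual = synchronize(toy, virtual,
                           [("health", 20), ("happiness", 20)],  # offline events
                           [("happiness", 10)])                  # online events
```

- After the exchange both copies agree, with the offline bathing and feeding credited online and the online gift credited offline.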
- While FIG. 2 focuses on toy objects, the elements contained within object 220 including processor 221 , timer 222 , feedback system 223 , memory 224 , and sensors 230 might also be embedded in a corresponding environment, or tangible playset.
- In that case, the playset, rather than the object, may gather sensor data and record events for exchanging with a virtual environment containing virtual object 270 .
- Alternatively, the environment and the object may share and divide particular duties with regard to communications to a virtual environment.
- For clarity, an embodiment where functions are primarily carried out by the toy object is presented in FIG. 2 .
- FIG. 3 presents a diagram demonstrating a toy object providing a time contextual interaction, according to one embodiment of the present invention.
- Diagram 300 of FIG. 3 includes objects 320 a - 320 b and accessory object 390 .
- Object 320 a includes timer 322 , history 325 a , state 327 a , articulating part 329 a , and LED 337 a .
- History 325 a includes event 326 a .
- Object 320 b includes history 325 b , state 327 b , articulating part 329 b , and LED 337 b .
- History 325 b includes event 326 b .
- objects 320 a - 320 b correspond to object 220 from FIG. 2
- accessory object 390 corresponds to accessory object 290 .
- FIG. 3 presents an example time contextual interaction that a toy object may provide.
- Object 320 a represents an initial state of the toy object, which is further explained through the information conveyed in timer 322 , history 325 a , and state 327 a .
- timer 322 indicates that the current time is Friday, 6:00 pm, or more than four days past the last time the pet was fed. This is also reflected in the low happiness parameter of 10 shown in state 327 a , which may have been calculated using predetermined rules regarding preferred feeding frequency.
- a speaker embedded in object 320 a might also, for example, play the sound of a grumbling stomach, or make a verbal request for food.
- When fed with accessory object 390 representing a steak, object 320 a may transition to the state shown by object 320 b .
- In state 327 b , the happiness parameter has increased drastically from 10 to 80.
- The action of feeding is noted in history 325 b as event 326 b .
- The visual indicators change to indicate the state of happiness, with LED 337 b glowing a bright red and articulating part 329 b presenting perked up ears.
- While history 325 b shows that event 326 b replaces event 326 a from history 325 a , alternative embodiments may preserve both events, which may be used, for example, to detect whether overfeeding has occurred by analyzing the frequency of feeding events over time. In this manner, a toy object can provide various time contextual interactions, providing deeper game play mechanics than objects focusing on canned and immediate responses.
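- Overfeeding detection over a preserved history might be sketched as a sliding-window count of feeding events; the one-day window and three-feeding limit below are illustrative assumptions:

```python
from datetime import datetime, timedelta

def is_overfed(history, window=timedelta(days=1), max_feedings=3):
    """Return True if more than max_feedings 'fed' events fall inside
    any sliding window of the given length."""
    feeds = sorted(t for kind, t in history if kind == "fed")
    for i, start in enumerate(feeds):
        if sum(1 for t in feeds[i:] if t - start <= window) > max_feedings:
            return True
    return False
```

- Because the history records timestamps with each event, the same log supports both neglect and overfeeding checks without additional sensors.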
- FIG. 4 shows a flowchart describing the steps, according to one embodiment of the present invention, by which a toy object can be used with an environment and connected to a virtual world corresponding to the environment.
- Certain details and features have been left out of flowchart 400 that are apparent to a person of ordinary skill in the art.
- a step may comprise one or more substeps or may involve specialized equipment or materials, as known in the art.
- steps 410 through 440 indicated in flowchart 400 are sufficient to describe one embodiment of the present invention, other embodiments of the invention may utilize steps different from those shown in flowchart 400 .
- step 410 of flowchart 400 comprises object 120 establishing a first connection with virtual environment 160 hosted on server 150 via host computer 140 and network 145 , wherein virtual environment 160 contains virtual object 170 corresponding to object 120 .
- physical environment 110 may include a USB connector connecting directly to host computer 140 , which in turn connects to a broadband modem to access network 145 , which may comprise the public Internet.
- FIG. 1 only shows one particular method of connectivity, and other methods, such as wirelessly connecting to network 145 , might also be utilized.
- step 420 of flowchart 400 comprises object 220 corresponding to object 120 updating state 277 of virtual object 270 corresponding to virtual object 170 using the first connection established in step 410 and history 225 .
- timer 222 is compared to the timestamps of events 226 a - 226 b recorded in history 225 to determine a net effect to apply to state 227 , which also applies to state 277 of virtual object 270 .
- events that are evaluated the same regardless of timer 222 may also exist, in which case timer 222 may not need to be consulted.
- the net effect can then be sent over network 245 using the first connection established previously in step 410 , so that state 277 of virtual object 270 can be correspondingly updated.
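As a minimal sketch of this net-effect computation, the toy might replay its recorded events against elapsed time on connection; the function name, the per-event values, and the hourly decay rule below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical per-event effects on a "happiness" parameter.
EVENT_EFFECTS = {"feed": +70, "play": +15}

DECAY_PER_HOUR = 1  # assumed: happiness drops 1 point per elapsed hour

def net_effect(events, timer_now):
    """Combine recorded events with elapsed time (cf. comparing timer 222
    against the timestamps in history 225) into a single delta that can be
    sent over the connection to update the shared state."""
    delta = 0
    for kind, timestamp in events:
        delta += EVENT_EFFECTS.get(kind, 0)
        # Time-contextual part: decay accrued since the event occurred.
        delta -= int(timer_now - timestamp) // 3600 * DECAY_PER_HOUR
    return delta

print(net_effect([("feed", 0)], timer_now=7200))  # +70 feed, -2 decay -> 68
```

Events that are evaluated the same regardless of the timer would simply skip the decay term.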
- step 430 of flowchart 400 comprises object 120 determining a position of object 120 using position sensor 231 .
- position sensor 231 may use a variety of technologies to implement three-dimensional position and orientation detection.
- object 120 may determine that it is positioned in room 111 a of physical environment 110 and oriented facing outwards towards the consumer.
- step 440 of flowchart 400 comprises object 120 sending the position determined in step 430 using the first connection established in step 410 to cause a position of virtual object 170 in virtual environment 160 to correspond to the position of object 120 in physical environment 110 .
- this could also occur on a continuous basis so that the movements of object 120 and virtual object 170 appear to be synchronized in real-time to the consumer.
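The continuous position mirroring described above might be sketched like this; the sensor and connection interfaces are stand-ins assumed for illustration, not APIs from the disclosure:

```python
import json
import time

def sync_position(read_sensor, send, interval=0.1, iterations=10):
    """Poll the position sensor (cf. position sensor 231) and forward each
    reading over the established connection, so that the position of the
    virtual object tracks the position of the toy object."""
    for _ in range(iterations):
        x, y, z, heading = read_sensor()
        send(json.dumps({"pos": [x, y, z], "heading": heading}))
        time.sleep(interval)

# Stand-ins for the sensor and the first connection:
readings = iter([(1.0, 0.0, 0.0, 90.0)] * 10)
sent = []
sync_position(lambda: next(readings), sent.append, interval=0.0)
print(len(sent))  # 10 readings forwarded
```

In practice the polling interval would be tuned so that movements of the two objects appear synchronized to the consumer.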
- position-tracked toy objects providing time-contextual interactivity can be used in both offline and online contexts.
- for example, the first connection might be explicitly terminated, an input might be detected from the variety of sensors contained in sensors 230 of FIG. 2 , history 225 may be correspondingly updated with new events, and a new second connection may be established with virtual object 270 over network 245 . State 277 may then be updated using this second connection to reflect the newly added events in history 225 . Additionally, as previously discussed, this process may also proceed in the other direction, where activity occurring to the virtual object in the online context can be credited to the tangible object when it is next connected.
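The offline record-then-sync cycle could look like the following sketch, where the class and method names are illustrative assumptions:

```python
class ToyObject:
    """Buffers events while offline; flushes them when a new connection
    to the virtual counterpart is established (cf. the second connection)."""

    def __init__(self):
        self.history = []      # events recorded while offline
        self.connection = None

    def record(self, kind, timestamp):
        self.history.append((kind, timestamp))

    def connect(self, connection):
        # Establishing a new connection triggers a state sync; only the
        # toy -> virtual world direction is shown here, though the reverse
        # (crediting online activity to the toy) would also occur.
        self.connection = connection
        for event in self.history:
            connection.append(event)  # stand-in for sending over the network
        self.history.clear()

toy = ToyObject()
toy.record("feed", 100)
toy.record("play", 200)
server_log = []                 # stand-in for the virtual object's state
toy.connect(server_log)
print(server_log)               # [('feed', 100), ('play', 200)]
```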
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Environmental & Geological Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Toys (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
Description
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/462,140 US8939840B2 (en) | 2009-07-29 | 2009-07-29 | System and method for playsets using tracked objects and corresponding virtual worlds |
EP10006315.5A EP2322258B1 (en) | 2009-07-29 | 2010-06-18 | System and method for playsets using tracked objects and corresponding virtual worlds |
US13/860,434 US9339729B2 (en) | 2009-07-29 | 2013-04-10 | System and method for playsets using tracked objects and corresponding virtual worlds |
US15/096,090 US9539506B2 (en) | 2009-07-29 | 2016-04-11 | System and method for playsets using tracked objects and corresponding virtual worlds |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/462,140 US8939840B2 (en) | 2009-07-29 | 2009-07-29 | System and method for playsets using tracked objects and corresponding virtual worlds |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/860,434 Continuation US9339729B2 (en) | 2009-07-29 | 2013-04-10 | System and method for playsets using tracked objects and corresponding virtual worlds |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110028219A1 US20110028219A1 (en) | 2011-02-03 |
US8939840B2 true US8939840B2 (en) | 2015-01-27 |
Family
ID=43527534
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/462,140 Active 2031-09-25 US8939840B2 (en) | 2009-07-29 | 2009-07-29 | System and method for playsets using tracked objects and corresponding virtual worlds |
US13/860,434 Active 2030-03-19 US9339729B2 (en) | 2009-07-29 | 2013-04-10 | System and method for playsets using tracked objects and corresponding virtual worlds |
US15/096,090 Active US9539506B2 (en) | 2009-07-29 | 2016-04-11 | System and method for playsets using tracked objects and corresponding virtual worlds |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/860,434 Active 2030-03-19 US9339729B2 (en) | 2009-07-29 | 2013-04-10 | System and method for playsets using tracked objects and corresponding virtual worlds |
US15/096,090 Active US9539506B2 (en) | 2009-07-29 | 2016-04-11 | System and method for playsets using tracked objects and corresponding virtual worlds |
Country Status (2)
Country | Link |
---|---|
US (3) | US8939840B2 (en) |
EP (1) | EP2322258B1 (en) |
Families Citing this family (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9501203B1 (en) * | 2010-06-16 | 2016-11-22 | Zynga Inc. | System and method for modifying a game state of a player of a social game |
US9186575B1 (en) | 2011-03-16 | 2015-11-17 | Zynga Inc. | Online game with animal-breeding mechanic |
WO2012162090A2 (en) | 2011-05-20 | 2012-11-29 | William Mark Corporation | App gadgets and methods therefor |
CA3164530C (en) * | 2011-10-28 | 2023-09-19 | Magic Leap, Inc. | System and method for augmented and virtual reality |
US9079113B2 (en) * | 2012-01-06 | 2015-07-14 | J. T. Labs Limited | Interactive personal robotic apparatus |
JP6243356B2 (en) | 2012-02-06 | 2017-12-06 | ホットヘッド ゲームズ インコーポレイテッド | Virtual opening of card boxes and packs |
US9649565B2 (en) * | 2012-05-01 | 2017-05-16 | Activision Publishing, Inc. | Server based interactive video game with toys |
US9849389B2 (en) * | 2012-10-03 | 2017-12-26 | Gree, Inc. | Method of synchronizing online game, and server device |
CN104718007A (en) * | 2012-10-04 | 2015-06-17 | 迪士尼企业公司 | Interactive objects for immersive environment |
GB2507073B (en) | 2012-10-17 | 2017-02-01 | China Ind Ltd | Interactive toy |
US9764229B2 (en) * | 2013-05-23 | 2017-09-19 | Disney Enterprises, Inc. | Unlocking of digital content based on geo-location of objects |
US9656172B2 (en) | 2013-08-16 | 2017-05-23 | Disney Enterprises, Inc. | Unlocking of virtual content through geo-location |
US9981197B2 (en) | 2014-02-06 | 2018-05-29 | Seebo Interactive, Ltd. | Connected kitchen toy device |
US9934613B2 (en) * | 2014-04-29 | 2018-04-03 | The Florida International University Board Of Trustees | Systems for controlling a movable object |
WO2015185629A2 (en) | 2014-06-06 | 2015-12-10 | Lego A/S | Interactive game apparatus and toy construction system |
EP3154652A4 (en) * | 2014-06-16 | 2018-02-21 | Watry, Krissa | Interactive cloud-based toy |
US20170189804A1 (en) * | 2014-06-23 | 2017-07-06 | Seebo Interactive Ltd. | Connected Toys System For Bridging Between Physical Interaction Of Toys In Reality To Virtual Events |
US10478723B2 (en) | 2014-06-30 | 2019-11-19 | Microsoft Technology Licensing, Llc | Track based play systems |
US10518188B2 (en) * | 2014-06-30 | 2019-12-31 | Microsoft Technology Licensing, Llc | Controlling physical toys using a physics engine |
US20150375115A1 (en) * | 2014-06-30 | 2015-12-31 | Microsoft Corporation | Interacting with a story through physical pieces |
US10537821B2 (en) | 2014-06-30 | 2020-01-21 | Microsoft Technology Licensing, Llc | Interactive play sets |
US9639159B2 (en) | 2014-07-25 | 2017-05-02 | Rovio Entertainment Ltd | Physical surface interaction |
EP2977856A1 (en) * | 2014-07-25 | 2016-01-27 | Rovio Entertainment Ltd | Physical surface interaction |
DK3200886T3 (en) | 2014-10-02 | 2023-10-16 | Lego As | GAME SYSTEM |
US10369477B2 (en) | 2014-10-08 | 2019-08-06 | Microsoft Technology Licensing, Llc | Management of resources within a virtual world |
US9696757B2 (en) | 2014-10-08 | 2017-07-04 | Microsoft Corporation | Transfer of attributes between generations of characters |
US9919226B2 (en) * | 2014-10-08 | 2018-03-20 | Microsoft Technology Licensing, Llc | Storage and charging device for game pieces |
WO2016172506A1 (en) | 2015-04-23 | 2016-10-27 | Hasbro, Inc. | Context-aware digital play |
US10052552B2 (en) * | 2015-05-28 | 2018-08-21 | Disney Enterprises, Inc. | Systems and methods for association of virtual gaming to physical environments |
US10238958B2 (en) * | 2015-06-03 | 2019-03-26 | Sony Interactive Entertainment America Llc | Tangible tradable collectibles having a digital copy |
US10616310B2 (en) | 2015-06-15 | 2020-04-07 | Dynepic, Inc. | Interactive friend linked cloud-based toy |
US20170014718A1 (en) * | 2015-07-14 | 2017-01-19 | Hothead Games, Inc. | Server daemon based gameplay management |
US10207180B2 (en) * | 2015-12-22 | 2019-02-19 | Intel Corporation | Multi-player game input with detection of context and physical object usage |
US10928915B2 (en) * | 2016-02-10 | 2021-02-23 | Disney Enterprises, Inc. | Distributed storytelling environment |
US9919213B2 (en) | 2016-05-03 | 2018-03-20 | Hothead Games Inc. | Zoom controls for virtual environment user interfaces |
CN105879410A (en) * | 2016-06-03 | 2016-08-24 | 深圳市领芯者科技有限公司 | Interactive biological system and interactive method based on sensing toy |
US10010791B2 (en) | 2016-06-28 | 2018-07-03 | Hothead Games Inc. | Systems and methods for customized camera views and customizable objects in virtualized environments |
US10004991B2 (en) | 2016-06-28 | 2018-06-26 | Hothead Games Inc. | Systems and methods for customized camera views in virtualized environments |
US11918928B2 (en) * | 2019-12-17 | 2024-03-05 | Disney Enterprises, Inc. | Virtual presentation of a playset |
US11154777B2 (en) * | 2020-03-19 | 2021-10-26 | Disney Enterprises, Inc. | Dynamic interactive media for a playset |
CN111672112B (en) * | 2020-06-05 | 2023-03-24 | 腾讯科技(深圳)有限公司 | Virtual environment display method, device, equipment and storage medium |
US11517812B2 (en) | 2021-02-19 | 2022-12-06 | Blok Party, Inc. | Application of RFID gamepieces for a gaming console |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7081033B1 (en) * | 2000-03-07 | 2006-07-25 | Hasbro, Inc. | Toy figure for use with multiple, different game systems |
JP2002018146A (en) * | 2000-07-04 | 2002-01-22 | Tomy Co Ltd | Interactive toy, reaction behavior generator and reaction behavior pattern generation method |
KR100459392B1 (en) * | 2000-08-21 | 2004-12-03 | 엘지전자 주식회사 | Toy performance apparatus and method using game |
FR2908324B1 (en) * | 2006-11-09 | 2009-01-16 | Parrot Sa | DISPLAY ADJUSTMENT METHOD FOR VIDEO GAMING SYSTEM |
US20090029771A1 (en) * | 2007-07-25 | 2009-01-29 | Mega Brands International, S.A.R.L. | Interactive story builder |
US8939840B2 (en) * | 2009-07-29 | 2015-01-27 | Disney Enterprises, Inc. | System and method for playsets using tracked objects and corresponding virtual worlds |
- 2009-07-29 US US12/462,140 patent/US8939840B2/en active Active
- 2010-06-18 EP EP10006315.5A patent/EP2322258B1/en active Active
- 2013-04-10 US US13/860,434 patent/US9339729B2/en active Active
- 2016-04-11 US US15/096,090 patent/US9539506B2/en active Active
Patent Citations (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4802879A (en) | 1986-05-05 | 1989-02-07 | Tiger Electronics, Inc. | Action figure toy with graphics display |
US5158492A (en) | 1991-04-15 | 1992-10-27 | Elliott A. Rudell | Light activated doll |
US5853327A (en) * | 1994-07-28 | 1998-12-29 | Super Dimension, Inc. | Computerized game board |
US5752880A (en) | 1995-11-20 | 1998-05-19 | Creator Ltd. | Interactive doll |
US20020049833A1 (en) | 1996-02-27 | 2002-04-25 | Dan Kikinis | Tailoring data and transmission protocol for efficient interactive data transactions over wide-area networks |
US5746602A (en) * | 1996-02-27 | 1998-05-05 | Kikinis; Dan | PC peripheral interactive doll |
US6460851B1 (en) * | 1996-05-10 | 2002-10-08 | Dennis H. Lee | Computer interface apparatus for linking games to personal computers |
US5912454A (en) * | 1997-06-30 | 1999-06-15 | Microsoft Corporation | System and method for detecting a relative change in light intensity |
US6529802B1 (en) | 1998-06-23 | 2003-03-04 | Sony Corporation | Robot and information processing system |
WO2000009229A1 (en) | 1998-08-13 | 2000-02-24 | Tiger Electronics, Ltd. | Action figure toy with communication device |
US6934604B1 (en) | 1998-09-10 | 2005-08-23 | Sony Corporation | Robot apparatus, method of controlling robot apparatus, method of display, and medium |
US7076331B1 (en) | 1998-11-30 | 2006-07-11 | Sony Corporation | Robot, method of robot control, and program recording medium |
US6442450B1 (en) | 1999-01-20 | 2002-08-27 | Sony Corporation | Robot device and motion control method |
WO2000044460A2 (en) | 1999-01-28 | 2000-08-03 | Pleschutznig Lanette O | Interactively programmable toy |
US6554679B1 (en) * | 1999-01-29 | 2003-04-29 | Playmates Toys, Inc. | Interactive virtual character doll |
US6560511B1 (en) | 1999-04-30 | 2003-05-06 | Sony Corporation | Electronic pet system, network system, robot, and storage medium |
US7089083B2 (en) | 1999-04-30 | 2006-08-08 | Sony Corporation | Electronic pet system, network system, robot, and storage medium |
US6519506B2 (en) | 1999-05-10 | 2003-02-11 | Sony Corporation | Robot and control method for controlling the robot's emotions |
US6760646B2 (en) | 1999-05-10 | 2004-07-06 | Sony Corporation | Robot and control method for controlling the robot's motions |
US20050111823A1 (en) | 1999-06-03 | 2005-05-26 | Opentv, Corp. | Networking smart toys |
US6227931B1 (en) * | 1999-07-02 | 2001-05-08 | Judith Ann Shackelford | Electronic interactive play environment for toy characters |
US6290565B1 (en) | 1999-07-21 | 2001-09-18 | Nearlife, Inc. | Interactive game apparatus with game play controlled by user-modifiable toy |
US20010045978A1 (en) | 2000-04-12 | 2001-11-29 | Mcconnell Daniel L. | Portable personal wireless interactive video device and method of using the same |
US6708081B2 (en) | 2000-07-27 | 2004-03-16 | Yamaha Hatsudoki Kabushiki Kaisha | Electronic equipment with an autonomous function |
WO2002047013A2 (en) | 2000-11-14 | 2002-06-13 | 4Kids Entertainement Licensing, Inc. | Object recognition toys and games |
US6963554B1 (en) | 2000-12-27 | 2005-11-08 | National Semiconductor Corporation | Microwire dynamic sequencer pipeline stall |
US20020094851A1 (en) | 2001-01-16 | 2002-07-18 | Rheey Jin Sung | Method of breeding robot pet using on-line and off-line systems simultaneously |
US20020098879A1 (en) | 2001-01-19 | 2002-07-25 | Rheey Jin Sung | Intelligent pet robot |
US20030080987A1 (en) | 2001-10-30 | 2003-05-01 | Rosenberg Louis B. | Methods and apparatus for providing haptic feedback in interacting with virtual pets |
US20040214642A1 (en) * | 2001-11-14 | 2004-10-28 | 4Kids Entertainment Licensing, Inc. | Object recognition toys and games |
US6859682B2 (en) | 2002-03-28 | 2005-02-22 | Fuji Photo Film Co., Ltd. | Pet robot charging system |
US6967566B2 (en) | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
US20040096810A1 (en) * | 2002-11-20 | 2004-05-20 | Wells Harold Walter | Interactive playset |
US20040203317A1 (en) | 2003-04-08 | 2004-10-14 | David Small | Wireless interactive doll-houses and playsets therefor |
US7259778B2 (en) | 2003-07-01 | 2007-08-21 | L-3 Communications Corporation | Method and apparatus for placing sensors using 3D models |
US20050059483A1 (en) | 2003-07-02 | 2005-03-17 | Borge Michael D. | Interactive action figures for gaming schemes |
US7037166B2 (en) * | 2003-10-17 | 2006-05-02 | Big Bang Ideas, Inc. | Adventure figure system and method |
US20050167919A1 (en) | 2003-11-14 | 2005-08-04 | Grant Alan H. | Interactive game with action figure identification |
US20080160877A1 (en) | 2005-04-26 | 2008-07-03 | Steven Lipman | Toys |
US7912500B2 (en) * | 2005-05-03 | 2011-03-22 | Siemens Aktiengesellschaft | Mobile communication device, in particular in the form of a mobile telephone |
US20060273909A1 (en) * | 2005-06-01 | 2006-12-07 | Morad Heiman | RFID-based toy and system |
US7403202B1 (en) | 2005-07-12 | 2008-07-22 | Electronic Arts, Inc. | Computer animation of simulated characters using combinations of motion-capture data and external force modelling or other physics models |
US20070093173A1 (en) * | 2005-10-21 | 2007-04-26 | Yu Zheng | Interactive toy system |
US8469766B2 (en) * | 2005-10-21 | 2013-06-25 | Patent Category Corp. | Interactive toy system |
US8157611B2 (en) * | 2005-10-21 | 2012-04-17 | Patent Category Corp. | Interactive toy system |
US20070128979A1 (en) * | 2005-12-07 | 2007-06-07 | J. Shackelford Associates Llc. | Interactive Hi-Tech doll |
US20080014830A1 (en) | 2006-03-24 | 2008-01-17 | Vladimir Sosnovskiy | Doll system with resonant recognition |
WO2008023371A2 (en) | 2006-08-24 | 2008-02-28 | Nisim Massiah | A doll capable of simulating eating and physical exercise |
US7663488B2 (en) | 2007-06-25 | 2010-02-16 | Disney Enterprises, Inc. | System and method of virtually packaging multimedia |
US20090137323A1 (en) * | 2007-09-14 | 2009-05-28 | John D. Fiegener | Toy with memory and USB Ports |
US8545335B2 (en) * | 2007-09-14 | 2013-10-01 | Tool, Inc. | Toy with memory and USB ports |
US20090176432A1 (en) * | 2007-10-03 | 2009-07-09 | Mark Hardin | Electronic banking toy |
US20090300525A1 (en) * | 2008-05-27 | 2009-12-03 | Jolliff Maria Elena Romera | Method and system for automatically updating avatar to indicate user's status |
Non-Patent Citations (7)
Title |
---|
Bandai America Announces the Next Generation Tamagotchi: Tamagotchi Connection. < http://www.bandai.com/news/news.cfm?wn—id=71> Retrieved Sep. 30, 2009. |
Cassell, J., et al. Shared Reality: Physical Collaboration with a Virtual Peer, Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 259-260. Apr. 4-9, Amsterdam, NL. (2000). |
Hinske, S., et al. Interactive Educational Play with Augmented Toy Environments, ERCIM News, Special on Technology-enhanced learning, Issue 71, Oct. 2007. |
Myers, Jack. Neopets.com Fulfills Promise of Immersive Advertising (PDF). Jack Myers Report. Jack Myers, LLC. <http://www.mediavillage.com/pdf/2004/03-18-04.pdf> (Mar. 18, 2004). Retrieved Sep. 30, 2009. |
Roger K. C., et al. Metazoa Ludens: Mixed Reality Environment for Playing Computer Games with Pets, The International Journal of Virtual Reality, 2006, 5(3):53-58. |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130231193A1 (en) * | 2009-07-29 | 2013-09-05 | Disney Enterprises, Inc. | System and Method for Playsets Using Tracked Objects and Corresponding Virtual Worlds |
US9339729B2 (en) * | 2009-07-29 | 2016-05-17 | Disney Enterprises, Inc. | System and method for playsets using tracked objects and corresponding virtual worlds |
US10019845B2 (en) | 2012-08-30 | 2018-07-10 | Atheer, Inc. | Method and apparatus for content association and history tracking in virtual and augmented reality |
US20140067768A1 (en) * | 2012-08-30 | 2014-03-06 | Atheer, Inc. | Method and apparatus for content association and history tracking in virtual and augmented reality |
US11763530B2 (en) | 2012-08-30 | 2023-09-19 | West Texas Technology Partners, Llc | Content association and history tracking in virtual and augmented realities |
US11120627B2 (en) | 2012-08-30 | 2021-09-14 | Atheer, Inc. | Content association and history tracking in virtual and augmented realities |
US9589000B2 (en) * | 2012-08-30 | 2017-03-07 | Atheer, Inc. | Method and apparatus for content association and history tracking in virtual and augmented reality |
US10725607B2 (en) | 2012-12-21 | 2020-07-28 | Intellifect Incorporated | Enhanced system and method for providing a virtual space |
US10061468B2 (en) | 2012-12-21 | 2018-08-28 | Intellifect Incorporated | Enhanced system and method for providing a virtual space |
US9836806B1 (en) | 2013-06-07 | 2017-12-05 | Intellifect Incorporated | System and method for presenting user progress on physical figures |
US10176544B2 (en) | 2013-06-07 | 2019-01-08 | Intellifect Incorporated | System and method for presenting user progress on physical figures |
US10743732B2 (en) | 2013-06-07 | 2020-08-18 | Intellifect Incorporated | System and method for presenting user progress on physical figures |
US10229608B2 (en) * | 2014-08-19 | 2019-03-12 | Intellifect Incorporated | Wireless communication between physical figures to evidence real-world activity and facilitate development in real and virtual spaces |
US9157617B1 (en) | 2014-10-22 | 2015-10-13 | Codeshelf | Modular hanging lasers to provide easy installation in a distribution center |
US9327397B1 (en) | 2015-04-09 | 2016-05-03 | Codeshelf | Telepresence based inventory pick and place operations through robotic arms affixed to each row of a shelf |
US9262741B1 (en) | 2015-04-28 | 2016-02-16 | Codeshelf | Continuous barcode tape based inventory location tracking |
US11285614B2 (en) * | 2016-07-20 | 2022-03-29 | Groove X, Inc. | Autonomously acting robot that understands physical contact |
US11113888B2 (en) * | 2016-12-26 | 2021-09-07 | Interdigital Ce Patent Holdings, Sas | Device and method for generating dynamic virtual contents in mixed reality |
US11580706B2 (en) | 2016-12-26 | 2023-02-14 | Interdigital Ce Patent Holdings, Sas | Device and method for generating dynamic virtual contents in mixed reality |
Also Published As
Publication number | Publication date |
---|---|
US20130231193A1 (en) | 2013-09-05 |
US9339729B2 (en) | 2016-05-17 |
US9539506B2 (en) | 2017-01-10 |
EP2322258A3 (en) | 2015-07-29 |
US20110028219A1 (en) | 2011-02-03 |
EP2322258B1 (en) | 2019-05-29 |
EP2322258A2 (en) | 2011-05-18 |
US20160220896A1 (en) | 2016-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9539506B2 (en) | System and method for playsets using tracked objects and corresponding virtual worlds | |
US10089772B2 (en) | Context-aware digital play | |
EP2744579B1 (en) | Connected multi functional system and method of use | |
Fernaeus et al. | How do you play with a robotic toy animal? A long-term study of Pleo | |
Hjorth et al. | Ambient play | |
Druin et al. | Robots for kids: exploring new technologies for learning | |
US11348298B2 (en) | Connected avatar technology | |
Bloch et al. | Disposable love: The rise and fall of a virtual pet | |
CN107000210A (en) | Apparatus and method for providing lasting partner device | |
Guo et al. | Design-in-play: improving the variability of indoor pervasive games | |
CN102656542A (en) | Camera navigation for presentations | |
Wasko | Children’s virtual worlds: The latest commercialization of children’s culture | |
Bylieva et al. | Virtual pet: trends of development | |
Tyni et al. | Hybrid Playful Experiences: Playing Between Material and Digital-Hybridex Project, Final Report | |
Katsuno et al. | Haptic creatures: tactile affect and human-robot intimacy in Japan | |
Sicart | Playing Software: Homo Ludens in Computational Culture | |
Wang et al. | Internet of toys: an e-Pet overview and proposed innovative social toy service platform | |
Jacobsson | Tinkering with interactive materials-Studies, concepts and prototypes | |
Dorin | Building artificial life for play | |
Ye | Sustainable HCI: Fashion Minimalism and Human-Robot Companionship | |
Schmitz | Tangible interaction with anthropomorphic smart objects in instrumented environments | |
Bonarini | Playful Robots | |
Lundgren | Facets of fun: On the design of computer augmented entertainment artifacts | |
Heljakka | Toy, stories, desing, play, display, adult experience with designer toys | |
Bobrow | AutomaTiles: tangible cellular automata for playful engagement with systems thinking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DISNEY ENTERPRISES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEATHERLY, CHRISTOPHER W.;SHIN, CHRISTOPHER K.;BERLING, NATASHA;AND OTHERS;REEL/FRAME:023144/0435 Effective date: 20090722 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |