WO2011124916A1 - Interacting toys - Google Patents

Interacting toys

Info

Publication number
WO2011124916A1
WO2011124916A1 (application PCT/GB2011/050684)
Authority
WO
WIPO (PCT)
Prior art keywords
toy
data
interaction
avatar
interactions
Prior art date
Application number
PCT/GB2011/050684
Other languages
French (fr)
Other versions
WO2011124916A4 (en)
Inventor
Steven Lipman
Original Assignee
Librae Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Librae Limited filed Critical Librae Limited
Priority to EP11716627A priority Critical patent/EP2555840A1/en
Priority to US13/639,411 priority patent/US20130244539A1/en
Priority to JP2013503174A priority patent/JP5945266B2/en
Priority to CN2011800279300A priority patent/CN103201000A/en
Publication of WO2011124916A1 publication Critical patent/WO2011124916A1/en
Publication of WO2011124916A4 publication Critical patent/WO2011124916A4/en

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H33/00 - Other toys
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 - Dolls
    • A63H3/28 - Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 - Computerized interactive toys, e.g. dolls

Definitions

  • Limited Edition labels are provided in the doll clothes.
  • the Limited Edition labels are provided with a code that can be entered into the website to accumulate points, or to receive special gifts, etc.
  • the website is provided with the functionality to rank users, with the results being updated continually, but with weekly, monthly, and annual competitions.
  • the competitions can be regional, national, and international. This enables an "X Factor"-style competition amongst home-country users and internationally on a global scale, and allows prizes to be awarded, and bronze, silver, and gold loyalty cards/awards to be provided to the users.
  • the users can accumulate points in dependence on a number of other factors, such as:
  • the type of theme that is downloaded into the doll: for example, downloading a more socially acceptable theme, i.e. one that is health or sport related, provides more points than a less socially acceptable theme
  • the user can be aided by his/her online friends in order to accumulate points more quickly.
  • the users can create groups (of their friends) and compete with other such groups in a group ranking competition, similar to the individual user competition described above.
  • Figure 3 shows a schematic diagram of a toy doll.
  • the doll 200 comprises similar components to the prior art doll 100, but further comprises an interaction tracking engine 202 that includes additional memory 204.
  • the interaction tracking engine is connected to the processor 102.
  • doll 200 is adapted to download themes via a PC from a website, and then converse in that theme with other such dolls.
  • the conversations are constructed in a similar way to that described in International Patent Publication No. WO2009/010760, which is hereby incorporated by reference; see in particular Page 12 line 28 to Page 18 line 2.
  • the interaction tracking engine is utilised to track the interactions and store in memory 204 the statistics relating to those interactions.
  • the statistics that are stored include, but are not limited to, any, some, or all of the measures of the interactions, such as interaction counts, phrase usage, times of day, and total interaction time.
  • an interaction with a PC avatar doll is counted as if it were an interaction with a physical doll.
  • the interaction tracking engine monitors the interactions between dolls utilising doll identifiers, specific to each type of doll (for example, all type A dolls have the same identifier), or alternatively the identifiers are specific to each individual doll.
  • the interaction tracking engine is adapted to create a database, stored in memory 204, that lists the interactions between the doll and other dolls using the doll identifiers. Further, since each phrase, or word, has an identifier, any of the phrases, or words, can be tracked in the database in the same way as described above.
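The tracking database described above can be modelled with a short sketch. This is a minimal illustrative model only; the class and method names (`InteractionTracker`, `record`) and the example doll and phrase identifiers are invented for the example, not taken from the patent.

```python
from collections import defaultdict

class InteractionTracker:
    """Illustrative per-doll store of interaction statistics,
    keyed by doll identifier and by phrase/word identifier."""

    def __init__(self):
        self.partner_counts = defaultdict(int)  # interactions per other doll ID
        self.phrase_counts = defaultdict(int)   # uses of each phrase/word ID
        self.total_seconds = 0                  # total time spent interacting

    def record(self, partner_id, phrase_ids, duration_seconds):
        # One completed interaction with another doll (an interaction with
        # a PC avatar doll would be counted in the same way).
        self.partner_counts[partner_id] += 1
        for pid in phrase_ids:
            self.phrase_counts[pid] += 1
        self.total_seconds += duration_seconds

    def total_interactions(self):
        return sum(self.partner_counts.values())

# Example: two interactions with two different dolls.
tracker = InteractionTracker()
tracker.record("doll_B", ["p12", "p77"], duration_seconds=90)
tracker.record("doll_C", ["p12"], duration_seconds=30)
```

The statistics listed earlier (totals per partner, per phrase, and overall interaction time) then fall out of simple lookups on the two dictionaries.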
  • the "Golden Phrase" is tracked to determine if/when the phrase is used by the doll.
  • By connecting the doll to a PC, logging into the website, and then downloading the latest theme, the user is provided with the "Golden Phrase" within the theme.
  • the "Golden Phrase" is announced, and when the user hears the "Golden Phrase" during an interaction between a group of dolls, the event of the doll using the "Golden Phrase" is tracked and stored in the interaction tracking database.
  • the website verifies that the "Golden Phrase" has been used and awards points, or a prize, to the user.
  • an interaction is constructed by allocating weightings to each possible response at any point in the interaction.
  • the "Golden Phrase" is generally provided with a low weighting (i.e. the probability that the "Golden Phrase" is used during an interaction is relatively low), and thus the user may be required to initiate interaction in any one theme a number of times before the "Golden Phrase" is used; this ensures that the life span of any one theme is increased, and the users gain reward points in the process.
  • points can be awarded for reaching certain targets of any of the statistics; for example, once the user's doll has had 5, 10, 20, etc. interactions in one particular time period (e.g. a week, a month, or a year) a number of points will be awarded. Any other statistic can be used as the basis for awarding points, for example the length of the interactions (i.e. points for each 10 minutes of interaction in a week).
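The weighting mechanism described above can be illustrated as follows; the use of `random.choices` and the specific phrase names and weight values are assumptions made for the example, not details from the patent.

```python
import random

def choose_response(weighted_phrases, rng=random):
    """Pick the next phrase in an interaction; low-weight entries such as
    a "Golden Phrase" are therefore spoken only rarely."""
    phrases = list(weighted_phrases)
    weights = [weighted_phrases[p] for p in phrases]
    return rng.choices(phrases, weights=weights, k=1)[0]

# Invented example theme: the Golden Phrase has a deliberately low weight,
# so many interactions may be needed before it is used.
theme = {"lets_go_shopping": 10.0, "how_are_you": 8.0, "golden_phrase": 0.2}
next_phrase = choose_response(theme)
```

Because the draw is repeated at every turn of a conversation, a low weight stretches the expected number of play sessions before the Golden Phrase appears, which is exactly the theme-lifespan effect described above.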
  • Figure 4 shows a schematic diagram of the back-end server 400 that communicates with the doll 200 via the user's PC 402 and the website 404; the back-end server facilitates the operation of the website and the awarding and storing of points as described above.
  • the Doll's unique ID is verified by checking the Doll ID memory 406 located in the server.
  • the user's ID is verified by checking the User ID memory 408. All of the data transmitted from the Doll to the server, or from the server to the doll 200/user 410, is passed through the data interface 412.
  • the server processor 414 processes the data received via the data interface, and determines the number of reward points that are to be assigned to the user 410 based on the interaction data downloaded from the doll using a rewards engine 416.
  • the doll is adapted to determine the number of reward points that are due by pre-processing the data accumulated by the interaction tracking engine before transmitting the data to the server. In this way, the amount of data transferred between the doll and the server can be reduced.
  • rewards can be provided to the user once certain goals have been achieved.
  • the reward engine 416 accesses the data relating to the number of points the user 410 has stored in the memory 418 within the points accumulator 420. Once a goal number of points has been achieved a reward is provided to the user.
  • the reward engine can provide listings of the users with the most points, thus enabling the weekly, monthly, and yearly rewards to be given to the appropriate users.
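The target and ranking logic of the reward engine might look like the sketch below; the threshold values and points-per-target are invented for illustration.

```python
def points_for_targets(interaction_count, thresholds=(5, 10, 20), points_each=100):
    """Award a fixed number of points for each interaction target reached
    within the period (e.g. a week, a month, or a year)."""
    return points_each * sum(1 for t in thresholds if interaction_count >= t)

def rank_users(points_by_user):
    """Ranked list of users, most points first, for the periodic rewards."""
    return sorted(points_by_user, key=points_by_user.get, reverse=True)
```

A doll that reported 12 interactions in the period would clear the 5- and 10-interaction targets but not the 20-interaction one, earning 200 points under these example values.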
  • the reward engine is adapted to provide the doll with the "Golden Phrase"; details of the "Golden Phrase" are described above.
  • the creator of the server-end data, i.e. a webmaster or the like, inputs the current "Golden Phrase" by referencing a word/phrase using the word/phrase ID number already used within the themes. Alternatively, a special word/phrase can be incorporated into the theme during its generation for use as the "Golden Phrase".
  • Figure 5 shows the format of the data 500 transmitted from the doll to the server. As can be seen, the data comprises the information described above that is tracked by the interaction tracking engine.
  • a camera 600, such as a webcam, connected to the user's PC 602, is used to obtain a real-time video stream of a play area.
  • the augmented reality is initiated when a user introduces an object 604, with an imprinted code 606, into the play area 609.
  • the functionality may be provided through the website 404, or via a stand-alone application on the user's PC 602.
  • the online website/application provides additional functionality to the user.
  • the website/application recognises the code using the code recognition engine 610 (the code may or may not be visible to the user) and then interprets that code into an online image or text using the avatar generation engine; for example, the code could represent the doll that the code is imprinted on, and then an avatar 608 representing that doll is generated and shown on the user's PC screen 612.
  • the avatar and the real doll can then communicate, via the USB communications dongle described above, as if the avatar were a real doll.
  • the real doll and the virtual doll in the form of an avatar can both appear on the user's PC screen.
  • the code can be inputted into the PC manually.
  • the code recognition engine interprets the code, and the avatar generation engine generates an avatar 608.
  • the avatar is shown on the PC screen, but the physical doll and the avatar can still interact via the communications dongle.
  • the user is required to log on to the website before the augmented reality can be generated, and so the user's details are known, i.e. what dolls they own; this is accomplished using both the user and the doll IDs. Therefore, when the interaction is generated, it is known what theme the physical doll currently has within memory. Thus the correct theme can be used by the avatar. This can also be accomplished by identifying the doll, connected wirelessly to the PC via the communications dongle, using the doll's unique identifier, and so a user's friend's doll can also interact with an avatar at the same time as the user's doll.
  • the means for providing augmented reality can recognise a plurality of objects, and produce a plurality of avatars on the user's PC screen. These avatars then communicate with each other, and/or a physical doll.
  • the code 606 is used to provide a reference to a special theme that is then automatically downloaded to the doll, and then the doll and the avatar communicate within that theme.
  • object recognition software is provided that enables the website/application to recognise certain objects, such as items of specific clothing, toy sports equipment, etc.
  • the course of the interaction may be altered, for example, the weightings of the next phrases can be altered such that it is more likely that the dolls will communicate about the object that has just been introduced. For example, if a toy jumper is introduced, the weighting for communicating about clothes shopping is increased by an order of magnitude. In this way, the user can influence the interaction between the dolls.
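The jumper example above can be sketched as a simple weight adjustment; the topic-prefix naming convention for phrases is an assumption made for the example, while the factor of 10 follows the "order of magnitude" in the text.

```python
def boost_topic(weights, topic, factor=10.0):
    """Return a new weight table in which phrases tagged with the given
    topic are boosted (by an order of magnitude, by default), making the
    dolls far more likely to talk about the object just introduced."""
    return {phrase: w * (factor if phrase.startswith(topic + ".") else 1.0)
            for phrase, w in weights.items()}

# Invented phrase weights: a toy jumper has been recognised, so the
# shopping-related phrases are boosted.
weights = {"shopping.clothes": 1.0, "weather.rain": 1.0}
boosted = boost_topic(weights, "shopping")
```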
  • the augmented reality system can recognise a code for, for example, a pregnant doll, and will then spawn an avatar in the form of a baby.
  • the code can represent any virtual object.
  • the code could represent a new pair of shoes that are automatically shown on the real doll's feet on the user's PC screen. This is put into effect using the object recognition software described above, since the doll can be recognised, and hence the image of the shoes can be placed correctly on the screen to make it appear as though the shoes are actually on the doll.
  • when the user introduces the object, a card or the like, into the play area, points are given to the user's account. Depending on the type of object, a different number of points is given.
  • each doll has a unique identifier (tag or number), and every unit is identifiable as a unit.
  • the user logs on to the website, connects the doll to the PC (the doll being recognised using the unique identifier), and the user can then input personal details about the doll, i.e. name, favourite colour, favourite pets, favourite music, etc.
  • the user is then able to acquire points on his/her account, associated with the doll, by way of:

Abstract

The present invention relates to toys that are enabled to interact with other such toys, and in particular a toy comprising: a processor (102) for generating interactions between such toy (200) and at least one other toy capable of interacting with such toy; an interaction tracking engine (202) for generating data related to the interactions; and a memory (106) connected to said interaction tracking engine for storing said data. The invention further relates to a server comprising: means for communicating with a plurality of toys, means for receiving data related to each said toy; means for processing said data; and means for allocating points to each said toy in dependence on said processed data.

Description

Interacting Toys
This invention relates to toys. In particular, although not exclusively, this invention relates to toys such as dolls that interact with each other.
Embedded computers and micro-processors have improved toys for children. They have been used most extensively in educational toys, but have also been used in interactive toys. ActiMates® Barney® is one example of an interactive toy which responds to interaction from a child with appropriate vocalisations, and can sing along with videos.
Interaction tracking
According to one aspect of the present invention there is provided a toy comprising: a processor for generating interactions between such toy and at least one other toy capable of interacting with such toy; an interaction tracking engine for generating data related to the interactions; and a memory connected to said interaction tracking engine for storing said data.
Preferably, the toy further comprises means for outputting said data.
Preferably, the memory is adapted to store said data relating to a plurality of interactions.
Preferably, the type of said data is predetermined.
Preferably, the data includes a measure of the interactions. More preferably, the measure includes a count related to the interactions and/or the measure is a temporal measure. Preferably, the measure includes at least one of the following: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions. Preferably, the data includes whether a predetermined interaction, such as a specific phrase and/or word, has been used during an interaction.
Preferably, the interaction is an audible interaction (for example speech), and/or a physical interaction.
Preferably, the toy further comprises means for analysing said data, wherein said analysing means determines when a predetermined target value, associated with said data, has been reached. More preferably, the predetermined target value is a count related to said data and/or the predetermined target value is a duration.
Preferably, the predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
Preferably, the toy further comprises means for outputting said analysis. More preferably, the analysis outputting means incorporates a unique identifier, associated with said toy, with said analysis.
Preferably, the data outputting means incorporates a unique identifier, associated with said toy, with said data. Preferably, the toy is a computer, and the form of said toy is represented by an avatar on said computer's screen.
Preferably, the toy is an individual object. According to a further aspect of the present invention there is provided a server comprising: means for communicating with a plurality of toys, means for receiving data related to each said toy; means for processing said data; and means for allocating points to each said toy in dependence on said processed data.
Preferably, the points are stored in memory associated with each respective toy. Preferably, the server further comprises means for comparing the points associated with each toy. More preferably, the comparison means is adapted to generate a ranked list of toys according to the number of points associated with each respective toy. Yet more preferably, the processing means determines when a predetermined target value, associated with said data, has been reached.
Preferably, the predetermined target value is a count related to said data and/or is a duration.
Preferably, the toy is a toy substantially as herein described.
Preferably, the predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
Augmented reality with intercommunication
According to a yet further aspect of the present invention there is provided an augmented reality system, including: a processor; means for receiving a code; an avatar generation engine adapted to generate an avatar in dependence on said code; and means for outputting data representing an image comprising the avatar generated by the avatar generation engine.
Preferably, the code receiving means is adapted to receive a code via a manual input.
Preferably, the code receiving means is adapted to receive a code via a camera.
Preferably, the code is on a toy, and the toy may be a doll or a card.
Preferably, the system further comprises means for communicating with a physical toy. Preferably, the communication means includes a wireless adapter. Preferably, the system further comprises means for identifying said toy, and a theme stored within said toy, wherein said avatar and said toy then communicate within said theme. Preferably, the interaction includes, speech and actions. Preferably, the system further comprises: means for receiving image data, said image data representing an image of a physical toy in a physical environment, wherein said outputting means is adapted to output data representing said image comprising the generated avatar together with the image of the physical toy in the physical environment; means for outputting data representing an image comprising the generated avatar together with the image of the physical toy in the physical environment; and means for receiving activity data representing an action of the physical toy; wherein the processor is adapted to analyse said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
According to a yet further aspect of the present invention there is provided an augmented reality system, comprising: a processor; an avatar generation engine adapted to generate an avatar; means for outputting data representing an image comprising the generated avatar together with the image of the physical toy in the physical environment; and means for receiving activity data representing an action of the physical toy; wherein the processor is adapted to analyse said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
Preferably, the activity data represents speech and/or a physical action.
Apparatus and method features may be interchanged as appropriate, and may be provided independently one of another. Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa.
Embodiments of this invention will now be described, by way of example only, with reference to the accompanying drawings, of which:
Figure 1 (Prior art) is a schematic illustration of a doll;
Figure 2 (Prior art) is a schematic illustration of a wireless communications dongle;
Figure 3 is a schematic illustration of a doll with interactions tracking;
Figure 4 is a diagram of an augmented reality device;
The basic features and operation of such interacting toys are known in the art, for example in International Patent Publication No. WO2009/010760; however a brief description is provided below to aid in the understanding of the present invention.
The following description relates to a known toy, such as a toy doll, that is enabled to communicate with other such toys; the dolls are adapted to coordinate the speech between the dolls. Figure 1 shows a schematic representation of the known doll, with the hardware components required to allow the doll to communicate, and perform other such tasks. The doll 100, as shown in Figure 1, comprises a processor 102 that includes a wireless module 104. The processor is in communication with memory 106, ROM 108, and RAM 110. An IR/RF transmitter/receiver is connected to the processor/wireless module and is enabled to transmit/receive signals to/from other such dolls. The doll is also connected to a loudspeaker 114. A USB controller 116 is used to update the memory 106, and also to charge, via the charger circuitry 118, the battery 120. The memory 106 stores information relating to conversations that the dolls can have, and is accessed by the processor when it is compiling speech. The ROM 108 is used to store permanent information relating to the doll, such as the doll's name and ID number. This information is used in the initialisation procedure when setting up a network of dolls. The RAM 110 stores information relating to the current conversation and is used in order to produce more realistic conversation, by storing information relating to the phrases already used, for example.
Each doll 100 contains in memory 106: a data set containing the doll's name and other variables defined during a conversation; a set of instructions which produces the conversation; and a set of audio data. The variables defined during the conversation are only stored in the controller doll. The dolls are adapted to download themes via a PC from a website, and then converse in a given theme with other such dolls.
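Purely by way of illustration, the memory contents just described might be modelled as follows. This is an editor's sketch, and every name in it (DollMemory, load_theme, the field names) is an assumption rather than anything disclosed in the publication:

```python
# Illustrative sketch of the per-doll memory contents described above:
# a data set of conversation variables, a set of instructions (the
# conversation script), and a set of audio data. All names are invented.
from dataclasses import dataclass, field

@dataclass
class DollMemory:
    name: str                                      # the doll's name
    variables: dict = field(default_factory=dict)  # defined during a conversation
    script: list = field(default_factory=list)     # instructions producing the conversation
    audio: dict = field(default_factory=dict)      # phrase ID -> audio clip bytes

    def load_theme(self, script, audio):
        """Replace the current theme, as when downloading via a PC."""
        self.script = script
        self.audio = audio

memory = DollMemory(name="Doll A")
memory.load_theme(script=["greet", "ask_name"], audio={"greet": b"\x00\x01"})
```

In this sketch, downloading a new theme simply replaces the script and audio while the doll's name and accumulated variables persist.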
International Patent Publication No. WO2009/010760 also describes a USB communications dongle that enables a PC to interact wirelessly with a toy. Figure 2 shows a schematic representation of the USB communications dongle 1600, attached to a PC 304 and in wireless communication with the dolls 100. The dongle contains a wireless module 104, an IR/RF transmitter/receiver 212, and an interface 1602. These components, except the interface 1602, are the same as those contained within the doll 100, as described above. However, the PC 304 is utilised as the processor 1604, instead of the dongle having an independent processor as the doll 100 has, and so the PC effectively becomes a virtual doll able to communicate with the physical dolls 100. The virtual doll is provided with an animated avatar shown on the PC monitor, which may be similar in appearance to the real doll, and the animation of the avatar is synchronised with the speech of the doll. In order to run the conversations, the PC has stored in memory 1606 an emulator for emulating the processor of the toy.
The website is arranged to allow the user to download various themes, and also to interact with other users. This enables the users to interact both in the virtual world - via chat rooms, games, competitions, or the like - and in the physical world - by playing with other users with the communicating dolls.
In preferred embodiments of the present invention, a website, described in further detail below, is provided that enables a user to log in, and register his/her details, such as the type and number of toy dolls he/she has, the name of his/her toy doll(s), etc. The website provides features such as: friendship groups via social network, live chat, downloadable stories (related to the downloadable themes), and communicating characters with real voices. Characters can be named, styled, saved, and posted up to the website so that other users can vote upon them. This allows users to compete with each other for points that are awarded to the users based on the voting, the number of posts to the website, etc.
Also, in the physical world, Limited Edition labels are provided in the dolls' clothes. These labels carry a code that can be entered into the website to accumulate points, or to receive special gifts, etc. The website is provided with the functionality to rank users, with the results updated continually, and with weekly, monthly, and annual competitions. The competitions can be regional, national, and international. This enables an "X Factor" style competition amongst home-country users and internationally on a global scale, and allows prizes to be awarded, and bronze, silver, and gold loyalty cards/awards to be provided to the users.
The users can accumulate points in dependence on a number of other factors, such as:
• creating an account on the website (for example, this may provide bonus points);
• the type of theme that is downloaded into the doll (for example, downloading a more socially acceptable theme, i.e. health or sport related, provides more points than a less socially acceptable theme);
• the statistics of the interactions between the physical dolls (see below for further details);
• doing well in quizzes;
• playing games, including top-trump-style card games, on-line games, etc.;
• finding special cards within physical card packs bought by the user; and
• hearing the "Golden Phrase" of the week (see below for further details).
Further, during the challenges, quizzes, games, etc., the user can be aided by his/her online friends in order to accumulate points more quickly. Users can create groups of friends and compete with other such groups in a group ranking competition, similar to the individual user competition described above.
Interaction Tracking
Figure 3 shows a schematic diagram of a toy doll. The doll 200 comprises similar components to the prior art doll 100, but further comprises an interaction tracking engine 202 that includes additional memory 204. The interaction tracking engine is connected to the processor 102.
As with doll 100, doll 200 is adapted to download themes via a PC from a website, and then converse in that theme with other such dolls. The conversations are constructed in a similar way to that described in International Patent Publication No. WO2009/010760, which is hereby incorporated by reference; see in particular Page 12 line 28 to Page 18 line 2.
During the interactions between the dolls, the interaction tracking engine is utilised to track the interactions and to store in memory 204 the statistics relating to those interactions. The statistics that are stored may include any, some, or all of the following measures:
• the total number of separate interactions between the doll and any other doll, or dolls (for example, the interaction could be between a group of more than two dolls);
• the total number of separate interactions between the doll, and each other specific doll (i.e. the number of separate interactions between Doll A and Doll B, between Doll A and Doll C, etc);
• the total number of incidences of the doll using a particular word or phrase (i.e. the number of times Doll A says "I have a new dress!");
• the total time the doll has participated in interactions;
• the total time the doll has participated in interactions since the last time the doll was connected to a PC and the website;
• whether or not a specific phrase has been used during an interaction; and
• the time of day that each interaction occurred.
In the above, an interaction with a PC avatar doll is counted as if it were an interaction with a physical doll. The interaction tracking engine monitors the interactions between dolls utilising doll identifiers, specific to each type of doll (for example, all type A dolls have the same identifier), or alternatively the identifiers are specific to each individual doll. The interaction tracking engine is adapted to create a database, stored in memory 204, that lists the interactions between the doll and other dolls using the doll identifiers. Further, since each phrase, or word, has an identifier, any of the phrases, or words, can be tracked in the database in the same way as described above.
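The statistics and the per-doll interaction database described above could be accumulated by a small tracking component along the following lines. This is an illustrative sketch only; all identifiers are the editor's assumptions, not part of the publication:

```python
# Sketch of an interaction tracking engine accumulating the measures
# listed above, keyed by doll and phrase identifiers. Invented names.
from collections import Counter

class InteractionTracker:
    def __init__(self):
        self.total_interactions = 0
        self.per_doll = Counter()        # interactions per partner doll ID
        self.phrase_counts = Counter()   # incidences of each phrase ID
        self.total_seconds = 0           # total time spent in interactions
        self.times_of_day = []           # hour of day of each interaction

    def record_interaction(self, partner_ids, duration_s, hour):
        self.total_interactions += 1
        for pid in partner_ids:          # a group may hold more than two dolls
            self.per_doll[pid] += 1
        self.total_seconds += duration_s
        self.times_of_day.append(hour)

    def record_phrase(self, phrase_id):
        self.phrase_counts[phrase_id] += 1

tracker = InteractionTracker()
tracker.record_interaction(["doll_B", "doll_C"], duration_s=300, hour=16)
tracker.record_phrase("i_have_a_new_dress")
```

Because the counters are keyed by identifier, the same structure serves whether identifiers are per-type or per-individual doll.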
Using this phrase/word tracking ability, the "Golden Phrase" is tracked to determine if and when the phrase is used by the doll. By connecting the doll to a PC, logging into the website, and then downloading the latest theme, the user is provided with the "Golden Phrase" within the theme. On the website the "Golden Phrase" is announced, and when the user hears the "Golden Phrase" during an interaction between a group of dolls, the event of the doll using the "Golden Phrase" is tracked and stored in the interaction tracking database. Thus, when the doll next connects to the website, the website verifies that the "Golden Phrase" has been used and awards points, or a prize, to the user.
As is known, an interaction is constructed by allocating weightings to each possible response at any point in the interaction. The "Golden Phrase" is generally provided with a low weighting (i.e. the probability that the "Golden Phrase" is used during an interaction is relatively low), and thus the user may be required to initiate interactions in any one theme a number of times before the "Golden Phrase" is used; this increases the life span of any one theme, and the users gain reward points in the process. Likewise, points can be awarded for reaching certain targets in any of the statistics; for example, once the user's doll has had 5, 10, 20, etc. interactions in one particular time period (e.g. a week, a month, or a year) a number of points will be awarded. Any other statistic can be used as the basis for awarding points, for example the length of the interactions (i.e. points for each 10 minutes of interaction in a week).
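The weighting scheme described here amounts to sampling the next phrase from a weighted distribution in which the "Golden Phrase" carries a deliberately small weight. A minimal sketch, with all weight values invented for illustration:

```python
import random

# Candidate next phrases at some point in an interaction, with their
# weightings; the "Golden Phrase" has a deliberately low weighting so
# it only rarely appears. All phrase names and weights are invented.
responses = {
    "lets_go_shopping": 40,
    "how_is_school": 40,
    "i_love_ponies": 19,
    "golden_phrase": 1,   # low weighting: rare by design
}

def pick_response(weights, rng=random):
    """Sample one phrase ID according to the given weightings."""
    phrases = list(weights)
    return rng.choices(phrases, weights=[weights[p] for p in phrases], k=1)[0]

total = sum(responses.values())
golden_probability = responses["golden_phrase"] / total  # 1 in 100 here
```

With a probability of 1% per selection, the user would on average need many interactions before hearing the "Golden Phrase", which is exactly the theme-lifespan effect described above.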
Figure 4 shows a schematic diagram of the back-end server 400 that communicates with the doll 200 via the user's PC 402 and the website 404; the back-end server facilitates the operation of the website and the awarding and storing of points as described above. The doll's unique ID is verified by checking the Doll ID memory 406 located in the server. Likewise, when the user logs into the website the user's ID is verified by checking the User ID memory 408. All of the data transmitted from the doll to the server, or from the server to the doll 200/user 410, is passed through the data interface 412. The server processor 414 processes the data received via the data interface and determines, using a rewards engine 416, the number of reward points that are to be assigned to the user 410 based on the interaction data downloaded from the doll. Alternatively, the doll is adapted to determine the number of reward points that are due by pre-processing the data accumulated by the interaction tracking engine before transmitting the data to the server. In this way, the amount of data transferred between the doll and the server can be reduced. As discussed above, rewards can be provided to the user once certain goals have been achieved. Hence, the rewards engine 416 accesses the data relating to the number of points the user 410 has, stored in the memory 418 within the points accumulator 420. Once a goal number of points has been achieved a reward is provided to the user. Since the memory 418 stores data for all users, the rewards engine can provide listings of the users with the most points, thus enabling the weekly, monthly, and yearly rewards to be given to the appropriate users. In addition, the rewards engine is adapted to provide the doll with the "Golden Phrase", described above. The creator of the server-end data, i.e. a webmaster or the like, inputs the current "Golden Phrase" by referencing a word/phrase using the word/phrase ID number already used within the themes. Alternatively, a special word/phrase can be incorporated into the theme during its generation for use as the "Golden Phrase".
Figure 5 shows the format of the data 500 transmitted from the doll to the server. As can be seen, the data comprises the information described above that is tracked by the interaction tracking engine.
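The doll-side alternative described above (pre-processing the tracked statistics into a points total before transmission, so that less data crosses the link) might be sketched as follows; the scoring rules and field names are invented for illustration:

```python
# Sketch of doll-side pre-processing: reduce the tracked statistics to
# a points total before sending to the server, so only a small payload
# is transmitted. The scoring rules below are illustrative assumptions.
def points_from_stats(stats):
    points = 0
    points += stats.get("interactions", 0) // 5 * 10   # e.g. 10 points per 5 interactions
    points += stats.get("minutes", 0) // 10            # e.g. 1 point per 10 minutes
    if stats.get("golden_phrase_heard"):
        points += 100                                  # bonus for the "Golden Phrase"
    return points

# Only this small payload, not the raw statistics, need be transmitted.
payload = {"doll_id": "A123", "points": points_from_stats(
    {"interactions": 12, "minutes": 45, "golden_phrase_heard": True})}
```

The server then only has to verify the ID and credit the points, rather than re-deriving them from the full interaction log.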
Augmented reality
As shown in Figure 6, there is also provided means for generating an augmented reality. A camera 600, such as a webcam, connected to the user's PC 602 is used to obtain a real-time video stream of a play area. The augmented reality is initiated when a user introduces an object 604, with an imprinted code 606, into the play area 609. The functionality may be provided through the website 404, or via a stand-alone application on the user's PC 602.
By providing the ability to recognise codes or the like using the camera connected to the user's PC, the online website/application provides additional functionality to the user. The website/application recognises the code using the code recognition engine 610 (the code may or may not be visible to the user) and then interprets that code into an online image or text using the avatar generation engine; for example, the code could represent the doll that it is imprinted on, in which case an avatar 608 representing that doll is generated and shown on the user's PC screen 612. The avatar and the real doll can then communicate, via the USB communications dongle described above, as if the avatar were a real doll. Hence, the real doll and the virtual doll, in the form of an avatar, can both appear on the user's PC screen. Alternatively, in the absence of a camera 600, the code can be inputted into the PC manually. In a similar way to that described above, the code recognition engine then interprets the code and the avatar generation engine generates an avatar 608. In this case only the avatar is shown on the PC screen, but the physical doll and the avatar can still interact via the communications dongle.
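The code recognition and avatar generation steps can be sketched as a lookup from a recognised code to an avatar description. The table contents and function names below are illustrative assumptions, not part of the publication:

```python
# Sketch of the code -> avatar step: a code recognised in the camera
# image (or typed in manually) maps to an avatar description that the
# PC can render alongside the physical doll. All names are invented.
AVATAR_TABLE = {
    "DOLL-A": {"model": "doll_a_mesh", "voice": "doll_a"},
    "BABY-1": {"model": "baby_mesh", "voice": "baby"},
}

def generate_avatar(code):
    """Return an avatar description for a recognised code, or None."""
    return AVATAR_TABLE.get(code)

avatar = generate_avatar("DOLL-A")
```

Whether the code arrives from the code recognition engine or from manual input, the same lookup yields the avatar to display.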
The user is required to log on to the website before the augmented reality can be generated, and so the user's details are known, i.e. what dolls they own; this is accomplished using both the user ID and the doll ID. Therefore, when the interaction is generated it is known what theme the physical doll currently has within memory, and thus the correct theme can be used by the avatar. This can also be accomplished by identifying the doll, connected wirelessly to the PC via the communications dongle, using the doll's unique identifier, and so a user's friend's doll can also interact with an avatar at the same time as the user's doll.
Further, the means for providing augmented reality can recognise a plurality of objects, and produce a plurality of avatars on the user's PC screen. These avatars can then communicate with each other, and/or with a physical doll. Alternatively, the code 606 is used to provide a reference to a special theme that is then automatically downloaded to the doll, after which the doll and the avatar communicate within that theme.
In addition, there is provided object recognition software that enables the website/application to recognise certain objects, such as items of specific clothing, toy sports equipment, etc. By introducing the object, the course of the interaction may be altered, for example, the weightings of the next phrases can be altered such that it is more likely that the dolls will communicate about the object that has just been introduced. For example, if a toy jumper is introduced, the weighting for communicating about clothes shopping is increased by an order of magnitude. In this way, the user can influence the interaction between the dolls.
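The worked example above, in which a toy jumper multiplies the clothes-shopping weighting by an order of magnitude, can be sketched as follows (topic names and the object-to-topic mapping are invented for illustration):

```python
# Sketch of steering the interaction when an object is recognised: the
# weighting of the related topic is boosted by an order of magnitude,
# making the dolls more likely to talk about it. Names are invented.
OBJECT_TOPICS = {"toy_jumper": "clothes_shopping", "toy_racket": "sport"}

def apply_object_boost(weights, recognised_object, factor=10):
    """Return a copy of the weightings with the related topic boosted."""
    topic = OBJECT_TOPICS.get(recognised_object)
    boosted = dict(weights)
    if topic in boosted:
        boosted[topic] *= factor   # an order of magnitude more likely
    return boosted

weights = {"clothes_shopping": 5, "school": 20, "ponies": 20}
weights = apply_object_boost(weights, "toy_jumper")
```

The boosted weightings then feed the same weighted selection used for ordinary phrase choice, so introducing an object influences but does not dictate the conversation.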
In an alternative, the augmented reality system can recognise a code for, for example, a pregnant doll, and will then spawn an avatar in the form of a baby. This is expanded such that the code can represent any virtual object. For example, the code could represent a new pair of shoes that are automatically shown on the real doll's feet on the user's PC screen. This is put into effect using the object recognition software described above: since the doll can be recognised, the image of the shoes can be placed correctly on the screen to make it appear as though the shoes are actually on the doll. When the user introduces the object, a card or the like, into the play area, points are given to the user's account; depending on the type of object, a different number of points is given. The cards or the like can be obtained from the website itself, or from retail shops. In summary, each doll has a unique identifier (tag or number), and every unit is identifiable as a unit. The user logs on to the website, connects the doll to the PC (the doll being recognised using the unique identifier), and can then input personal details about the doll, i.e. name, favourite colour, favourite pets, favourite music, etc. The user is then able to acquire points on his/her account, associated with the doll, by way of:
• connecting the doll to the website (i.e. a single point every time the doll is connected);
• the statistical parameters discussed above;
• collecting "Golden Phrases"; and
• playing games.
By accumulating points through playing with the physical doll, the user competes with other users; weekly, monthly, and annual winners are announced, and these winners receive prizes.
It is of course to be understood that the invention is not intended to be restricted to the details of the above embodiments which are described by way of example only, and modifications of detail can be made within the scope of the invention. Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.

CLAIMS

Interaction tracking
1. A toy comprising: a processor for generating interactions between such toy and at least one other toy capable of interacting with such toy; an interaction tracking engine for generating data related to the interactions; and a memory connected to said interaction tracking engine for storing said data.
2. A toy according to Claim 1 further comprising means for outputting said data.
3. A toy according to Claim 1 or 2, wherein said memory is adapted to store said data relating to a plurality of interactions.
4. A toy according to Claim 1, 2 or 3, wherein the type of said data is predetermined.
5. A toy according to any of Claims 1 to 4, wherein said data includes a measure of the interactions.
6. A toy according to Claim 5, wherein said measure includes a count related to said interactions.
7. A toy according to Claim 5 or 6, wherein said measure is a temporal measure.
8. A toy according to Claim 5, 6 or 7, wherein said measure includes at least one of the following: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction, and the total time the toy has participated in interactions.
9. A toy according to any of the preceding claims, wherein said data includes whether a predetermined interaction, such as a specific phrase and/or word, has been used during an interaction.
10. A toy according to any of the preceding claims, wherein said interaction is an audible interaction.
11. A toy according to any of the preceding claims, wherein said interaction is a physical interaction.
12. A toy according to any of the preceding claims, further comprising means for analysing said data, wherein said analysing means determines when a predetermined target value, associated with said data, has been reached.
13. A toy according to Claim 12, wherein said predetermined target value is a count related to said data.
14. A toy according to Claim 12 or 13, wherein said predetermined target value is a duration.
15. A toy according to any of Claims 12, 13 or 14, wherein said predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
16. A toy according to any of Claims 12 to 15, further comprising means for outputting said analysis.
17. A toy according to Claim 16, wherein said analysis outputting means incorporates a unique identifier, associated with said toy, with said analysis.
18. A toy according to any of Claims 2 to 17, wherein said data outputting means incorporates a unique identifier, associated with said toy, with said data.
19. A toy according to any of the preceding claims, wherein said toy is a computer, and the form of said toy is represented by an avatar on said computer's screen.
20. A toy according to any of the preceding claims, wherein said toy is an individual object.
21. A server comprising: means for communicating with a plurality of toys, means for receiving data related to each said toy; means for processing said data; and means for allocating points to each said toy in dependence on said processed data.
22. A server according to Claim 21, wherein said points are stored in memory associated with each respective toy.
23. A server according to Claim 21 or 22, further comprising means for comparing the points associated with each toy.
24. A server according to Claim 23, wherein said comparison means is adapted to generate a ranked list of toys according to the number of points associated with each respective toy.
25. A server according to Claim 24, wherein said processing means determines when a predetermined target value, associated with said data, has been reached.
26. A server according to Claim 25, wherein said predetermined target value is a count related to said data.
27. A server according to Claim 25 or 26, wherein said predetermined target value is a duration.
28. A server according to any of Claims 21 to 27, wherein each said toy is a toy according to any of Claims 1 to 20.
29. A server according to Claim 28 when dependent on any of Claims 25, 26 or 27, wherein said predetermined target value includes at least one of: the total number of separate interactions, the total number of separate interactions between the toy and each other specific toy, the total number of incidences of the toy using a particular interaction, the time of day of the interaction and the total time the toy has participated in interactions.
30. A toy according to any of Claims 1 to 20, further comprising a server according to any of Claims 21 to 29.
Augmented reality with intercommunication
31. An augmented reality system, including:
a processor;
means for receiving a code;
an avatar generation engine adapted to generate an avatar in dependence on said code;
means for outputting data representing an image comprising the avatar generated by the avatar generation engine; and
means for communicating with a physical toy.
32. A system according to Claim 31, wherein said code receiving means is adapted to receive a code via a manual input.
33. A system according to Claim 31 or 32, wherein said code receiving means is adapted to receive a code via a camera.
34. A system according to any of Claims 31, 32 or 33, wherein said code is on a toy.
35. A system according to Claim 34, wherein said toy is a doll or a card.
36. A system according to any of Claims 31 to 35, wherein said communication means includes a wireless adapter.
37. A system according to Claim 36, further comprising means for identifying said toy, and a theme stored within said toy, wherein said avatar and said toy then communicate within said theme.
38. A system according to Claim 37, wherein said communication includes speech and actions.
39. An augmented reality system as claimed in any of claims 31 to 38, further comprising:
means for receiving image data, said image data representing an image of a physical toy in a physical environment, wherein said outputting means is adapted to output data representing said image comprising the generated avatar together with the image of the physical toy in the physical environment; and
means for receiving activity data representing an action of the physical toy;
wherein the processor is adapted to analyse said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
40. An augmented reality system, comprising:
a processor;
an avatar generation engine adapted to generate an avatar;
means for outputting data representing an image comprising the generated avatar together with the image of a physical toy in a physical environment; and
means for receiving activity data representing an action of the physical toy;
wherein the processor is adapted to analyse said received activity data and to generate an action for performance by the avatar in response to the received activity data, whereby said avatar and said physical toy appear to interact in said physical environment.
41. An augmented reality system as claimed in claim 39 or 40, wherein said activity data represents speech.
42. An augmented reality system as claimed in claim 39, 40 or 41, wherein said activity data represents a physical action.
43. A toy substantially as herein described with reference to Figures 3 and 4.
44. An augmented reality system substantially as herein described with reference to Figure 6.



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1092458A1 (en) * 1999-04-30 2001-04-18 Sony Corporation Electronic pet system, network system, robot, and storage medium
GB2423943A (en) * 2005-04-26 2006-09-13 Steven Lipman Communicating Toy
WO2009010760A2 (en) 2007-07-19 2009-01-22 Steven Lipman Interacting toys
US20090137323A1 (en) * 2007-09-14 2009-05-28 John D. Fiegener Toy with memory and USB Ports
US20100018382A1 (en) * 2006-04-21 2010-01-28 Feeney Robert J System for Musically Interacting Avatars



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8795022B2 (en) 2007-07-19 2014-08-05 Hydrae Limited Interacting toys
US8827761B2 (en) 2007-07-19 2014-09-09 Hydrae Limited Interacting toys


