WO2001070361A2 - Interactive toy applications - Google Patents

Interactive toy applications

Info

Publication number
WO2001070361A2
Authority
WO
WIPO (PCT)
Prior art keywords
toy
interactive
user
functionality
toys
Prior art date
Application number
PCT/IL2001/000268
Other languages
French (fr)
Other versions
WO2001070361A3 (en)
Inventor
Oz Gabai
Jacob Gabai
Nathan Weiss
Nimrod Sandlerman
Zvika Pfeffer
Noam Yuran
Sherman Rosenfeld
Susan Eve Vecht-Lifschitz
Original Assignee
Creator Ltd.
Priority date
Filing date
Publication date
Priority to US Provisional Applications 60/192,011; 60/192,012; 60/192,013; 60/192,014; 60/193,697; 60/193,699; 60/193,702; 60/193,703; 60/193,704; 60/195,861; 60/195,862; 60/195,863; 60/195,864; 60/195,865; 60/195,866; 60/196,227; 60/197,573; 60/197,576; 60/197,577; 60/197,578; 60/197,579; 60/200,508; 60/200,513; 60/200,639; 60/200,640; 60/200,641; 60/200,647; 60/203,175; 60/203,177; 60/203,182; 60/203,244; 60/204,200; 60/204,201; 60/207,126; 60/207,128; 60/208,105; 60/208,390; 60/208,391; 60/208,392; 60/209,471; 60/210,443; 60/210,445; 60/212,696; 60/215,360; 60/216,237; 60/216,238; 60/217,357; 60/219,234; 60/220,276; 60/221,933; 60/223,877; 60/227,112; 60/229,371; 60/229,648; 60/231,103; 60/231,105; 60/234,883; 60/234,895; 60/239,329; 60/250,332; 60/253,362; 60/254,699; 60/267,350
Application filed by Creator Ltd.
Priority claimed from AU44498/01A
Publication of WO2001070361A2
Publication of WO2001070361A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06Q: DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce, e.g. shopping or e-commerce
    • G06Q30/02: Marketing, e.g. market research and analysis, surveying, promotions, advertising, buyer profiling, customer management or rewards; Price estimation or determination
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00: Dolls
    • A63H3/28: Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A63H30/00: Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02: Electrical arrangements
    • A63H30/04: Electrical arrangements using wireless transmission
    • A63H2200/00: Computerized interactive toys, e.g. dolls

Abstract

In an interactive toy environment, in which a plurality of interactive toys are interconnected via a computer network and in which interactive toys interact with one or more users, an inter-toy communication system in which the interaction of a toy with its user is affected by the interaction of either that toy or another toy with another user. The interaction of a toy with its user is personalized and depends on knowledge of the characteristics of both the toy and its user. Interactive toys have real time conversations with users. Networked interactive toys are further able to communicate with computers on the network so that, if authorized, they are aware of the activities of other toys and of their users. Networked interactive toys may thus utilize information from any computer on the network. Interactive toy applications making use of these features are also provided.

Description

INTERACTIVE TOY APPLICATIONS

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority from the following co-pending US Provisional Applications:

60/192,011; 60/192,012; 60/192,013; 60/192,014; 60/193,697; 60/193,699; 60/193,702; 60/193,703; 60/193,704; 60/195,861; 60/195,862; 60/195,863; 60/195,864; 60/195,865; 60/195,866; 60/196,227; 60/197,573; 60/197,576; 60/197,577; 60/197,578; 60/197,579; 60/200,508; 60/200,513; 60/200,639; 60/200,640; 60/200,641; 60/200,647; 60/203,175; 60/203,177; 60/203,182; 60/203,244; 60/204,200; 60/204,201; 60/207,126; 60/207,128; 60/208,105; 60/208,390; 60/208,391; 60/208,392; 60/209,471; 60/210,443; 60/210,445; 60/212,696; 60/215,360; 60/216,237; 60/216,238; 60/217,357; 60/219,234; 60/220,276; 60/221,933; 60/223,877; 60/227,112; 60/229,371; 60/229,648; 60/231,103; 60/231,105; 60/234,883; 60/234,895; 60/239,329; 60/250,332; 60/253,362; 60/254,699; and also from a U.S. Provisional Application sent for filing on February 8, 2001, entitled "Interactive toy applications".

FIELD OF THE INVENTION

The present invention relates to toys, in general, and particularly to toys used in conjunction with a computer system.

BACKGROUND OF THE INVENTION

Toys used in conjunction with a computer system are well known in the art. The following patents are believed to represent the state of the art: U.S. Patent No. 5,746,602 to Kikinis entitled "PC Peripheral Interactive Doll"; U.S. Patent No. 5,752,880 to Gabai et al. entitled "Interactive Doll"; U.S. Patent No. 6,022,273 to Gabai et al. entitled "Interactive Doll"; U.S. Patent No. 6,053,797 to Tsang entitled "Interactive Toy"; U.S. Patent No. 6,059,237 to Choi entitled "Interactive Toy Train"; U.S. Patent No. 6,064,854 to Peters et al. entitled "Computer assisted interactive entertainment/educational character goods"; U.S. Patent No. 6,089,942 to Chan entitled "Interactive Toys"; U.S. Patent No. 6,149,490 to Hampton entitled "Interactive Toy"; U.S. Patent No. 6,160,986 to Gabai et al. entitled "Interactive Toy"; and U.S. Patent No. 6,075,195 to Gabai et al. Computerized toys are also described in the following published PCT applications: PCT/IL96/00157 (WO 97/18871); PCT/IL98/00223 (WO 98/53456); PCT/IL98/00224 (WO 98/52667); PCT/IL98/00225 (WO 98/53567); PCT/IL98/00392 (WO 99/08762); PCT/IL98/00406 (WO 99/10065); PCT/IL99/00202 (WO 99/54015); PCT/IL99/00271 (WO 99/60358); PCT/IL99/00637 (WO 00/31613); and PCT/IL00/00130 (WO 00/51697).

The disclosures of all publications mentioned in the specification and of the publications cited therein are hereby incorporated by reference.

SUMMARY OF THE INVENTION

The present invention seeks to provide improved methods and apparatus for applications of interactive toys.

There is thus provided in accordance with a preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a methodology for obtaining and utilizing information including employing at least one of the plurality of interactive toys to obtain information via the user, and utilizing the information obtained via the user in an application which is not limited to user involvement.

Also, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information also including: obtaining required permission of at least one of a user and a person legally capable of providing permission in respect of the user.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized in marketing.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein at least one of the plurality of interactive toys provides information on purchasing.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized in advertising.

Further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized in designing advertising.

Still further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized in directing advertising.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized in classifying users according to user profiles at least partly derived from the information.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information includes not only information directly provided by the user but also information derived from user behavior sensed by the at least one interactive toy.

Also, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information, wherein the information includes information derived from user behavior sensed by the at least one interactive toy, which behavior is non-commercial behavior.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is obtained at least partially by employing speech recognition.

Further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is obtained at least partially by employing speech recognition from disparate cultural groups of users.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein criteria employed in speech recognition are updated in response to information received indicating the efficacy of the speech recognition.
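The feedback loop described above, in which recognition criteria are updated according to reported efficacy, can be illustrated with a minimal sketch. This is not the patent's implementation; the class name, the single confidence-threshold criterion, and the fixed adjustment step are all assumptions made for illustration.

```python
class AdaptiveRecognizer:
    """Sketch: a recognition-acceptance criterion that adapts to feedback."""

    def __init__(self, threshold=0.6, step=0.05):
        self.threshold = threshold  # minimum confidence to accept a hypothesis
        self.step = step            # adjustment applied per efficacy report

    def accept(self, confidence):
        """Accept a recognition hypothesis only above the current threshold."""
        return confidence >= self.threshold

    def report_efficacy(self, was_correct):
        """Update the criterion from feedback: relax it after confirmed
        successes, tighten it after reported failures, staying in (0, 1)."""
        if was_correct:
            self.threshold = max(0.05, self.threshold - self.step)
        else:
            self.threshold = min(0.95, self.threshold + self.step)

recognizer = AdaptiveRecognizer()
recognizer.report_efficacy(False)        # a reported failure tightens criteria
print(recognizer.accept(0.62))           # 0.62 is below the raised threshold
```

In a networked deployment, the efficacy reports could be aggregated across many toys before the criteria are updated; the per-toy update here is only the simplest case.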

Still further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the at least one toy is employed at least partially to prompt the user to speak.

Also in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the at least one toy is employed at least partially to prompt the user to say certain words.

Additionally in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the at least one toy is employed at least partially to develop a language model.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a diagnostic tool for evaluating performance of at least one of a computer and an interactive toy.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a diagnostic tool for evaluating performance of at least one user.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a diagnostic tool for evaluating performance of content employed by the at least one interactive toy.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a diagnostic tool for evaluating utility of teaching methods.

Further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a toy design tool.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a game design tool.

Still further, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially for evaluating changes in performance of at least one user over time.

Also in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially for evaluating nutrition habits of the at least one user.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a methodology for obtaining and utilizing information wherein the information is utilized at least partially as a diagnostic tool for evaluating utility of educational methodologies.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy system including: an at least partially verbal-input interactive toy operative to learn personal information about a child, and toy content operative to actuate the verbal-input interactive toy to present to the child at least one personalized, verbal scheduling prompt based on at least one item of personal information which the verbal-input interactive toy has learned about the child.

Also in accordance with a preferred embodiment of the present invention, there is provided a schedule monitoring toy system wherein the toy content includes personalized content which at least partly conforms to at least one personal characteristic of the user, the personal characteristic being learned by the user's toy.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy system including: a verbal-input interactive toy, a parental input receiver operative to recognize a parent and to receive therefrom at least one parental input regarding at least one desired schedule item, and toy content actuating the verbal-input interactive toy to present to a child a timely verbal presentation of the at least one desired schedule item.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule-monitoring toy system including: a mobile, verbal-input interactive toy, a scheduler operative to receive an input regarding at least one schedule item, a child locator operative to locate a child within a predetermined area, and a prompter operative, at a time appropriate to the at least one schedule item, to locate the child and to deliver at least one verbal prompt for the at least one schedule item.
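The prompter's core behavior, waiting for a schedule item's time, locating the child, and then delivering a verbal prompt, can be sketched as a pure function over the schedule so that it is testable without the mobile hardware. All names and data below are illustrative assumptions, not elements of the claimed system.

```python
from datetime import datetime

def due_prompts(schedule, now, locate_child):
    """Return (child_location, prompt_text) pairs for every schedule item
    whose time has arrived; the toy would then approach and speak.

    schedule: list of (datetime, prompt_text) pairs.
    locate_child: callable returning the child's position in the working area.
    """
    prompts = []
    for when, text in schedule:
        if when <= now:
            prompts.append((locate_child(), text))
    return prompts

schedule = [(datetime(2001, 3, 22, 16, 0), "Time for homework!"),
            (datetime(2001, 3, 22, 19, 0), "Time for bed!")]
now = datetime(2001, 3, 22, 16, 30)
print(due_prompts(schedule, now, lambda: "playroom"))
# Only the 16:00 item is due at 16:30
```

In a real system the `locate_child` callable would wrap the child-locator subsystem, and delivered items would be removed from the schedule.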

Also in accordance with a preferred embodiment of the present invention, there is provided a schedule-monitoring toy system wherein the prompter is operative to physically approach the child.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy system including: a verbal-input interactive toy operative to perform speech recognition, and toy content actuating the verbal-input interactive toy to present to a child: at least one timely, interactive verbal scheduling prompt, and at least one anthropomorphic response to recognized speech content produced by a child responsive to the prompt.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy system including: a verbal-input interactive toy, a schedule input receiver operative to receive, from at least one authorized source, information regarding a plurality of schedule items, a free time database operative to receive, from at least one authorized source, information regarding at least one free time activity authorized to be performed by a child during his free time, and toy content actuating the verbal-input interactive toy to present to the child: a timely verbal presentation of each of the plurality of schedule items, and a verbal presentation, presented at a time not occupied by any of the plurality of schedule items, prompting the child to perform at least one of the free time activities.
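Finding a time "not occupied by any of the plurality of schedule items" is a gap-finding problem over the day's schedule. The sketch below, under the assumption that items are (start, end) hour pairs, shows one way such free slots could be computed; it is illustrative only.

```python
def free_slots(day_start, day_end, items):
    """Return the free (start, end) gaps left between schedule items.

    items: iterable of (start, end) pairs; overlapping items are merged
    implicitly by advancing a cursor over them in sorted order.
    """
    gaps, cursor = [], day_start
    for start, end in sorted(items):
        if start > cursor:
            gaps.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < day_end:
        gaps.append((cursor, day_end))
    return gaps

# School 8-14 and a piano lesson 16-17 leave two slots before a 20:00 bedtime.
print(free_slots(8, 20, [(8, 14), (16, 17)]))  # [(14, 16), (17, 20)]
```

A free-time-activity prompt would then be scheduled inside one of the returned gaps, subject to any overriding parent input.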

There is thus provided in accordance with another preferred embodiment of the present invention, a follow-me toy system including: a mobile toy, and a user-following mechanism tracking the user and guiding the toy to follow the user as the user moves within a working area.

There is thus provided in accordance with another preferred embodiment of the present invention, a networked diary toy system including: a verbal-input interactive toy, a network interface connecting the verbal-input interactive toy to a computer network including at least one networked computer, a diary database storing at least one diary item for an individual user, and verbal content for presenting diary items from the diary database, wherein at least a portion of the verbal content is stored on the at least one networked computer and arrives at the verbal-input interactive toy via the network interface.

There is thus provided in accordance with another preferred embodiment of the present invention, a speech-responsive networked diary toy system including: a toy, a network interface connecting the toy to a computer network including at least one networked computer, a diary database storing at least one diary item for an individual user, a speech-recognition unit residing at least in part in the at least one networked computer and communicating with the toy via the network and the network interface, and toy content actuating the toy to present at least one diary item responsive to user utterances recognized by the speech recognition unit.

There is thus provided in accordance with another preferred embodiment of the present invention, a supervised networked organizer system including: an organizer subsystem operative to perform at least one organizing function involving multiple individual users, and a supervision subsystem storing at least one supervisor identity and automatically providing to each individual supervisor, inputs from the organizer system.

Also, in accordance with a preferred embodiment of the present invention, there is provided a supervised networked organizer system wherein the organizer subsystem includes multiple interactive toys associated with the multiple individual users.

Also, in accordance with a preferred embodiment of the present invention, there is provided a supervised networked organizer system which is adapted for use by multiple individual users at least some of which are children and by individual supervisors at least some of which are parents of at least some of the children.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a supervised networked organizer system wherein the organizer subsystem includes override functionality which enables the individual supervisor to override inputs received by the organizer subsystem from at least one of the multiple individual users.

There is thus provided in accordance with another preferred embodiment of the present invention, a child-messaging toy system including: a verbal-input interactive toy including child propinquity indicating functionality, a message database operative to accept at least one message to be delivered to a child whose propinquity to the toy is indicated to exist, and a message delivery controller including: an audible annunciator operative to provide a personalized audible output to the child requesting that the child come into propinquity with the toy, and a message output generator, operative in response to an indication of propinquity of the child to the toy for providing at least one message from the message database to the child.

There is thus provided in accordance with another preferred embodiment of the present invention, a child-messaging toy system including: a verbal-input interactive toy including child propinquity indicating functionality, a timed message database operative to accept at least one time-specific message to be delivered to a child whose propinquity to the toy is indicated to exist at least one predetermined time, and a message delivery controller including: an audible annunciator operative to provide a personalized audible output to the child requesting that the child come into propinquity with the toy, and a message output generator, operative in response to an indication of propinquity of the child to the toy for providing at least one time-specific message from the timed message database to the child.
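The delivery controller's logic, release a due message only when the child's propinquity is indicated, and otherwise report it as undelivered, can be sketched as a small class. The class name, the numeric time representation, and the return convention are assumptions for illustration.

```python
class MessageController:
    """Sketch of a timed, propinquity-gated message delivery controller."""

    def __init__(self):
        self.queue = []  # (deliver_at, message) pairs awaiting delivery

    def add(self, deliver_at, message):
        self.queue.append((deliver_at, message))

    def deliver(self, now, child_nearby):
        """Return (delivered, undelivered) message lists for time `now`.

        Due messages are released and dequeued only if the child is near;
        otherwise they are reported as not delivered at their set time.
        """
        if not child_nearby:
            return [], [m for t, m in self.queue if t <= now]
        due = [m for t, m in self.queue if t <= now]
        self.queue = [(t, m) for t, m in self.queue if t > now]
        return due, []

mc = MessageController()
mc.add(17, "Grandma called - call her back!")
print(mc.deliver(now=18, child_nearby=True))
```

The undelivered list corresponds to the message delivery indication described below; the audible annunciator requesting propinquity is outside this sketch.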

Also, in accordance with a preferred embodiment of the present invention, there is provided a child-messaging toy system also including a message delivery indication that the time-specific message has not been delivered to the child at the predetermined time.

There is thus provided in accordance with another preferred embodiment of the present invention, a virtual parenting toy system including: a verbal-input interactive toy operative to play at least one game with a child, the verbal-input interactive toy including verbal-input interactive toy content operative to actuate the verbal-input interactive toy to present to the child: at least one verbal prompt to perform at least one task, and at least one verbal offer to play the at least one game with the child once the at least one task is performed, and a compliance monitor operative to accept at least one indication that the at least one task has been performed and in response to the indication, to actuate the at least one game.

There is thus provided in accordance with another preferred embodiment of the present invention, a virtual parenting toy system including: an interactive toy including: a child want indication-recognizing functionality operative to recognize at least one indication of a child want, a child want reporting functionality for providing an output indication of a child want recognized by the child want indication-recognizing functionality, and a child want satisfying functionality operative to satisfy the child want reported by the child want reporting functionality.

Also, in accordance with a preferred embodiment of the present invention, there is provided a virtual parenting toy system wherein the child want satisfying functionality is controlled by a child want satisfying input which may be received other than from the child.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a virtual parenting toy system wherein the child want satisfying functionality includes: advertisement content responsive to the child want indication and offering a plurality of advertised items, and child preference eliciting functionality ascertaining a preference of the child for a given item from among the plurality of advertised items and providing a child preference output, and transactional functionality operative in response to the child preference output for purchasing the given item.

There is thus provided in accordance with another preferred embodiment of the present invention, a toy system including: an interactive toy including, free time indication functionality designating at least one time slot during which a child has free time and may participate in toy interaction, and free time utilization functionality operative in response to an output from the free time indication functionality for providing entertainment to the child during the at least one time slot.

Also, in accordance with a preferred embodiment of the present invention, there is provided a toy system wherein the free time indication functionality includes a schedule input receiver operative to receive schedule information regarding a plurality of schedule items and to compute therefrom the at least one time slot.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a toy system wherein the free time indication functionality is responsive to an overriding parent input for defining the at least one time-slot.

There is thus provided in accordance with another preferred embodiment of the present invention, a user-location monitoring toy diary including: a schedule database storing, for each of a multiplicity of schedule items, a time thereof and coordinates of a location thereof, wherein the multiplicity of locations are represented within a single coordinate system, a user tracker operative to track the current location of the user, and a prompter operative to prompt the user to conform to the schedule if the user's current location does not conform to the stored location of a current schedule item.
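Because all locations are represented within a single coordinate system, checking whether "the user's current location does not conform to the stored location of a current schedule item" reduces to a distance test. The tolerance radius below is an assumed parameter, not one specified here.

```python
import math

def conforms(user_xy, item_xy, radius=5.0):
    """True if the user is within `radius` units of the scheduled location,
    both expressed in the diary's single coordinate system."""
    return math.dist(user_xy, item_xy) <= radius

# The schedule places the child at the piano room, (10, 10); the tracker
# reports (40, 10), so the prompter should remind the child of the lesson.
if not conforms((40, 10), (10, 10)):
    print("Prompt: you are due at your piano lesson!")
```

The prompter would run this check against the schedule item whose time window contains the current time, prompting only on non-conformance.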

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule-monitoring toy system including: a verbal-input interactive toy operative to interact with a user, a schedule database storing the user's schedule, and a schedule reminder actuating the verbal-input interactive toy to present to the user at least one verbal scheduling prompt which includes content which is not related to scheduling.

Also, in accordance with a preferred embodiment of the present invention, there is provided a schedule-monitoring toy system wherein the prompt includes at least one game.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a schedule-monitoring toy system wherein the prompt includes at least one joke.

Also, in accordance with a preferred embodiment of the present invention, there is provided a schedule-monitoring toy system wherein the prompt offers the user a value credit for compliance with the prompt and stores the credit for the user if the user fulfills a compliance criterion.
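The credit mechanism, offer a value credit with the prompt and store it only once the compliance criterion is fulfilled, can be sketched as a small ledger. The class and method names are illustrative assumptions.

```python
class CreditLedger:
    """Sketch: store prompt-compliance value credits per user."""

    def __init__(self):
        self.credits = {}  # user -> accumulated credit

    def award(self, user, amount, complied):
        """Credit `amount` to `user` only if the compliance criterion was
        fulfilled; return the user's current balance either way."""
        if complied:
            self.credits[user] = self.credits.get(user, 0) + amount
        return self.credits.get(user, 0)

ledger = CreditLedger()
ledger.award("child_1", 10, complied=True)    # prompt followed: credit stored
print(ledger.award("child_1", 10, complied=False))  # not followed: still 10
```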

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a schedule-monitoring toy system wherein the prompt includes content which emotionally prepares the user to cope with an emotionally traumatic item in the schedule database.

There is thus provided in accordance with another preferred embodiment of the present invention, a computerized guide system for a blind user, the guide including: a portable interactive computerized device including: route definition functionality operative to receive route input from a blind user for selecting a user route, hazard detection functionality operative to detect at least one hazard along the user route, and audio warning functionality operative in response to an output from the hazard detection functionality to provide the user with an audio warning regarding presence of the hazard.
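The guide's pipeline, hazard detection along the selected route feeding an audio warning, can be sketched as a filter over route positions. The hazard data, the proximity radius, and the warning text format are assumptions made for illustration; real hazard detection would of course use sensors rather than a lookup table.

```python
import math

def warnings_along_route(route, hazards, radius=2.0):
    """Return an audio-warning string for each known hazard lying close
    to any point of the user's selected route."""
    msgs = []
    for point in route:
        for name, hazard_xy in hazards.items():
            if math.dist(point, hazard_xy) <= radius:
                msgs.append(f"Warning: {name} ahead near {point}")
    return msgs

route = [(0, 0), (5, 0), (10, 0)]      # points along the selected user route
hazards = {"open manhole": (5, 1)}     # detected hazard positions
print(warnings_along_route(route, hazards))
```

In the networked variant, each device could merge the hazard tables broadcast by other devices before running this check, so that one user's detection warns another in real time.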

Also, in accordance with a preferred embodiment of the present invention, there is provided a computerized guide system wherein the interactive device is networked with at least one other such device.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a computerized guide system wherein the interactive device is operative to provide hazard information to the at least one other such device.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a computerized guide system wherein the interactive device is operative to broadcast the hazard information in real time.

There is thus provided in accordance with another preferred embodiment of the present invention, a parental surrogate toy including: a toy, a child behavior report receiver, and a toy controller including: a behavioral report configuration definer allowing a parent to define at least one parameter of child behavior which is of interest, a child monitor operative to monitor the parameter of child behavior and to provide a report relating to the at least one parameter to the child behavior report receiver.
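The toy controller described above might be sketched as follows: a parent defines the behavior parameters of interest, and the child monitor reports only observations of those parameters to the report receiver. All names and the report format are illustrative assumptions.

```python
class ParentalSurrogateToy:
    """Sketch of the parental surrogate toy controller described above."""

    def __init__(self, report_receiver):
        self.report_receiver = report_receiver  # child behavior report receiver
        self.parameters_of_interest = set()     # behavioral report configuration

    def define_parameter(self, parameter):
        # behavioral report configuration definer: parent marks a parameter
        self.parameters_of_interest.add(parameter)

    def observe(self, parameter, value):
        # child monitor: report only parameters the parent asked about
        if parameter in self.parameters_of_interest:
            self.report_receiver.append((parameter, value))

reports = []
toy = ParentalSurrogateToy(reports)
toy.define_parameter("bedtime")
toy.observe("bedtime", "21:40")
toy.observe("snacking", "3 cookies")  # not of interest, so not reported
```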

There is thus provided in accordance with another preferred embodiment of the present invention, a web browsing system including: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality.

Also, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system also including a computer which serves as an intermediary between the interactive toy and the Internet.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the user interface also has non-web browsing functionality.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the user interface provides the web browsing functionality within the context of a game.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein in the context of the game the web browsing functionality provides an answer to a question posed in the game.

Further, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein in the context of the game the web browsing functionality provides non-rational web browsing.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the web browsing functionality produces content which is subsequently employed by the toy.

Still further, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the content is added to a stored user profile.

Also, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the user interface also includes interrogation functionality for obtaining information from other interactive toys networked therewith.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the user interface includes a voice interactive functionality.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein at least one user characteristic ascertained from earlier interaction between the toy and a user is employed as an input in the web browsing functionality.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the at least one user characteristic is employed by the web browsing functionality for matching the user with an activity offering functionality.

Further, in accordance with a preferred embodiment of the present invention, there is provided a web browsing system wherein the activity offering functionality is an employment agency functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, a knowledge management system including: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality.

Also, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the information management functionality includes at least one of information retrieval functionality, information synthesis functionality and information filtering functionality.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the user interface includes a voice interactive functionality.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the user interface includes a telephone dialer.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the user interface includes a telephone inquiry functionality.

Further, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the user interface includes a download to diary functionality.

Still further, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the information management functionality includes matching functionality operative to match potential donors with potential charity recipients.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the matching functionality employs user profile information collected by the toy.

Also, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the information management functionality includes matching functionality operative to match potential volunteers with potential charity organizations.
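The matching functionality recited above could be illustrated by a minimal sketch. The patent does not define a matching algorithm, so a greedy first-fit pairing of a recipient's stated need against donor profiles is assumed here, with hypothetical field names.

```python
def match_donors(donors, recipients):
    """Pair each recipient's stated need with a donor whose profile
    offers that item; greedy first-fit is an illustrative assumption."""
    matches = []
    available = {d["name"]: set(d["offers"]) for d in donors}
    for r in recipients:
        for name, offers in available.items():
            if r["needs"] in offers:
                matches.append((name, r["name"], r["needs"]))
                offers.discard(r["needs"])  # each offered item is used once
                break
    return matches
```

In the embodiments above, the donor and recipient profiles would be drawn from user profile information collected by the toy.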

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the information management functionality includes user status determination functionality operative to sense a wellness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the wellness status of the user.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the information management functionality includes user status determination functionality operative to sense a happiness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the happiness status of the user.

Further, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the user status determination functionality includes voice responsive functionality.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a knowledge management system wherein the information management functionality includes matching functionality operative to match possessions of potential donors with potential charity recipients.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive persona system including: a three-dimensional artificial person including: a computer, and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the three-dimensional artificial person is locomotive.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the voice responsive interactive functionality employs artificial intelligence.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the three-dimensional artificial person has at least one of an appearance and voice which is characteristic of the persona.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is at least partially programmable.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is at least partially programmable by a user.

Further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is at least partially programmable other than by a user.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is at least partially remotely programmable.

Still further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is at least partially controllable via a computer network.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is at least partially controllable via a computer network in real time.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is that of a teacher.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the persona is of a known non-teacher.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is that of a coach.

Further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the persona is of a known coach.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is that of a guide.

Still further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the persona is of a known guide.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided an interactive persona system wherein the pattern of behavior is determined at least in part by at least one user characteristic known to the artificial person.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an inter-toy communication system including: at least one interactive toy operative for interaction with a plurality of users, wherein the interaction of the at least one interactive toy with at least one of the plurality of users is affected by the interaction of the at least one interactive toy with another one of the plurality of users.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an inter-toy communication system including: at least one interactive toy operative for interaction with a plurality of users, wherein the interaction of the at least one interactive toy with at least two of the plurality of users is dependent on the toy's knowledge of which user it is interacting with and on characteristics of that user known to the toy.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an inter-toy communication system including: a plurality of interactive toys operative for interaction with at least one user, wherein the interaction of one of the plurality of interactive toys with the at least one user is affected by the interaction of another of the plurality of toys with the at least one user.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a computer network, a multi-toy communication system including: at least one first interactive toy operative for communication with the computer network, at least one second interactive toy operative for communication with the computer network via the at least one first interactive toy.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a computer network, a multi-toy location system including: location functionality operative to sense at least predetermined propinquity between at least two of the plurality of interactive toys.

Also, in accordance with a preferred embodiment of the present invention, there is provided a multi-toy location system also including: propinquity notification functionality operative in response to an output from the location functionality indicating the sensed at least predetermined propinquity for notifying at least one of the at least two of the plurality of interactive toys of at least the predetermined propinquity.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a multi-toy location system wherein the location functionality includes toy voice recognition functionality.
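The location functionality described above might be sketched as follows: every pair of toys within a predetermined propinquity is reported, so that the notification functionality can inform the toys concerned. The threshold and the planar coordinate model are assumptions; positions could come from any location technology, including the toy voice recognition functionality mentioned above.

```python
import math

def sense_propinquity(toys, threshold_m=10.0):
    """Report every pair of toys within an assumed propinquity threshold.
    Each toy is a (name, x, y) tuple; all details are illustrative."""
    pairs = []
    for i in range(len(toys)):
        for j in range(i + 1, len(toys)):
            (n1, x1, y1), (n2, x2, y2) = toys[i], toys[j]
            if math.hypot(x2 - x1, y2 - y1) <= threshold_m:
                pairs.append((n1, n2))
    return pairs
```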

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment comprising a plurality of interactive toys, at least one of which is normally in interactive communication via a computer with a computer network, said computer including a toy communication functionality comprising: a toy recognition functionality enabling said computer to recognize the identity of a toy which is not normally in interactive communication therewith, when said toy comes into communication propinquity therewith; and a communication establishing functionality operative, following recognition of the identity of a toy which is not normally in interactive communication therewith when said toy comes into communication propinquity therewith, for establishing interactive communication therewith.

Also, in accordance with a preferred embodiment of the present invention, there is provided a toy communication functionality wherein the communication establishing functionality is operative in response to an authorization received from a user of the at least one toy which is normally in interactive communication with the computer network via the computer.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a multi-toy coordinated activity system including: a plurality of interactive toys operative for interaction via a computer network, and a coordinated activity functionality operative via the computer network to cause the plurality of interactive toys to coordinate their actions in a coordinated activity.

Also, in accordance with a preferred embodiment of the present invention, there is provided a multi-toy coordinated activity system wherein the plurality of interactive toys are located at disparate locations.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a multi-toy coordinated activity system wherein the coordinated activity functionality causes the plurality of interactive toys to communicate with each other at least partially not in real time.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication system providing communication among multiple toys and between at least one toy and at least one user, the system including: a plurality of interactive toys operative for interaction via a computer network, and a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time.
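Communication "at least partially not in real time", as recited above, can be illustrated by a store-and-forward sketch: messages between toys are queued on the network and delivered when the recipient toy next connects. The hub name and message format are assumptions.

```python
import collections

class ToyMessageHub:
    """Illustrative store-and-forward hub for inter-toy messages."""

    def __init__(self):
        self.mailboxes = collections.defaultdict(list)

    def send(self, sender, recipient, text):
        # queue the message; the recipient need not be online now
        self.mailboxes[recipient].append((sender, text))

    def deliver(self, recipient):
        # called when the recipient toy comes online; drains its mailbox
        pending, self.mailboxes[recipient] = self.mailboxes[recipient], []
        return pending
```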

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality includes a text message to voice conversion functionality.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communication functionality includes a message to voice conversion functionality, which provides a vocal output having characteristics of at least one of the plurality of interactive toys.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality includes an e-mail communication functionality.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein at least some of the plurality of interactive toys have an e-mail address which is distinct from that of a user thereof.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the e-mail communication functionality enables transmission of movement instructions to at least one of the plurality of interactive toys.
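Transmission of movement instructions by e-mail, as recited above, might be sketched as a simple parser over the message body. The command vocabulary (`FORWARD`, `BACK`, `TURN`) is purely an assumption; the patent does not specify a message format.

```python
def parse_movement_email(body):
    """Extract assumed movement commands from an e-mail body; lines that
    do not match the hypothetical command vocabulary are ignored."""
    commands = []
    for line in body.splitlines():
        parts = line.strip().upper().split()
        if len(parts) == 2 and parts[0] in ("FORWARD", "BACK", "TURN"):
            commands.append((parts[0], int(parts[1])))
    return commands
```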

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system including: a plurality of interactive toys operative for interaction via a computer network, a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via a telephone link.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality includes an interactive voice response computer operative to enable the user to communicate by voice with the at least one of the plurality of interactive toys.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality enables a user to provide instructions to at least one of the plurality of interactive toys to carry out physical functions.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via at least another one of the plurality of interactive toys and a telephone link.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system including: a plurality of interactive toys operative for interaction via a computer network, and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via at least another one of the interactive toys.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system including: a plurality of interactive toys operative for interaction via a computer network, and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality employs software instructions to the first one of the plurality of interactive toys for transmission to the another of the plurality of interactive toys.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality employs information regarding sensed motion of the first one of the plurality of interactive toys for transmission to the another of the plurality of interactive toys.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the another of the plurality of interactive toys generally replicates the motion of the first one of the plurality of interactive toys.
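Communicating sensed motion so that another toy generally replicates it, as described above, might be sketched by recording timestamped joint positions and replaying them on the receiving toy. Modeling a motion as a list of joint-angle frames is an illustrative assumption.

```python
class MotionRecorder:
    """Records sensed motion of the first toy as timestamped frames."""

    def __init__(self):
        self.frames = []

    def sense(self, t, joints):
        # information regarding sensed motion, e.g. joint angles at time t
        self.frames.append((t, dict(joints)))

    def transmit(self):
        # payload sent over the computer network to the replicating toy
        return list(self.frames)

def replicate(frames):
    # the receiving toy replays each frame; here we return the sequence
    # of joint settings it would apply, in order
    return [joints for _t, joints in frames]
```

Synchronized motion and speech, recited below, would add utterances to each frame so that playback keeps them aligned.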

Yet further, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

Still further, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy communication system wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate synchronized motion and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a computer network, an integrated toy-game functionality including: a game which may be played by a user, and at least one interactive toy containing game- specific functionality which participates in playing the game.

Also, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game-specific functionality enables the interactive toy to play the game as a player.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game-specific functionality enables the interactive toy to assist the user in playing the game.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game-specific functionality enables the interactive toy to be employed by the user as a user interface in playing the game.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game-specific functionality enables the interactive toy to have voice interaction with the user in the course of playing the game.

Further, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.

Still further, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game is a multi-user game which may be played over a network.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a toy-game functionality wherein the game-specific functionality is operative to mediate between at least two users playing the game.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system including: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith, and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof.
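The profile-forming and content-exchange functionalities described above might be sketched as follows: each toy accumulates a profile of its user's interests from past interactions, and two toys exchange that content to suggest a shared topic, producing interaction between their users. All names and the interest-counting model are assumptions.

```python
import collections

class ProfileToy:
    """Forms a personal profile of its user from accumulated interactions."""

    def __init__(self, user):
        self.user = user
        self.interests = collections.Counter()

    def record_interaction(self, topic):
        # accumulate past interactions into the personal profile
        self.interests[topic] += 1

def shared_interest_content(toy_a, toy_b):
    """Content operative to produce interaction: suggest the strongest
    topic common to both user profiles, or None if there is none."""
    common = set(toy_a.interests) & set(toy_b.interests)
    if not common:
        return None
    topic = max(common, key=lambda t: toy_a.interests[t] + toy_b.interests[t])
    return f"{toy_a.user} and {toy_b.user} both like {topic}"
```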

Also, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative to produce personal interaction between respective users thereof.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative to produce a personal meeting between respective users thereof.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative to produce pseudo-accidental personal meetings between respective users thereof.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative in the context of a game.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative to produce interactions between interactive toys which are in physical propinquity therebetween.

Further, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein at least some of the interactive toys have a persona which shares at least one personal characteristic with an identifiable person.

Still further, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the identifiable person is the user of a given interactive toy.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative to produce conversations between respective users thereof and employs at least some personal information about the respective users based on accumulated past interactions therewith.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interpersonal interaction communication system wherein the content is operative to produce personal meetings between respective users thereof at predetermined locations.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a toy cloning functionality including: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user, and transferring at least a portion of the interactive toy personality to at least one clone.
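The cloning flow just described can be sketched in a few lines: a personality develops out of logged interactions with users and other toys, and a chosen portion of it (together with the toy life history) is transferred to a clone, which may then continue to develop independently. The field names and the trait-counting model are assumptions for illustration only.

```python
import copy


class ToyPersonality:
    """Interactive toy personality developed from interactions."""

    def __init__(self):
        self.traits = {}        # trait -> strength
        self.life_history = []  # log of interactions (the "toy life history")

    def interact(self, partner, trait, delta=1):
        # Develop the personality from an interaction with a user
        # or with another interactive toy.
        self.life_history.append((partner, trait))
        self.traits[trait] = self.traits.get(trait, 0) + delta

    def clone(self, traits_to_copy=None):
        """Transfer at least a portion of the personality to a clone."""
        child = ToyPersonality()
        keep = traits_to_copy if traits_to_copy is not None else self.traits
        child.traits = {t: self.traits[t] for t in keep if t in self.traits}
        child.life_history = copy.deepcopy(self.life_history)
        return child
```

After `clone()` returns, each copy can keep calling `interact()` on its own, matching the embodiments in which the personality continues to develop either identically or at least partially independently in the toy and in the clone.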

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the interactive toy personality includes a toy life history.

Also, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the toy life history is stored in a database.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the toy personality is stored in a database.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the at least one clone includes a toy.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the toy has a persona which is prima facie incompatible with the toy personality.

Further, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the at least one clone includes an animated virtual character.

Yet further, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein following the transferring, the interactive toy personality continues to develop generally identically both in the interactive toy and in the clone.

Still further, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein following the transferring, the interactive toy personality continues to develop at least partially independently both in the interactive toy and in the clone.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality also including transferring at least one clone personality from the at least one clone to the interactive toy.

Also, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality wherein the interactive toy personality incorporates features based on multiple toy life histories.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided a toy cloning functionality also including: associating at least one physical feature of the interactive toy with the clone.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a toy personality functionality including: developing a plurality of interactive toy personalities based on interactions between an interactive toy and at least one of another interactive toy and a user, and endowing at least one interactive toy with at least two of the plurality of interactive toy personalities and with a mechanism for exhibiting at least one selectable personality at a given time.
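The selectable-personality mechanism can be sketched as follows: a toy is endowed with several personalities and exhibits one at a given time, here chosen to mirror the toy-perceived personality of the current user. The selection rule (match the perceived trait, else keep the current personality) is an illustrative assumption.

```python
class MultiPersonalityToy:
    """Toy endowed with several personalities, exhibiting one at a time."""

    def __init__(self, personalities):
        # personalities: name -> greeting style, standing in for a
        # fully developed interactive toy personality.
        self.personalities = personalities
        self.active = next(iter(personalities))

    def select(self, perceived_user_trait):
        # Exhibit the personality matching the toy-perceived personality
        # of the user; fall back to the currently active one.
        if perceived_user_trait in self.personalities:
            self.active = perceived_user_trait
        return self.active

    def greet(self):
        return self.personalities[self.active]


toy = MultiPersonalityToy({
    "playful": "Race you to the door!",
    "calm": "Shall we read together?",
})
```

A richer embodiment might infer `perceived_user_trait` from sensed speech or activity rather than receive it directly.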

Also, in accordance with a preferred embodiment of the present invention, there is provided a toy personality functionality wherein the at least one interactive toy exhibits at least one selectable personality in accordance with a toy-perceived personality of a corresponding user.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive toy system including: at least one interactive toy, and an interactive toy functionality at least partially resident at the at least one interactive toy and including user assistance functionality providing an output to the user which assists the user in user functioning.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the user assistance functionality is at least partially mechanical.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the user assistance functionality is at least partially verbal.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the interactive toy functionality includes tooth brushing functionality.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the at least one interactive toy is connected to a computer network.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the user assistance functionality is at least partially visible.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the user assistance functionality includes a guide dog functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive toy system including: at least one interactive toy, an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality includes a point to object - name of object teaching functionality.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the point to object - name of object teaching functionality includes selectable language functionality.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality includes different home and school environment teaching functionalities.

Moreover, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the school environment teaching functionality interactively involves at least a plurality of interactive toys.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality includes both verbal and body language teaching functionality for teaching a foreign language.

Further, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality is automatically actuable by an event in a non-teaching functionality of the at least one interactive toy.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality includes virtual classroom teaching functionality.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality includes fellow student functionality wherein the at least one interactive toy acts as a fellow student to a user.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the teaching functionality includes behavior corrective functionality at least partially involving pre-acquired knowledge of at least one characteristic of the user obtained by at least one interactive toy.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the behavior corrective functionality also at least partially involves currently acquired knowledge of the at least one characteristic of the user obtained by at least one interactive toy.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the at least one interactive toy includes sensing functionality for sensing at least one of the following user parameters: breath constituents, blood pressure, breathing activity, heart activity, and language.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive toy system including: at least one interactive toy, an interactive toy functionality at least partially resident at the at least one interactive toy and including user-specific event driven functionality providing an output to the user which is dependent on pre-acquired knowledge of at least one characteristic of the user obtained by at least one interactive toy.

Additionally, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the user-specific event driven functionality includes musical output functionality.

Also, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the at least one interactive toy includes a plurality of interactive toys which cooperate to provide various coordinated parts in a musical output.

Furthermore, in accordance with a preferred embodiment of the present invention, there is provided an interactive toy system wherein the at least one interactive toy has at least one known persona.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a system for obtaining and utilizing information including: at least one of the plurality of interactive toys, employed to obtain information via a user, and the information obtained via the user and utilized in an application which is not limited to user involvement.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy methodology including: learning personal information about a child by means of an at least partially verbal-input interactive toy, and actuating the verbal-input interactive toy by means of toy content so as to present to the child at least one personalized, verbal scheduling prompt based on at least one item of personal information which the verbal-input interactive toy has learned about the child.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy methodology including: utilizing a verbal-input interactive toy, recognizing a parent by means of a parental input receiver, receiving from the receiver at least one parental input regarding at least one desired schedule item, and actuating the verbal-input interactive toy by toy content so as to present to a child a timely verbal presentation of the at least one desired schedule item.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule-monitoring toy methodology including: utilizing a verbal-input mobile interactive toy, receiving an input from a scheduler regarding at least one schedule item, locating a child by means of a child locator within a predetermined area, prompting the child by means of a prompter at a time appropriate to the at least one schedule item, and delivering at least one verbal prompt from the prompter for the at least one schedule item to the child.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy methodology including: performing speech recognition by means of a verbal-input interactive toy, and actuating the verbal-input interactive toy by means of toy content so as to present to a child: at least one timely, interactive verbal scheduling prompt, and at least one anthropomorphic response to recognized speech content produced by a child responsive to the prompt.

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule monitoring toy methodology including: activating a verbal-input interactive toy, receiving from at least one authorized source, information regarding a plurality of schedule items so as to input the information to a schedule input receiver, receiving from at least one authorized source, information regarding at least one free time activity authorized to be performed by a child during his free time, so as to input the information to a free time database, and actuating the verbal-input interactive toy by means of toy content so as to present to the child: a timely verbal presentation of each of the plurality of schedule items, and a verbal presentation, presented at a time not occupied by any of the plurality of schedule items so as to prompt the child to perform at least one of the free time activities.

There is thus provided in accordance with another preferred embodiment of the present invention, a follow-me toy methodology including: activating a mobile toy, tracking a user by means of a user-following mechanism, and, guiding the toy to follow the user as the user moves within a working area.

There is thus provided in accordance with another preferred embodiment of the present invention, a networked diary toy methodology including: activating a verbal-input interactive toy, connecting, by means of a network interface, the verbal-input interactive toy to a computer network including at least one networked computer, storing at least one diary item for an individual user in a diary database, and presenting diary items from the diary database by means of verbal content, wherein at least a portion of the verbal content is stored on the at least one networked computer and arrives at the verbal-input interactive toy via the network interface.

There is thus provided in accordance with another preferred embodiment of the present invention, a speech-responsive networked diary toy system including: activating a toy, connecting the toy to a computer network including at least one networked computer by means of a network interface, storing at least one diary item for an individual user in a diary database, communicating with the toy by means of a speech-recognition unit residing at least in part in the at least one networked computer and via the network and the network interface, and actuating the toy by toy content so as to present at least one diary item responsive to user utterances recognized by the speech recognition unit.

There is thus provided in accordance with another preferred embodiment of the present invention, a supervised networked organizer methodology including: performing at least one organizing function involving multiple individual users by means of an organizer subsystem, and storing at least one supervisor identity and automatically providing to each individual supervisor, inputs from the organizer system by means of a supervision subsystem.

There is thus provided in accordance with another preferred embodiment of the present invention, a child-messaging toy methodology including: indicating child propinquity by means of a child propinquity functionality in a verbal-input interactive toy, accepting at least one message to be delivered to a child whose propinquity to the toy is indicated to exist from a message database, and controlling message delivery by means of a message controller including: providing a personalized audible output to the child requesting that the child come into propinquity with the toy from an audible annunciator, and providing at least one message from the message database to the child from a message output generator in response to an indication of propinquity of the child to the toy.

There is thus provided in accordance with another preferred embodiment of the present invention, a child-messaging toy methodology including: indicating child propinquity by means of a child propinquity functionality in a verbal-input interactive toy, accepting at least one time-specific message to be delivered to a child whose propinquity to the toy is indicated to exist at least one predetermined time from a timed message database, controlling message delivery by means of a message controller including: providing a personalized audible output to the child requesting that the child come into propinquity with the toy by means of an audible annunciator, and providing at least one time-specific message from the timed message database to the child from a message output generator, in response to an indication of propinquity of the child to the toy.
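The message controllers of the two preceding paragraphs can be sketched together: messages, optionally time-specific, queue up and are delivered only when the child's propinquity to the toy is indicated; otherwise the toy audibly asks the child by name to come closer. The boolean propinquity flag and the field names are illustrative assumptions.

```python
class MessageController:
    """Message controller delivering queued (optionally timed) messages
    when child propinquity to the toy is indicated."""

    def __init__(self, child_name):
        self.child_name = child_name
        self.queue = []  # (due_time or None, message)

    def accept(self, message, due_time=None):
        # Accept a message (from the message or timed message database).
        self.queue.append((due_time, message))

    def step(self, now, child_near):
        """Return the audible output for this tick: either due messages
        (when the child is near) or a personalized request to come
        into propinquity with the toy."""
        due = [(t, m) for (t, m) in self.queue if t is None or t <= now]
        if not due:
            return []
        if not child_near:
            # Audible annunciator: personalized request to come closer.
            return [f"{self.child_name}, please come here!"]
        # Message output generator: deliver and dequeue the due messages.
        self.queue = [(t, m) for (t, m) in self.queue if (t, m) not in due]
        return [m for (_, m) in due]
```

Untimed messages are delivered at the first indication of propinquity; timed messages wait for both their predetermined time and the propinquity indication.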

There is thus provided in accordance with another preferred embodiment of the present invention, a virtual parenting toy methodology including: playing at least one game with a child together with a verbal-input interactive toy, actuating the verbal-input interactive toy by means of verbal-input interactive toy content so as to present to the child the verbal-input interactive toy content including: at least one verbal prompt to perform at least one task, and at least one verbal offer to play the at least one game with the child once the at least one task is performed, and accepting at least one indication that the at least one task has been performed and in response to the indication, so as to actuate the at least one game by means of a compliance monitor.

There is thus provided in accordance with another preferred embodiment of the present invention, a virtual parenting toy methodology including: activating an interactive toy including: recognizing at least one indication of a child want by means of a child want indication-recognizing functionality, providing an output indication of a child want recognized by the child want indication-recognizing functionality by a child want reporting functionality, and satisfying the child want reported by the child want reporting functionality by means of a child want satisfying functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, a toy methodology system including: activating an interactive toy including, designating at least one time slot during which a child has free time and may participate in a toy interaction, by means of a free time indication functionality, and providing entertainment to the child during the at least one time slot by a free time utilization functionality in response to an output from the free time indication functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, a user-location monitoring toy diary methodology including: storing, for each of a multiplicity of schedule items, a time thereof and coordinates of a location thereof, wherein the multiplicity of locations are represented within a single coordinate system of a schedule database, tracking a current location of the user by a user tracker, and prompting the user to conform to the schedule if the user's current location does not conform to the stored location of a current schedule item by means of a prompter.
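The user-location monitoring diary above lends itself to a compact sketch: each schedule item stores a time window and coordinates within a single coordinate system, a tracker reports the user's current location, and the prompter speaks up when that location does not conform to the current item's stored location. The distance tolerance and naming are illustrative assumptions.

```python
import math


class LocationDiary:
    """Schedule database pairing each item's time with coordinates in
    one shared coordinate system, plus a conformance prompter."""

    def __init__(self, tolerance=5.0):
        self.items = []  # (start, end, (x, y), label)
        self.tolerance = tolerance

    def add(self, start, end, xy, label):
        self.items.append((start, end, xy, label))

    def prompt(self, now, current_xy):
        """Return a conformance prompt, or None if the user is where
        the current schedule item expects (or nothing is scheduled)."""
        for start, end, (x, y), label in self.items:
            if start <= now < end:
                dist = math.hypot(current_xy[0] - x, current_xy[1] - y)
                if dist > self.tolerance:
                    return f"It is time for {label} -- you are not there yet."
                return None
        return None
```

In practice `current_xy` would come from the user tracker (for example an indoor positioning sensor), polled periodically against the schedule.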

There is thus provided in accordance with another preferred embodiment of the present invention, a schedule-monitoring toy methodology including: interacting with a user by means of a verbal-input interactive toy, storing the user's schedule in a schedule database, and actuating the verbal-input interactive toy so as to present to the user at least one verbal scheduling prompt which includes content which is not related to scheduling by a scheduler.

There is thus provided in accordance with another preferred embodiment of the present invention, a computerized guide methodology for a blind user, the methodology including: activating a portable interactive computerized device including: receiving a route input from a blind user for selecting a user route from a route definition functionality, detecting at least one hazard along the user route by a hazard detection functionality, and providing the user with an audio warning regarding presence of the hazard in response to an output from the hazard detection functionality by means of an audio warning functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, a parental surrogate toy methodology including: activating a toy, receiving a child behavior report by means of a child behavior report receiver, and controlling the child by a toy controller including: allowing a parent to define at least one parameter of child behavior which is of interest by a behavioral report configuration definer, and monitoring the parameter of child behavior by a child monitor so as to provide a report relating to the at least one parameter to the child behavior report receiver.

There is thus provided in accordance with another preferred embodiment of the present invention, a web browsing methodology including: connecting an interactive toy to the Internet, and web-browsing by means of a user interface on the interactive toy.

There is thus provided in accordance with another preferred embodiment of the present invention, a knowledge management methodology including: connecting an interactive toy to the Internet, and managing information by an information management functionality on the interactive toy.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive persona methodology including: activating a three-dimensional artificial person including a computer, and employing the computer and the three-dimensional artificial person having a pattern of behavior associated with a defined persona by means of a voice responsive interactive functionality, and, interacting with a user in a manner which mimics behavior characteristic of the persona.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an inter-toy communication methodology including: interacting with a plurality of users by means of at least one interactive toy, and wherein the interacting with a plurality of users by means of at least one interactive toy is affected by the interaction of the at least one interactive toy with another one of the plurality of users.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an inter-toy communication methodology including: interacting with a plurality of users by at least one interactive toy, and wherein the interacting of the at least one interactive toy with at least two of the plurality of users is dependent on the toy's knowledge of which user it is interacting with and characteristics of the user known to the toy.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an inter-toy communication methodology including: interacting with at least one user by a plurality of interactive toys, wherein the interacting of one of the plurality of interactive toys with the at least one user is affected by the interaction of another of the plurality of toys with the at least one user.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a computer network, a multi-toy communication methodology including: communicating of at least one first interactive toy with the computer network, and, communicating of at least one second interactive toy with the computer network via the at least one first interactive toy.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a computer network, a multi-toy location methodology including: sensing at least predetermined propinquity between at least two of the plurality of interactive toys by means of a location functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment, including a plurality of interactive toys, at least one of which being normally in interactive communication via a computer with a computer network, the computer including a toy communication functionality, a toy communication methodology including: enabling the computer to recognize the identity of a toy which is not normally in interactive communication therewith, when the toy comes into communication propinquity therewith by means of a toy recognition functionality, and establishing interactive communication with a communication establishing functionality following recognition of the identity of a toy which is not normally in interactive communication therewith, when the toy comes into communication propinquity therewith.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a multi-toy coordinated activity methodology including: interacting of a plurality of interactive toys via a computer network, and causing the plurality of interactive toys to coordinate their actions in a coordinated activity by means of a coordinated activity functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of the plurality of toys and at least another of the plurality of toys, the methodology including: interacting of the plurality of interactive toys via a computer network, causing at least some of the plurality of interactive toys to communicate with each other at least partially not in real time by means of a communications functionality operative at least partially via the computer network.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology including: interacting of the plurality of interactive toys via a computer network, causing at least one of the plurality of interactive toys to communicate with at least one user via a telephone link by means of a communications functionality operative at least partially via the computer network.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology including: interacting of the plurality of interactive toys via a computer network, causing at least one of the plurality of interactive toys to communicate with at least one user via at least another one of the interactive toys by means of a communications functionality, wherein the functionality is operative at least partially via the computer network.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology including: interacting of a plurality of interactive toys via a computer network, causing at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys by means of a communications functionality, wherein the functionality is operative at least partially via the computer network.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology including: interacting of a plurality of interactive toys via a computer network, causing at least one of the plurality of interactive toys to communicate with at least one user via at least another one of the interactive toys by means of a communications functionality, wherein the functionality is operative at least partially via the computer network.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a computer network, an integrated toy-game methodology including: playing of a game by a user, and participating in playing the game of at least one interactive toy containing game-specific functionality.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, an interpersonal interaction communication methodology providing communication between multiple users via multiple toys, the methodology including: interacting of a plurality of interactive toys via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith, and causing at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys by means of a communications functionality, wherein the functionality is operative at least partially via the computer network, and, the content being operative to produce interaction between respective users thereof.

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a toy cloning methodology including: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user, transferring at least a portion of the interactive toy personality to at least one clone.
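The toy cloning methodology above has two steps: developing a personality from accumulated interactions, and transferring at least a portion of it to a clone. A minimal sketch follows; the `ToyPersonality` class, its trait dictionary, and the `clone_personality` helper are hypothetical illustrations and not part of the claimed embodiment.

```python
import copy

class ToyPersonality:
    """Hypothetical container for traits accumulated from interactions."""
    def __init__(self):
        self.traits = {}  # e.g. {"favorite_game": "riddles"}
        self.interaction_count = 0

    def learn_from_interaction(self, key, value):
        # Develop the personality incrementally from interactions
        # with a user or with another interactive toy.
        self.traits[key] = value
        self.interaction_count += 1

def clone_personality(source, trait_keys=None):
    """Transfer all, or a selected portion, of a personality to a clone."""
    clone = ToyPersonality()
    keys = trait_keys if trait_keys is not None else source.traits.keys()
    clone.traits = {k: copy.deepcopy(source.traits[k]) for k in keys}
    return clone
```

Passing a subset of trait keys models transferring only "at least a portion" of the developed personality to the clone.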

There is thus provided in accordance with another preferred embodiment of the present invention, in an interactive toy environment including a plurality of interactive toys interconnected via a network, a toy personality functionality including: developing a plurality of interactive toy personalities based on interactions between an interactive toy and at least one of another interactive toy and a user, and endowing at least one interactive toy with at least two of the plurality of interactive toy personalities and with a mechanism for exhibiting at least one selectable personality at a given time.
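The selectable-personality mechanism above (a toy endowed with at least two personalities, exhibiting one at a time) can be sketched as follows. The `MultiPersonalityToy` class and all of its method names are hypothetical illustrations, not the claimed embodiment.

```python
class MultiPersonalityToy:
    """Hypothetical toy holding several personalities, one active at a time."""
    def __init__(self):
        self.personalities = {}  # personality name -> trait dict
        self.active = None

    def endow(self, name, traits):
        # Endow the toy with an additional interactive toy personality.
        self.personalities[name] = traits

    def select(self, name):
        # Mechanism for exhibiting one selectable personality at a given time.
        if name not in self.personalities:
            raise KeyError(f"unknown personality: {name}")
        self.active = name

    def respond(self, prompt):
        # Exhibited behavior depends on the currently selected personality.
        traits = self.personalities[self.active]
        return traits.get(prompt, "...")
```

Switching the selection changes which personality the same physical toy exhibits in response to identical prompts.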

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive toy methodology including: activating at least one interactive toy, providing an output to a user which assists the user in user functioning by means of an interactive toy functionality at least partially resident at the at least one interactive toy.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive toy methodology including: activating at least one interactive toy, and providing a teaching output to a user which assists the user in learning by an interactive toy functionality at least partially resident at the at least one interactive toy.

There is thus provided in accordance with another preferred embodiment of the present invention, an interactive toy methodology including: activating at least one interactive toy, providing an output to a user which is dependent on pre-acquired knowledge of at least one characteristic of the user obtained by the at least one interactive toy, and wherein the output is provided by an interactive toy functionality at least partially resident at the at least one interactive toy, and driving the output in respect of a user-specific event.
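The embodiment above combines pre-acquired knowledge of a user characteristic with a user-specific event that drives the output. A minimal sketch, assuming a hypothetical profile dictionary and an `output_for_event` helper (both illustrative names, not part of the claimed embodiment):

```python
import datetime

def output_for_event(user_profile, today):
    """Drive an output in respect of a user-specific event, using
    pre-acquired knowledge of the user held in a profile dict."""
    # The (month, day) birthday is the pre-acquired user characteristic;
    # the calendar date reaching it is the user-specific event.
    if user_profile.get("birthday") == (today.month, today.day):
        return f"Happy birthday, {user_profile['name']}!"
    return None
```

On any other date the function returns nothing, so the output is produced only in respect of the user-specific event.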

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:

Fig. 1 is a simplified partly-pictorial partly-schematic illustration of an interactive toy system providing methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention;

Fig. 2A is a simplified table in the context of Fig. 1 showing a database record comprising information obtained from a user;

Fig. 2B is a simplified table in the context of Fig. 1 showing a database record utilized by a toy system in obtaining information from users;

Fig. 2C is a simplified table in the context of Fig. 1 showing a database of information obtained by a toy system from users world wide;

Fig. 2D is a simplified table in the context of Fig. 1 showing a database record utilized by a toy system in obtaining information from users;

Fig. 3 is a simplified flowchart of the information obtaining functionality of Fig. 1;

Fig. 4 is a simplified flowchart of the information utilizing functionality of Fig. 1;

Fig. 5 is a simplified schematic illustration in the context of Fig. 1, showing a screen display of a permission obtaining procedure;

Fig. 6 is a simplified partly-pictorial partly-schematic illustration of methodology for obtaining information about purchasing and utilizing the information in marketing, in accordance with a preferred embodiment of the present invention;

Fig. 7A is a simplified table, in the context of Fig. 6, showing a purchase report message;

Fig. 7B is a simplified table, in the context of Fig. 6, showing a world wide purchase database record;

Fig. 8 is a simplified flowchart of the information obtaining and utilizing methodology of Fig. 6;

Fig. 9 is a simplified pictorial illustration of a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention;

Fig. 10A is a simplified flowchart in the context of Fig. 9, showing an information obtaining functionality;

Fig. 10B is a simplified flowchart, in the context of Fig. 9, showing another information obtaining functionality;

Fig. 11 is a simplified flowchart, in the context of Fig. 9, showing information utilization functionality;

Fig. 12 is a simplified flowchart showing an information-utilization functionality in the context of Fig. 9;

Fig. 13 is a simplified flowchart showing the effectiveness measurement functionality of Fig. 12;

Fig. 14 is a simplified pictorial illustration of a methodology for obtaining information and utilizing the information in classifying users, in accordance with a preferred embodiment of the present invention;

Fig. 15 is a simplified flowchart in the context of Fig. 14 showing an information- obtaining functionality;

Fig. 16 is a simplified table in the context of Fig. 15, showing a database used for the purpose of deriving a user profile;

Fig. 17 is a simplified flowchart showing a profile-deriving functionality in the context of Figs. 15 and 16;

Fig. 18 is a simplified pictorial illustration of a methodology for prompting the user to say certain words, in accordance with a preferred embodiment of the present invention;

Fig. 19 is a simplified flowchart of the speech recognition criteria-updating functionality of Fig. 18;

Fig. 20 is a simplified flowchart showing a methodology for employing speech recognition in obtaining information from disparate cultural groups, in the context of Fig. 18;

Fig. 21 is a simplified flowchart in the context of Fig. 20 of a preferred method for utilizing information obtained from disparate cultural groups in updating criteria employed in speech recognition;

Fig. 22 is a simplified table in the context of Figs. 20 and 21, showing a database of word pronunciation models derived from information obtained from users from disparate cultural groups;

Fig. 23 is a simplified table in the context of Figs. 20 and 21 showing a database used in a methodology of utilizing information obtained regarding word pronunciation variations of cultural groups for the purpose of updating criteria employed in speech recognition;

Fig. 24 is a simplified flowchart in the context of Fig. 23 showing a methodology for comparing the efficacy of two word pronunciation models in relation to a user;

Fig. 25 is a simplified flowchart of a speech recognition criteria-updating functionality of Fig. 24;

Fig. 26 is a simplified partly-pictorial partly-schematic illustration of a methodology of obtaining and utilizing information as a diagnostic tool for evaluating the performance of a computer in accordance with a preferred embodiment of the present invention;

Fig. 27A is a simplified table in the context of Fig. 26 showing a game request message;

Fig. 27B is a simplified table in the context of Fig. 26 showing a database record of accumulated game requests;

Fig. 28A is a simplified flowchart of the information obtaining functionality of Fig. 26;

Fig. 28B is a simplified flowchart of the information utilization functionality of Fig. 26;

Fig. 29 is a simplified partly-pictorial partly-schematic illustration of a methodology of obtaining and utilizing information as a diagnostic tool for evaluating the performance of an interactive toy, in accordance with a preferred embodiment of the present invention;

Fig. 30A is a simplified table in the context of Fig. 29 showing a sensor signal message;

Fig. 30B is a simplified table in the context of Fig. 29 showing a database record;

Fig. 30C is a simplified table in the context of Fig. 29 showing a database record;

Fig. 30D is a simplified table in the context of Fig. 29 showing a database record;

Fig. 31A is a simplified flowchart of the information utilization functionality of Fig. 29;

Fig. 31B is a simplified flowchart of another information utilization functionality of Fig. 29;

Fig. 32 is a simplified partly-pictorial partly-schematic illustration of a methodology for obtaining and utilizing information as a diagnostic tool for evaluating the performance of a user over time in accordance with a preferred embodiment of the present invention;

Fig. 33 A is a simplified table in the context of Fig. 32 showing a report message;

Fig. 33B is a simplified table in the context of Fig. 32 showing an individual record of a database comprising information regarding the performance of a user;

Fig. 34 is a simplified flowchart of the information obtaining functionality of Fig. 32;

Fig. 35 is a simplified partly-pictorial partly-schematic illustration of methodology for obtaining and utilizing information as a diagnostic tool for evaluating content employed by an interactive toy in accordance with a preferred embodiment of the present invention;

Fig. 36A is a simplified table in the context of Fig. 35 showing a data report message;

Fig. 36B is a simplified table in the context of Fig. 35 showing a database record;

Fig. 37 is a simplified flowchart of the information obtaining and utilizing functionality of Fig. 35;

Fig. 38 is a simplified pictorial illustration of a methodology for obtaining information and utilizing the information for the purpose of evaluating teaching methods and/or educational methodologies, in accordance with a preferred embodiment of the present invention;

Fig. 39 is a simplified flowchart in the context of Fig. 38 showing functionality for evaluating teaching methods as well as functionality for evaluating educational methodologies;

Fig. 40 is a simplified table of a database in the context of Fig. 39 showing a typical outcome of an evaluation procedure of Fig. 39;

Fig. 41 is a simplified pictorial illustration of a methodology for obtaining information that may be used for the purpose of game design in accordance with a preferred embodiment of the present invention;

Fig. 42 is a simplified flowchart of the information obtaining functionality of Fig. 41;

Fig. 43 is a simplified table of a database record showing riddle rating functionality of Fig. 42;

Fig. 44 is a simplified pictorial illustration of a schedule monitoring toy system comprising a personal information item learning and a scheduling prompt presentation functionality in accordance with a preferred embodiment of the present invention;

Figs. 45A and 45B are simplified flowcharts respectively illustrating the learning functionality and the presentation functionality of Fig. 44;

Fig. 46 is a simplified pictorial illustration of a schedule monitoring toy system comprising a parental input reception, a schedule item presentation and anthropomorphic response functionality in accordance with a preferred embodiment of the present invention;

Fig. 47 is a simplified flowchart of the parental input reception functionality of Fig. 46;

Fig. 48 is a simplified flowchart of the schedule item presentation and the anthropomorphic response functionality of Fig. 46;

Fig. 49 is a simplified pictorial illustration of a schedule monitoring toy system comprising child locating functionality and verbal prompt delivery functionality in accordance with a preferred embodiment of the present invention;

Fig. 50 is a simplified flowchart of the child locating and verbal prompt delivery functionality of Fig. 49;

Fig. 51 is a simplified flowchart of a schedule monitoring toy system comprising authorized free-time activity prompting functionality and a schedule functionality, in accordance with a preferred embodiment of the present invention;

Figs. 52A and 52B are simplified tables respectively illustrating a typical schedule record of a user and a simplified free-time database, in accordance with a preferred embodiment of the present invention;

Fig. 53 is a simplified pictorial illustration showing a toy which follows a user, in accordance with a preferred embodiment of the present invention;

Fig. 54 is a simplified diagram of the detection and navigation unit of Fig. 53;

Fig. 55 is a simplified diagram illustrating the detection and navigation functionality of the detection and navigation unit of Fig. 54;

Fig. 56 is a simplified flowchart illustrating the navigation functionality of the detection and navigation unit of Fig. 54;

Fig. 57 is a simplified pictorial illustration of a networked diary toy system comprising networked diary data storage functionality in accordance with a preferred embodiment of the present invention;

Fig. 58 is a simplified flowchart of the network interface connection functionality of Fig. 57;

Fig. 59A is a simplified partly pictorial partly schematic illustration of a speech responsive networked diary toy system comprising a speech recognition unit residing in a networked computer and a diary item actuated in response to a user utterance, in accordance with a preferred embodiment of the present invention;

Fig. 59B is a simplified block diagram illustration of the speech recognition and response generation of Fig. 59A;

Fig. 59C is a simplified flowchart illustrating the response actuation functionality of Fig. 59A;

Fig. 60 is a simplified pictorial illustration of a supervised networked organizer system in accordance with a preferred embodiment of the present invention;

Fig. 61 is a simplified flowchart of the organizing and supervision functionality of Fig. 60;

Fig. 62 is a pictorial illustration of a child-messaging toy system comprising propinquity indicating functionality and an annunciator requesting a user to come into propinquity with the toy, in accordance with a preferred embodiment of the present invention;

Fig. 63 is a simplified flowchart of the message delivery functionality and the failure reporting functionality of Fig. 62;

Fig. 64 is a simplified flowchart of the propinquity indication functionality of Figs. 62 and 63;

Fig. 65 is a simplified pictorial illustration of a virtual parenting toy system, in accordance with a preferred embodiment of the present invention;

Fig. 66 is a simplified flowchart of the prompting and compliance monitoring functionality of Fig. 65;

Fig. 67 is a simplified pictorial illustration of a virtual parenting toy system in accordance with a preferred embodiment of the present invention;

Fig. 68 is a simplified flowchart of the want indication recognizing, want satisfying, authorization request and preference eliciting functionality of Fig. 67;

Fig. 69 is a simplified block diagram illustration of the want indication-recognizing functionality of Figs. 67 and 68;

Fig. 70 is a simplified pictorial illustration of a toy system comprising free time indication functionality and free time utilization functionality, in accordance with a preferred embodiment of the present invention;

Fig. 71 is a table illustrating the free time indication functionality of Fig. 70;

Fig. 72 is a simplified flowchart of the entertainment providing in user's free-time functionality of Fig. 70;

Fig. 73 is a simplified flowchart of the entertainment providing in user's free-time functionality of Fig. 70;

Fig. 74 is a simplified partly pictorial partly schematic illustration of a user-location monitoring toy diary comprising time and coordinates data storage functionality, location tracking functionality and prompter functionality in accordance with a preferred embodiment of the present invention;

Fig. 75A is a simplified table of a typical database record of a schedule database of Fig. 74;

Fig. 75B is a simplified table of a typical database record of the traffic database 7012 of Fig. 74;

Fig. 76 is a simplified flowchart of the data storage, location tracking and prompter functionality of Fig. 74;

Fig. 77 is a simplified pictorial illustration of a schedule monitoring toy system comprising a scheduling prompt which includes content which is not related to scheduling, in accordance with a preferred embodiment of the present invention;

Fig. 78 is a simplified flowchart of the content selection functionality of Fig. 77;

Fig. 79 is a simplified flowchart of the traumatic schedule item detecting functionality of Fig. 78;

Fig. 80 is a simplified flowchart illustrating a procedure of assigning traumatic weight to word groups obtained in a method described in Fig. 79;

Fig. 81 is a table illustrating an exemplary database obtained in methods described in Figs. 79 and 80;

Fig. 82 is a simplified flowchart of the traumatic schedule item detecting functionality and of the prompt type selection functionality of Figs. 77 and 78;

Fig. 83 is a table illustrating a typical database utilized in order to select content as described in Fig. 82;

Fig. 84 is a simplified partly pictorial partly schematic illustration of a computerized guide system for a blind user comprising a networked portable interactive device in accordance with a preferred embodiment of the present invention;

Fig. 85 A is a simplified table in the context of Fig. 84 showing a destination database;

Fig. 85B is a simplified table in the context of Fig. 84 showing a route-definition database;

Fig. 85C is a simplified table in the context of Fig. 84 showing a multiple user guiding database;

Fig. 86 is a simplified flowchart in the context of Fig. 84 showing route definition functionality of a computerized guide system for a blind user;

Fig. 87 is a simplified flowchart in the context of Fig. 84 showing the audio warning functionality and the hazard information providing functionality of the system of Fig. 84;

Fig. 88 is a simplified flowchart of a parental surrogate toy comprising a child behavior monitor operative to monitor a selected parameter of child behavior, in accordance with a preferred embodiment of the present invention;

Fig. 89 is a simplified pictorial illustration of a game comprising toy web browsing functionality in accordance with a preferred embodiment of the present invention;

Fig. 90 is a simplified flowchart of the web browsing functionality of Fig. 89;

Fig. 91 is a simplified pictorial illustration of a web browsing system wherein browsing produces content which is subsequently employed by a toy in accordance with a preferred embodiment of the present invention;

Fig. 92 is a simplified flowchart of content generation functionality of Fig. 91;

Fig. 93 is a simplified table in the context of Fig. 91 showing a database utilized in storing content generated via the web;

Fig. 94 is a simplified pictorial illustration of a web browsing system comprising a toy and providing an interrogation functionality for obtaining information from other interactive toys in accordance with a preferred embodiment of the present invention;

Fig. 95 is a simplified flowchart of the interrogation functionality of Fig. 94;

Fig. 96 is a simplified pictorial illustration of a web browsing system that employs as an input a user characteristic ascertained from an interaction between user and toy in accordance with a preferred embodiment of the present invention;

Fig. 97 is a simplified flowchart of the web browsing functionality of Fig. 96;

Fig. 98 is a simplified table in the context of Fig. 97 showing a database record utilized in matching an activity to a user;

Fig. 99 is a simplified flowchart of a web browsing system providing employment agency functionality in accordance with a preferred embodiment of the present invention;

Fig. 100 is a simplified pictorial illustration of a knowledge management system comprising information retrieval functionality, information synthesis functionality and information filtering functionality in accordance with a preferred embodiment of the present invention;

Fig. 101 is a simplified flowchart illustration of the information retrieval, filtering and synthesis functionality of Fig. 100;

Fig. 102 is a simplified pictorial illustration of a knowledge management system comprising a phone dialer in accordance with a preferred embodiment of the present invention;

Fig. 103 is a simplified flowchart illustration of telephone inquiry functionality of Fig. 102;

Fig. 104 is a simplified flowchart illustration of the telephone dialer functionality of Fig. 102;

Fig. 105 is a simplified block diagram illustration of a matching functionality employing user profile information collected by an interactive toy in accordance with a preferred embodiment of the present invention;

Fig. 106 is a simplified flowchart illustration of the matching functionality of Fig. 105;

Fig. 107 is a simplified table in the context of Fig. 106 showing a database record utilized by a toy in user profiling;

Fig. 108 is a simplified flowchart illustration of user status determination functionality and help functionality provided by a knowledge management system comprising an interactive toy in accordance with a preferred embodiment of the present invention;

Fig. 109 is a simplified block diagram illustration of the symptom and irregular behavior detection functionality of Fig. 108;

Fig. 110 is a simplified table in the context of Fig. 108 showing a database record utilized by a computer in order to detect symptoms of possible illness or emotional distress;

Fig. 111 is a simplified pictorial illustration of a knowledge management system comprising a matching functionality which is operative to match potential donors and potential charity recipients in accordance with a preferred embodiment of the present invention;

Fig. 112 is a simplified flowchart in the context of Fig. 111 showing possession reporting functionality;

Fig. 113 is a simplified table in the context of Fig. 112 showing a database utilized by a knowledge management system in matching potential donors with potential charity recipients;

Fig. 114 is a simplified flowchart illustrating the matching functionality performed by the knowledge management system illustrated in Fig. 111;

Fig. 115 is a simplified partly pictorial partly schematic illustration of an interactive persona system comprising a three-dimensional artificial person in accordance with a preferred embodiment of the present invention;

Fig. 116 is a simplified partly pictorial partly schematic illustration of the three-dimensional artificial person of Fig. 115;

Fig. 117 is a simplified flowchart illustration of the interaction functionality of the three-dimensional artificial person of Figs. 115 and 116;

Fig. 118 is a simplified flowchart illustration of another interaction functionality of the three-dimensional artificial person of Figs. 115 and 116;

Figs. 119A and 119B are simplified flowchart illustrations in the context of Figs. 115, 116, 117 and 118;

Fig. 120 is a simplified partly pictorial partly schematic illustration of an interactive persona system comprising a toy having a persona of a known non-teacher and a pattern of behavior of a teacher in accordance with a preferred embodiment of the present invention;

Fig. 121 is a simplified flowchart illustration in the context of Fig. 120 showing a teaching functionality provided by a toy having a persona of a famous non-teacher;

Fig. 122 is a simplified pictorial illustration of an interactive persona system comprising a toy having a persona of a coach in accordance with a preferred embodiment of the present invention;

Fig. 123 is a simplified flowchart illustration in the context of Fig. 122 showing coaching functionality of an interactive persona system comprising a toy having a persona of a famous coach;

Fig. 124 is a simplified schematic illustration in the context of Fig. 122 showing a locomotive toy having a persona of a coach in accordance with a preferred embodiment of the present invention;

Fig. 125 is a simplified partly pictorial partly schematic illustration of a three-dimensional artificial guide, a central computer and the flow of information therebetween in accordance with a preferred embodiment of the present invention;

Fig. 126 is a simplified flowchart describing the functionality of the interactive persona system of Fig. 125;

Fig. 127A is a block diagram illustration of another functionality of the interactive persona system of Fig. 125;

Fig. 127B is a flowchart illustration in the context of Fig. 125 showing the functionality of Fig. 127A;

Fig. 128 is a simplified pictorial illustration of an interactive toy having a persona of a comedian in accordance with a preferred embodiment of the present invention;

Fig. 129 is a simplified flowchart illustration in the context of Fig. 128;

Fig. 130 is a simplified table in the context of Fig. 129 showing a database record utilized in joke selection;

Fig. 131 is a simplified pictorial illustration of a plurality of toys having persona providing content to a user in accordance with a preferred embodiment of the present invention;

Fig. 132 is a simplified flowchart illustration of the content providing functionality of Fig. 131;

Fig. 133 is a simplified table in the context of Fig. 132 showing a database record utilized in content selection for a plurality of toys having persona;

Fig. 134 is a simplified partly pictorial partly schematic illustration of an inter-toy communication system comprising an interactive toy operative for interaction with a plurality of users in accordance with a preferred embodiment of the present invention;

Fig. 135 is a simplified table in the context of Fig. 134, showing a database record 10016 of user interaction;

Fig. 136 is a simplified flowchart illustration of the communication functionality of Fig. 134;

Fig. 137 is a simplified partly pictorial partly schematic illustration of an inter-toy communication system comprising an interactive toy operative for interaction with a plurality of users in accordance with a preferred embodiment of the present invention;

Fig. 138 is a simplified flowchart illustration of the communication functionality of Fig. 137;

Fig. 139 is a simplified partly pictorial partly schematic illustration of an inter-toy communication system comprising a plurality of interactive toys operative for interaction with at least one user in accordance with a preferred embodiment of the present invention;

Figs. 140A and 140B, taken together, are a flowchart illustration of the communication functionality of Fig. 139;

Fig. 141 is a simplified partly pictorial partly schematic illustration of a multi-toy communication system in accordance with a preferred embodiment of the present invention;

Fig. 142 is a simplified flowchart illustration of the communication functionality of Fig. 141;

Fig. 143 is a simplified pictorial illustration of an interactive toy system comprising propinquity sensing and toy voice recognition functionality in accordance with a preferred embodiment of the present invention;

Fig. 144 is a simplified flowchart illustration of propinquity sensing and toy voice recognition functionality of Fig. 143;

Fig. 145 is a simplified pictorial illustration of communication establishing functionality of a computer and a toy which is not normally in communication therewith in accordance with a preferred embodiment of the present invention;

Fig. 146 is a simplified block diagram illustration of communication functionality of Fig. 145;

Fig. 147 is a simplified flowchart illustration of identification and communication establishing functionality of Fig. 146;

Fig. 148 is a simplified table in the context of Fig. 147 showing a database record that enables a user to authorize a computer to communicate with a visiting toy;

Fig. 149 is a simplified pictorial illustration of a multi-toy coordinated activity system in accordance with a preferred embodiment of the present invention;

Fig. 150 is a simplified flowchart of the coordinated activity functionality of Fig. 149;

Fig. 151 is a simplified flowchart of the activity coordination functionality of Fig. 150;

Fig. 152 is a simplified pictorial illustration of a multi-toy coordinated activity system comprising coordinated activity over disparate locations and toy communication which is not in real time in accordance with a preferred embodiment of the present invention;

Fig. 153 is a simplified flowchart of the coordination and communication functionality of Fig. 152;

Fig. 154 is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with a preferred embodiment of the present invention;

Fig. 155 is a simplified flowchart of the communications functionality of Fig. 154;

Fig. 156 is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with a preferred embodiment of the present invention;

Fig. 157 is a simplified flowchart of the communications functionality of Fig. 156;

Fig. 158 is a simplified flowchart in the context of Fig. 156 showing another communications functionality of the communication system of Fig. 156;

Fig. 159 is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with a preferred embodiment of the present invention;

Fig. 160 is a simplified flowchart of the communications functionality of Fig. 159;

Fig. 161 is a simplified pictorial illustration of user and toy telephone communication functionality in accordance with a preferred embodiment of the present invention;

Fig. 162 is a simplified flowchart of the communication functionality of Fig. 161;

Fig. 163 is a simplified flowchart in the context of Fig. 161 showing another communication functionality of the communication system of Fig. 161;

Fig. 164 is a simplified partly pictorial partly schematic illustration of an interactive toy communication system providing communications functionality, which enables a user to communicate with an interactive toy and another user via a telephone link in accordance with a preferred embodiment of the present invention;

Figs. 165A and 165B taken together are a simplified flowchart of the communications functionality of Fig. 164;

Fig. 166 is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one user in accordance with a preferred embodiment of the present invention;

Fig. 167 is a simplified flowchart of the communications functionality of Fig. 166;

Fig. 168 is a simplified pictorial illustration of speech and motion communication functionality in accordance with a preferred embodiment of the present invention;

Fig. 169 is a simplified flowchart of the motion and speech communication functionality of Fig. 168;

Fig. 170 is a simplified pictorial illustration of a toy-game functionality wherein a toy participates in a game as a player in accordance with a preferred embodiment of the present invention;

Fig. 171 is a flowchart of the gaming functionality of Fig. 170;

Fig. 172 is a simplified flowchart showing response to sensed user characteristic functionality of Fig. 170;

Fig. 173 is a simplified block diagram illustration of the emotional state sensing functionality of Fig. 172;

Fig. 174 is a simplified table in the context of Fig. 172;

Fig. 175 is a simplified pictorial illustration of a toy-game functionality wherein a toy assists a user in playing a game in accordance with a preferred embodiment of the present invention;

Fig. 176 is a simplified pictorial illustration in the context of Fig. 175 showing a voice interaction functionality in a game;

Fig. 177 is a simplified pictorial illustration of a toy-game functionality wherein a toy is employed as a user interface to a game in accordance with a preferred embodiment of the present invention;

Fig. 178 is a simplified flowchart of the toy-interface functionality of Fig. 177;

Fig. 179 is another simplified flowchart of the toy interface functionality of Fig. 177;

Fig. 180 is a simplified pictorial illustration in the context of Fig. 177;

Fig. 181 is a simplified pictorial illustration in the context of Fig. 177 showing a multi-user game played over a network;

Fig. 182 is a simplified pictorial illustration in the context of Fig. 181 showing toy-mediation functionality in a multi-user game in accordance with a preferred embodiment of the present invention;

Fig. 183 is a simplified pictorial illustration of an interactive toy system comprising an interpersonal interaction communication system operative to produce conversations between users in accordance with a preferred embodiment of the present invention;

Fig. 184 is a simplified flowchart of the conversation producing functionality of Fig. 183;

Fig. 185A is a simplified table in the context of Fig. 184, showing a database record utilized in detecting compatibility between users;

Fig. 185B is a simplified table in the context of Fig. 184 showing a database record utilized in selecting conversation stimulating content for toys;

Fig. 186 is a simplified pictorial illustration of an interpersonal interaction communication system wherein toys have a persona which shares personal characteristics with an identifiable person in accordance with a preferred embodiment of the present invention;

Fig. 187 is a simplified flowchart of the conversation functionality of Fig. 186;

Fig. 188 is a simplified table in the context of Fig. 186 showing a database record utilized in selection of content according to toy persona and to a characteristic of a user;

Fig. 189 is a simplified partly pictorial partly diagrammatic illustration of an interpersonal interaction communication system operative to produce a personal meeting between users in accordance with a preferred embodiment of the present invention;

Fig. 190 is a simplified flowchart of the meeting producing functionality of Fig. 189;

Fig. 191A is a simplified table in the context of Fig. 190 showing a database record utilized in checking compatibility between users;

Fig. 191B is a simplified table in the context of Fig. 191A showing a database record utilized in checking compatibility of users;

Fig. 191C is a simplified table in the context of Fig. 191A showing a database record utilized in profiling users. The database record illustrated is obtained by manipulating a multiplicity of records illustrated in Fig. 191A;

Fig. 192 is a simplified pictorial illustration of an interpersonal interaction communication system operative to produce pseudo-accidental meetings in accordance with a preferred embodiment of the present invention;

Fig. 193 is a simplified flowchart of the meeting producing functionality of Fig. 192;

Fig. 194 is a simplified table in the context of Fig. 193 showing a database record utilized in producing a pseudo-accidental meeting of users;

Fig. 195 is a simplified pictorial illustration of an interpersonal interaction communication system operative to produce pseudo-accidental meetings in accordance with another preferred embodiment of the present invention;

Fig. 196 is a simplified table in the context of Fig. 195 showing a database record utilized in producing a pseudo-accidental meeting;

Fig. 197 is a simplified pictorial illustration of an interpersonal interaction communication system operative to produce a meeting between users in the context of a game in accordance with yet another preferred embodiment of the present invention;

Fig. 198 is a simplified flowchart of the gaming functionality of Fig. 197;

Fig. 199 is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention;

Fig. 200 is a simplified block diagram illustration in the context of Fig. 199 showing databases involved in the toy personality storage functionality of Fig. 199 and flow of information involved in the toy personality development functionality of Fig. 199;

Fig. 201 A is a simplified table of a database record of a database of Fig. 200;

Fig. 201B is a simplified table in the context of Fig. 200;

Fig. 202 is a flowchart of the toy personality development functionality of Fig. 199;

Fig. 203 is a flowchart illustration of the toy personality transferring functionality of Fig. 199;

Fig. 204 is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention;

Fig. 205 is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention;

Fig. 206 is a simplified table of the life history database of Figs. 204 and 205;

Fig. 207A is a simplified flowchart of the personality development functionality of Fig. 204;

Fig. 207B is a simplified flowchart of the personality development functionality of Fig. 205;

Fig. 208 is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention;

Fig. 209 is a simplified flowchart of the personality transferring functionality of Fig. 208;

Figs. 210A and 210B are simplified partly pictorial partly schematic illustrations of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention;

Figs. 211A and 211B are block diagrams respectively illustrating the toy personality storage and development functionality of Figs. 210A and 210B;

Fig. 211C is a simplified block diagram illustration in the context of Figs. 210A, 210B, 211A and 211B showing personality development and personality transferring functionality in accordance with another preferred embodiment of the present invention;

Fig. 212 is a simplified flowchart of the toy personality transferring functionality of Fig. 211C;

Fig. 213 is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention;

Fig. 214 is a simplified flowchart of personality transferring functionality of Fig. 213;

Fig. 215 is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing toy personality functionality in accordance with a preferred embodiment of the present invention;

Fig. 216 is a simplified flowchart of the selectable personality exhibiting mechanism functionality of Fig. 215;

Fig. 217 is a simplified partly pictorial partly schematic illustration of an interactive toy system comprising user assistance functionality including tooth brushing functionality in accordance with a preferred embodiment of the present invention;

Fig. 218 is a simplified flowchart of the user assistance functionality of Fig. 217;

Fig. 219 is a simplified partly pictorial partly schematic illustration of an interactive toy system comprising user assistance functionality including a guide functionality in accordance with a preferred embodiment of the present invention;

Fig. 220A is a simplified table in the context of Fig. 219 showing a destination database for a particular blind user such as the user of the toy of Fig. 219;

Fig. 220B is a simplified table in the context of Fig. 219 showing a route-definition database;

Fig. 220C is a simplified table in the context of Fig. 219 showing a multiple user guiding database;

Fig. 221 is a simplified flowchart in the context of Fig. 219 showing route definition functionality of a computerized guide system for a blind user;

Fig. 222 is a simplified flowchart in the context of Fig. 219 showing the audio warning functionality and the hazard information providing functionality of the system of Fig. 219;

Fig. 223 is a simplified flowchart of a toy system comprising point to object-name of object functionality;

Fig. 224 is a simplified pictorial illustration of an interactive toy system comprising both verbal and body language teaching functionality for teaching a foreign language in accordance with a preferred embodiment of the present invention;

Fig. 225 is a simplified flowchart of a school environment teaching functionality wherein an interactive toy acts as a fellow student to a user in accordance with a preferred embodiment of the present invention;

Fig. 226 is a simplified flowchart in the context of Fig. 225 showing a home environment teaching functionality of a toy;

Fig. 227 is a simplified flowchart of an interactive toy system comprising teaching functionality actuable by an event in a non-teaching functionality of a toy in accordance with a preferred embodiment of the present invention;

Figs. 228 and 229 are simplified schematic illustrations of screen displays of an interactive toy system providing language teaching functionality in accordance with a preferred embodiment of the present invention;

Figs. 230A-232 are simplified flowcharts of language teaching functionality of an interactive toy system in accordance with preferred embodiments of the present invention;

Figs. 233-236 are simplified partly pictorial partly schematic illustrations of an interactive toy system providing behavior corrective functionality in accordance with a preferred embodiment of the present invention;

Figs. 237 and 238 are simplified flowcharts of toy learning functionality of the system of Figs. 233-236;

Figs. 239-241 are simplified block diagram illustrations of test group formation functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention;

Fig. 242 is a simplified flowchart of information obtaining functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention;

Figs. 243 and 244 are simplified block diagram illustrations of information utilization functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention;

Figs. 245 and 246 are simplified flowcharts of information obtaining functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention;

Fig. 247 is a simplified partly pictorial partly schematic illustration of an interactive toy scheduling system in accordance with a preferred embodiment of the present invention;

Figs. 248A-250 are simplified flowcharts of the scheduling functionality of the system of Fig. 247;

Fig. 251 is a simplified schematic illustration of an interactive toy web browser system in accordance with a preferred embodiment of the present invention;

Figs. 252-255 are simplified flowcharts of the web-browsing functionality of the system of Fig. 251;

Fig. 256 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing teaching functionality in accordance with a preferred embodiment of the present invention;

Fig. 257 is a simplified flowchart of the teaching functionality of the system of Fig. 256;

Fig. 258 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing telephone inquiry functionality in accordance with a preferred embodiment of the present invention;

Figs. 259A-260 are simplified flowcharts of the dialer functionality of the system of Fig. 258;

Fig. 261 is a simplified partly pictorial partly schematic illustration of information retrieval functionality of the system of Fig. 258;

Figs. 262 and 263 are simplified flowcharts of computer equipment upgrade functionality provided by an interactive toy in accordance with a preferred embodiment of the present invention;

Fig. 264 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing musical output functionality in accordance with a preferred embodiment of the present invention;

Fig. 265 is a simplified partly pictorial partly schematic illustration of the noise control functionality of the system of Fig. 264;

Figs. 266-269 are simplified flowcharts of the musical output functionality of the system of Fig. 264;

Fig. 270 is a simplified partly pictorial partly schematic illustration of an interactive persona system comprising a three-dimensional artificial person having a pattern of behavior associated with a physician in accordance with a preferred embodiment of the present invention;

Fig. 271 is a simplified partly pictorial partly schematic illustration of the three dimensional artificial person of the interactive persona system of Fig. 270;

Figs. 272 and 273 are simplified flowcharts of the functionality of the interactive persona system of Fig. 270;

Fig. 274 is a simplified partly pictorial partly schematic illustration of an interactive toy web-browsing system providing employment agency functionality in accordance with a preferred embodiment of the present invention;

Fig. 275 is a simplified flowchart of the employment agency functionality of the interactive toy web-browsing system of Fig. 274;

Figs. 276 and 277 are simplified partly pictorial partly schematic illustrations of an interactive persona system comprising a three-dimensional artificial person having an appearance of a historical figure in accordance with a preferred embodiment of the present invention;

Fig. 278 is a block diagram illustration of various collections of historical figures of the interactive persona system of Figs. 276 and 277;

Fig. 279 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing services to a disabled user in accordance with a preferred embodiment of the present invention;

Fig. 280 is a simplified partly pictorial partly schematic illustration of a walking interactive toy of the system of Fig. 279;

Fig. 281 is a simplified flowchart of the functionality of Fig. 279;

Fig. 282 is a simplified schematic illustration of an interactive toy system providing a toy personality cloning functionality in accordance with a preferred embodiment of the present invention;

Figs. 283-288 are simplified schematic illustrations of the toy personality cloning functionality of Fig. 282;

Fig. 289 is a simplified partly-pictorial partly-schematic illustration of an interactive toy web-browsing system providing communication to potential charity organizations in accordance with a preferred embodiment of the present invention;

Figs. 290 and 291 are simplified flowcharts of the charity communication functionality of the toy web-browsing system of Fig. 289;

Fig. 292 is a simplified partly-schematic partly- block diagram illustration of an interactive persona system comprising an artificial three-dimensional person having a pattern of behavior of a guide;

Figs. 293A-296 are simplified flowcharts of the functionality of the interactive persona system of Fig. 292;

Figs. 297 and 298 are simplified schematic illustrations of an interactive toy system providing toy-game functionality in accordance with a prefened embodiment of the present invention;

Figs. 299A and 299B are a simplified flowchart of the toy-game functionality of Figs. 297 and 298;

Fig. 300 is a simplified schematic illustration of an interactive toy system providing multi-user game functionality;

Fig. 301 is a simplified schematic illustration of an interactive toy system comprising an interactive toy comprising a lenticular display unit in accordance with a preferred embodiment of the present invention;

Fig. 302 is a simplified flowchart of a point of sale functionality of the interactive toy system of Fig. 301;

Figs. 303A-304 are simplified schematic illustrations of an interactive toy system comprising an inter-toy communication system in accordance with a preferred embodiment of the present invention;

Figs. 305-312 are simplified flowcharts of the inter-toy communication functionality of the interactive toy system of Figs. 303 and 304;

Fig. 313 is a simplified table of a database record of an interactive toy system providing community formation functionality in accordance with a preferred embodiment of the present invention;

Figs. 314-317 are simplified flowcharts of community formation functionality provided by an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 318 is a simplified block-diagram illustration of information storage and utilization of an interactive persona system comprising a three-dimensional artificial person having an appearance and a pattern of behavior associated with a comedian in accordance with a preferred embodiment of the present invention;

Figs. 319A-319D are simplified flowcharts of the functionality of an interactive persona system comprising a three-dimensional artificial person having an appearance and a pattern of behavior associated with a comedian in accordance with a preferred embodiment of the present invention;

Figs. 320-326 are simplified flowcharts of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 327 is a simplified diagrammatic illustration of personal meeting production functionality of an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 328A is a simplified flowchart of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 328B is a simplified table of a user database record used in conjunction with the interpersonal interaction communication functionality of Fig. 328A;

Fig. 328C is a simplified flowchart of the interpersonal interaction communication functionality of Fig. 328A;

Figs. 329-332 are simplified flowcharts of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 333 is a simplified table of a user database record of an interactive toy system providing interpersonal interactive communication functionality in accordance with a preferred embodiment of the present invention;

Figs. 334-342 are simplified flowcharts of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention;

Figs. 343A and 343B are simplified schematic illustrations of interactive toy propinquity and relative direction detection functionality of an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 344 is a simplified table of a meeting request database record of an interactive toy system providing interpersonal interactive communication functionality in accordance with a preferred embodiment of the present invention;

Fig. 345 is a simplified schematic illustration of a teaching functionality for an interactive toy system in accordance with a preferred embodiment of the present invention;

Fig. 346 is a simplified schematic illustration of an interactive toy system comprising teaching functionality in accordance with a preferred embodiment of the present invention;

Figs. 347 and 348 are simplified flowcharts of teaching functionality of the interactive toy system of Fig. 346;

Fig. 349 is a simplified flowchart of an interactive toy system comprising virtual classroom teaching functionality in accordance with a preferred embodiment of the present invention;

Fig. 350 is a simplified schematic illustration of an interactive toy system comprising virtual classroom teaching functionality in accordance with a preferred embodiment of the present invention;

Fig. 351 is a simplified schematic illustration of an interactive toy system comprising teaching functionality in accordance with a preferred embodiment of the present invention;

Figs. 352 and 353 are simplified flowcharts of the teaching functionality of the interactive toy system of Fig. 351;

Fig. 354 is a simplified flowchart of the functionality of an interactive toy system providing content which assists a user in teaching;

Figs. 355A-360 are simplified flowcharts of the teaching functionality of the interactive toy system of Figs. 350 and 351;

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Improved methods and systems for applications of interactive toys are exemplified by means of the following preferred embodiments.

A methodology for obtaining and utilizing information in an interactive toy environment is described henceforth. Reference is now made to Fig. 1, which is a simplified partly pictorial partly schematic illustration of an interactive toy system comprising a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention. Turning to Fig. 1, it is seen that a plurality of interactive toys including toys 1021, 1022, 1023, 1024 and 1025, located at different sites throughout the world including sites 1001, 1002, 1003, 1004 and 1005, communicate with their users and request that the users provide information via their toys to a suitable interactive toy server 1041. In the illustrated example, information is requested regarding the particular breakfast consumed by the user. This request may be initiated by the toy server 1041, which communicates typically via Internet 1040 with a plurality of computers including computers 1011, 1012, 1013, 1014 and 1015, which in turn respectively provide content input to the toys 1021, 1022, 1023, 1024 and 1025. In the illustrated example the computers 1011, 1012, 1013, 1014 and 1015 communicate with the toys 1021, 1022, 1023, 1024 and 1025 respectively by means of a wireless bi-directional RF link between the computers and the toys. The computers may be located in proximity to their respective toys, for example, in the same home or, alternatively, the computer may be located at a distant location and may communicate with the toy, for example, by means of a public wireless link such as provided by cellular communication systems.

In the illustrated embodiment, the users' response, received via the toys including toys 1021, 1022, 1023, 1024 and 1025, is communicated by the computers including computers 1011, 1012, 1013, 1014 and 1015, to server 1041, which in turn processes the obtained information and provides it, typically via Internet 1040, to research institute 1042. It may therefore be appreciated that the information obtained via the users may be utilized in an application which is not limited to user involvement.

Reference is now made to Fig. 2A, which is a simplified table in the context of Fig. 1 showing a database record 1051 of a list of breakfast items reported by a user such as the user of toy 1023 of Fig. 1. The content of the database record 1051 is typically communicated by computer 1013 to server 1041. As seen in Fig. 2A, database record 1051 also includes user's country indication 1063 and date indication 1064. Reference is now made to Fig. 2B, which is a simplified table in the context of Fig. 1 showing a database list 1052 of known breakfast items. Turning to Fig. 2B it is seen that database list 1052 includes two lists of items: firstly, a list 1071 of basic breakfast items such as coffee, eggs, cereal and the like, which are denoted as "basic item 1", "basic item 2" and "basic item 3" in the illustrated example; and secondly, a list 1072 which provides, for each item on list 1071, a list of corresponding specific items such as espresso, cappuccino and the like for the item "coffee" of list 1071. Specific items for "basic item 1", for example, are denoted as specific items "1,1", "1,2", "1,3" etc. It is appreciated that the division of database list 1052 into lists 1071 and 1072 may be based on specific research and development requirements and that other divisions may be drawn, allowing for the information obtained via the users to be utilized in different ways. In the embodiment illustrated in Fig. 1, database list 1052 may be downloaded to a plurality of computers such as computer 1013 from server 1041. Database list 1052 is preferably updated in the course of information processing such as the process of Fig. 4 described hereinbelow. In such case, a database list such as database list 1052 stored on a personal computer such as computer 1013 is preferably continuously updated with new items from server 1041, typically via Internet 1040.
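
The record structures of Figs. 2A and 2B described above can be sketched, for illustration only, as simple Python mappings. This is not part of the disclosure; all variable names, field names and sample values below are assumptions introduced for the sketch.

```python
# Fig. 2A: a user's breakfast report (database record 1051), as communicated
# by the user's personal computer to the toy server. Field names are
# illustrative assumptions.
report_1051 = {
    "items": ["espresso", "cereal"],  # breakfast items reported by the user
    "country": "Italy",               # user's country indication 1063
    "date": "2001-03-21",             # date indication 1064
}

# Fig. 2B: the known-breakfast-items database (list 1052), divided into
# basic items (list 1071, here the keys) and, for each basic item, a list
# of corresponding specific items (list 1072, here the values).
known_items_1052 = {
    "coffee": ["espresso", "cappuccino"],
    "eggs": ["scrambled eggs", "omelette"],
    "cereal": ["corn flakes", "muesli"],
}

def basic_item_of(item, known_items):
    """Return the basic item to which a reported item corresponds,
    or None if the item is not in the known-items database."""
    if item in known_items:  # the item is itself a basic item
        return item
    for basic, specifics in known_items.items():
        if item in specifics:  # the item is a specific case of a basic item
            return basic
    return None
```

For example, the lookup maps the specific item "espresso" back to the basic item "coffee", mirroring the retrieval described for Fig. 4 hereinbelow.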

Reference is now made to Fig. 2C, which is a simplified table in the context of Fig. 1 showing a world wide breakfast habits database 1053. Turning to Fig. 2C it is seen that for each breakfast item on list 1081 and each user country on list 1082, database 1053 provides the number of instances of the said item in user breakfast item reports of users from the said country. List 1081 of breakfast items may be identical, for example, to basic breakfast item list 1071 of Fig. 2B. In the embodiment illustrated in Fig. 1, database 1053 is typically stored on a suitable server such as server 1041 and is typically updated based on reports arriving from individual users as in the process of Fig. 4 described hereinbelow. It is appreciated that database 1053 may register the number of instances of a particular breakfast item eaten by users from a particular country on a particular day, during any number of days, or the average number per day of such instances calculated over any number of days. Thus, for example, database 1053 may register the average number of cereal servings eaten by American users per day as calculated over the months December, January and February.

Reference is now made to Fig. 2D, which is a simplified table in the context of Fig. 1 showing a database 1054 of breakfast items not included in the list of predetermined items of database list 1052 of Fig. 2B. Turning to Fig. 2D it is seen that for each breakfast item on list 1091, database 1054 provides on list 1092 the total number of reported instances of such items. In the embodiment illustrated in Fig. 1, database 1054 is typically stored on server 1041, and is typically updated based on reports arriving from individual users as in the process of Fig. 4 described hereinbelow.

Reference is now made to Fig. 3, which is a simplified flowchart of the information obtaining functionality of Fig. 1. Server 1041 sends a request for obtaining information to a plurality of computers including computer 1013. Computer 1013 instructs toy 1023 to request that the user provide information regarding items that the user has eaten for breakfast. The user specifies such an item. Computer 1013 adds the item received via toy 1023 to a report list 1051 of items eaten for breakfast and instructs toy 1023 to request that the user provide information regarding another item. If the user provides this information, the process described herein is repeated for the item provided by the user. If the user says that there are no more items to provide, database list 1051 is closed, and is sent to server 1041.
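
The Fig. 3 request loop can be outlined as follows. This is an illustrative sketch only: `ask_user` stands in for the toy's spoken interaction with the user (an assumption, not an API disclosed herein), and a reply of `None` stands in for the user saying there are no more items.

```python
def obtain_breakfast_report(ask_user):
    """Collect breakfast items from the user one at a time, per the
    Fig. 3 loop, until the user indicates there are no more items."""
    items = []
    while True:
        prompt = ("What did you eat for breakfast?" if not items
                  else "Did you eat anything else?")
        answer = ask_user(prompt)  # the toy asks; the reply is one item
        if answer is None:         # no more items: close the report list
            break
        items.append(answer)
    return items                   # report list 1051, sent to server 1041
```

For example, driving the loop with canned answers `["cereal", "milk", None]` yields the closed report list `["cereal", "milk"]`.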

Reference is now made to Fig. 4, which is a simplified flowchart of the information utilizing functionality of Fig. 1. A suitable server such as server 1041 of Fig. 1 receives breakfast report lists such as list 1051 of Fig. 2A from a plurality of computers such as computer 1013 of Fig. 1. For any item on list 1051, server 1041 checks if the item is included in known breakfast items database 1052. If the item is a known basic breakfast item on list 1071, server 1041 increments by 1 the number of instances of the basic item concerned for users of the country of the user concerned on breakfast habits database 1053. If the item is a known specific breakfast item, server 1041 retrieves from known breakfast item database 1052 the basic item to which the specific item concerned corresponds. For example, for the item "espresso" server 1041 retrieves from database 1052 the item "coffee". Then, server 1041 increments by 1 the number of instances of the retrieved basic item for users of the country of the user concerned on breakfast habits database 1053.

If the item on list 1051 is not included in known breakfast items database 1052, it is added to database 1054. If the item has already been reported a hundred times, it is deleted from database 1054 and added to database list 1052. Human intervention may be employed, for example, in order to determine whether the item is to be added to list 1071 of database list 1052 as a basic breakfast item or, alternatively, to list 1072 of database list 1052 as a specific case of another item already on list 1071 of basic breakfast items.
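
The server-side processing of Fig. 4, including the hundred-report promotion of unknown items, can be sketched as below. This is an illustration only, not the disclosed implementation; all names are assumptions, and promoted items are simply returned for the human review step described above.

```python
from collections import defaultdict

PROMOTION_THRESHOLD = 100  # reports before an unknown item is promoted for review

def process_report(report, known_items, habits, unknown):
    """Aggregate one breakfast report (Fig. 2A) per the Fig. 4 flowchart.

    known_items maps each basic item (list 1071) to its specific items
    (list 1072); habits counts (basic item, country) instances (database
    1053); unknown counts unrecognized items (database 1054)."""
    country = report["country"]
    review_queue = []
    for item in report["items"]:
        if item in known_items:                    # a known basic item
            habits[(item, country)] += 1
            continue
        basic = next((b for b, specifics in known_items.items()
                      if item in specifics), None)  # a known specific item?
        if basic is not None:
            habits[(basic, country)] += 1
        else:                                      # not in database 1052
            unknown[item] += 1
            if unknown[item] >= PROMOTION_THRESHOLD:
                del unknown[item]                  # promote for human review
                review_queue.append(item)
    return review_queue
```

Calling `process_report` repeatedly on incoming report lists keeps databases 1053 and 1054 current, while the returned queue feeds the human-review step that decides whether a promoted item joins list 1071 or list 1072.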

Reference is now made to Fig. 5, which is a simplified schematic illustration in the context of Fig. 1 showing a screen display 1200 of a permission obtaining procedure. Turning to Fig. 5 it is seen that a person such as a parent of the user chooses whether to disallow information retrieval from the user, or only to allow retrieval of non-personalized information. Information retrieval may be allowed in the areas of purchasing, entertainment and food. Screen display 1200 also allows a parent to review a privacy policy and to send an approval or disapproval message. An approval or disapproval message is typically communicated by a personal computer such as computer 1013 to server 1041, typically via Internet 1040.

It is appreciated that the functionality of Figs. 1, 2, 3, 4 and 5 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user, and utilizing the information obtained via the user in an application which is not limited to user involvement.

It is also appreciated that the functionality of Figs. 1, 2, 3, 4 and 5 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and obtaining required permission of at least one of a user and a person legally capable of providing permission in respect of the user.

It is also appreciated that the functionality of Figs. 1, 2, 3, 4 and 5 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is obtained at least partially by employing speech recognition.

It is further appreciated that the functionality of Figs. 1, 2, 3, 4 and 5 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially for evaluating nutrition habits of at least one user.

An interactive environment providing a methodology for obtaining and utilizing information wherein interactive toys provide information on purchasing and wherein the information is utilized in marketing is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 6, which is a simplified partly pictorial partly schematic illustration of a methodology of obtaining information on purchasing and utilizing the information in marketing in accordance with a preferred embodiment of the present invention. Turning to Fig. 6, it is seen that an interactive toy 1301 suggests to a user that the user might wish to order a pizza for supper. This commercial suggestion may be initiated by a suitable toy server 1304, which communicates typically via Internet 1303 with a computer 1302, which in turn provides content input to toy 1301 by means of typically wireless communication therewith. A purchase request by the user received via toy 1301 is typically communicated by computer 1302 to server 1304, which in turn communicates the request, typically via Internet 1303, to a suitable shop 1305 in the user's area. It is appreciated that server 1304 may obtain information on purchasing, which is provided by toys such as toy 1301 world wide.

Reference is now made to Fig. 7A, which is a simplified table in the context of Fig. 6 showing a purchase report message 1310 sent from computer 1302 to server 1304, typically in addition to the purchase request itself. Turning to Fig. 7A, it is seen that purchase report message 1310 includes a product indication 1311, a user's country indication 1312 and a relative income level indication 1313 referring to the income level of the user's family relative to the average income level in the user's country. The relative income level may be provided, for example, by a member of the user's family at registration to a toy system.

Reference is now made to Fig. 7B, which is a simplified table in the context of Fig. 6 showing a world-wide purchase database record 1320 for a particular product 1330. Database record 1320 is typically stored on a suitable server such as server 1304. Turning to Fig. 7B, it is seen that for each country on list 1321 and each relative income level from 1 to 10 on list 1322, database record 1320 provides the total number of reported purchases of product 1330 by users of the relative income level in a given country.

Reference is now made to Fig. 8, which is a simplified flowchart of the information obtaining and utilizing methodology of Fig. 6. In response to a purchase offer made by toy 1301, a user requests to purchase product 1330. Based on the user's purchase request received via toy 1301, computer 1302 communicates to server 1304 purchase report message 1310 comprising the user's country indication C and relative income level R. Based on purchase report message 1310, server 1304 updates world-wide purchase database record 1320 for product 1330 by incrementing by 1 the total number of purchases of product 1330 by users of relative income level R in country C.
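The update step of Fig. 8 amounts to incrementing a counter keyed by product, country and relative income level. The following is a minimal sketch; the patent does not specify an implementation, and all names used here are illustrative:

```python
from collections import defaultdict

# World-wide purchase database: one counter per
# (product, country, relative income level) combination,
# corresponding to database record 1320.
purchase_db = defaultdict(int)

def report_purchase(product, country, income_level):
    # Process a purchase report message such as message 1310:
    # increment the total for this product, country and level.
    purchase_db[(product, country, income_level)] += 1

# Two users of relative income level 7 in the same country
# purchase product 1330.
report_purchase(1330, "IL", 7)
report_purchase(1330, "IL", 7)
print(purchase_db[(1330, "IL", 7)])  # → 2
```

A server such as server 1304 could then read the counters per country and income level to produce the marketing reports described above.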

It is appreciated that information accumulating in database record 1320 may be directly utilized in order to determine the preferred residential areas in each country where merchandising facilities for a product in question are to be provided.

It is appreciated that the functionality of Figs. 6, 7A and 7B is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; and utilizing the information obtained via the user in a marketing application which is not limited to user involvement.

It is also appreciated that the functionality of Figs. 6, 7A and 7B is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein one of the plurality of interactive toys provides information on marketing.

An interactive toy environment providing a methodology for obtaining information and utilizing the information in advertising is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 9, which is a simplified pictorial illustration of a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention. Turning to Fig. 9, it is seen that a user watches an advertisement via a television set 1505, the advertisement comprising a unique jingle identified with it. Toy 1500 picks up the sound of the advertisement and sends it to computer 1502. Computer 1502 identifies the advertisement. At some later time toy 1500 hums a tune of the jingle from the advertisement. In response, the user sings a part of the jingle. Toy 1500 picks up the user's voice and sends it to computer 1502. Computer 1502 recognizes the user's utterance as a part of the advertisement. Computer 1502 notifies server 1504 that the user has internalized the advertisement.

Fig. 10A is a simplified flowchart in the context of Fig. 9 showing an information obtaining functionality wherein the information may be utilized in advertising. An advertiser 1506 sends toy server 1504 information regarding an advertisement. The information includes: an identifying sound signal embedded in the advertisement, enabling computer 1502 to identify an advertisement being broadcast; information regarding the content of the advertisement, enabling computer 1502 to detect that a user repeats a jingle from the advertisement; and toy content meant to stimulate the user to repeat the jingle. At some later time toy 1500 picks up the predetermined sound included in the advertisement and sends the sound to computer 1502. Computer 1502 identifies the unique sound signal of the advertisement, and registers that the advertisement was received by the user. At a still later time, toy 1500 actuates the toy content received from advertiser 1506, the content designed to remind the user of the advertisement, such as a hum of the jingle's tune, some words from the jingle, or another detail from the advertisement. In response the user sings the jingle from the advertisement. Toy 1500 picks up the user's utterance and sends it to computer 1502. Computer 1502 identifies the utterance as a part of the jingle, using information regarding advertisement content received from advertiser 1506. The information includes the lyrics of the jingle, thus enabling computer 1502 to identify the user's utterance as a repetition of the jingle. Computer 1502 then notifies the server as to whether the user sang the jingle (or a portion of the jingle) in response to the reminder by toy 1500.

Fig. 10B is a simplified flowchart in the context of Fig. 9 showing another information obtaining functionality wherein the information may be utilized in advertising. Server 1504 sends computer 1502 information regarding an advertisement. The information includes an identifying sound signal embedded in the advertisement and information regarding the content of the advertisement. At some later time toy 1500 picks up a sound and sends it to computer 1502. Computer 1502 identifies a portion of the advertisement, preferably by using speech recognition functionality to match the received sound with the lyrics of the advertisement. Computer 1502 then checks whether an identifying sound signal was received, reception of which indicates that the advertisement was received via a broadcasting channel. If the signal was not detected by computer 1502, then computer 1502 assumes that the sound was received from a human source, typically from the user. It is appreciated that computer 1502 may further check whether it was indeed the user who repeated the advertisement, using voice recognition methods.

Reference is now made to Fig. 11, which is a simplified flowchart in the context of Fig. 9 showing information utilization functionality for the purpose of designing advertising. An advertiser 1506 tests the effectiveness of two possible advertising jingles for a product. Advertiser 1506 sends toy server 1504 information regarding the two jingles. The information includes: identifying sound codes embedded in the jingles, information regarding the jingles' content, and toy content designed to remind users of the jingles. Advertiser 1506 broadcasts the two jingles on two media channels. It is preferred that both media channels have similar usage ratings for users in the target group for the advertising being tested. A plurality of computers report to server 1504, informing the server of which users sang the jingles, as described in Figs. 10A and 10B. Server 1504 compares the respective numbers of users that have been reported singing each of the jingles, and uses this information to determine which of the two jingles is likely to be more memorable to targets of future advertisements.

Reference is now made to Fig. 12, which is a simplified flowchart in the context of Fig. 9 showing information utilization functionality for the purpose of directing advertising. Advertiser 1506 tests the relative effectiveness of two media channels with respect to a predefined target group. For that purpose, advertiser 1506 prepares two versions of an advertising jingle, differing only in the identifying sound codes embedded in each of them. Advertiser 1506 broadcasts each of the jingles on one of the media channels tested, thus enabling a computer 1502 to identify the broadcasting channel of a jingle picked up by toy 1500. Toy server 1504 receives reports from a plurality of computers regarding users who have sung the jingle, the reports including information regarding the media channels through which the users were exposed to the jingles.

Reference is now made to Fig. 13, which is a simplified flowchart describing the effectiveness measurement functionality of Fig. 12. Server 1504 detects that a user belongs to a target group regarding which an advertiser 1506 wishes to test the advertising effectiveness of different media channels. Such a group may be defined by demographic data, such as gender and age. Such data regarding toy users is typically supplied to toy server 1504 at registration and stored in a database. Computer 1502 counts the number of times a broadcast jingle is picked up by toy 1500 after being received via each of the channels tested. Computer 1502 distinguishes between the broadcast channels by the different sound codes embedded in each of the versions of the jingle. When toy 1500 picks up the user singing the jingle for the first time, as in the method described in Fig. 10A, computer 1502 calculates the relative effectiveness of each of the channels tested, relative to the user, from the number of times the jingle was received by the user via each of the channels, the numbers designated as C1 and C2. The relative efficiencies of the channels, designated respectively as E1 and E2, are obtained from the formulae:

E1=C1/(C1+C2)

E2=1-E1

Server 1504 receives the relative efficiencies of the channels from a plurality of computers, relative to the users of the computers. Server 1504 sums the numbers designating the relative efficiencies for each of the channels, the outcome designating the relative advertising effectiveness of the channels relative to the whole target group.
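The per-user efficiency computation of Fig. 13 and the server-side summation can be sketched as follows; function and variable names are illustrative, not taken from the patent:

```python
def channel_efficiencies(c1, c2):
    # Relative efficiency of two channels for one user, from the
    # counts C1 and C2 of jingle receptions via each channel:
    # E1 = C1/(C1+C2), E2 = 1 - E1.
    e1 = c1 / (c1 + c2)
    return e1, 1.0 - e1

# Server 1504 sums the per-user efficiencies reported by a
# plurality of computers to rank the channels for the whole
# target group.
reports = [(3, 1), (2, 2), (5, 5)]          # (C1, C2) per user
total_e1 = sum(channel_efficiencies(a, b)[0] for a, b in reports)
total_e2 = sum(channel_efficiencies(a, b)[1] for a, b in reports)
print(total_e1, total_e2)  # → 1.75 1.25
```

The larger of the two totals indicates the more effective channel for the target group as a whole.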

It is appreciated that the functionality of Figs. 9, 10A, 10B and 11 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized in designing advertising.

It is appreciated that the functionality of Figs. 9, 10A, 12 and 13 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized in directing advertising.

An interactive toy environment providing a methodology for obtaining information and utilizing the information in directing advertising and in classifying users according to user profiles, at least partly derived from the information, is now described. Reference is now made to Fig. 14, which is a simplified pictorial illustration of a methodology for obtaining information and utilizing the information in classifying users in accordance with a preferred embodiment of the present invention. Turning to Fig. 14, it is seen that a user watches an advertisement on television 1605, the advertisement including a unique expression, namely "hyper cool", coined for the purpose of tracking which users embed the expression in their speech. At some later time toy 1600 participates in the user's play. The user repeats the unique expression "hyper cool" while playing with his toy train. Toy 1600 picks up the user's speech and sends it to computer 1602. Computer 1602 recognizes the user's speech and, in particular, identifies the unique expression "hyper cool", and verifies that the source of the expression was a specific advertisement. Computer 1602 notifies server 1604 that the user repeated, in his speech, an expression from the commercial. It is appreciated that toy 1600 may also attempt to actively elicit the expression from a user. In such a case, advertiser 1606 sends server 1604 appropriate interactive toy content to be distributed to computers, encouraging users to use the expression, such as "hyper cool" in the present example. In the example described, the content could be a question whose supposed answer is a superlative, such as "What do you think of this train?"

Reference is now made to Fig. 15, which describes a method wherein information obtained is utilized for the purpose of classifying users. Advertiser 1606 broadcasts an advertisement including a unique expression or phrase. Advertiser 1606 sends toy server 1604 the unique expression or phrase as an indication that a user has watched and/or heard and internalized the advertisement. Server 1604 sends the expression or phrase to a plurality of computers, including computer 1602. At a later time toy 1600 picks up the user's speech including the unique expression and sends it to computer 1602. Computer 1602 recognizes the unique expression in the user's speech. A plurality of computers, including computer 1602, send server 1604 notifications of users who have used the expression in their speech. Server 1604 uses information regarding users who have used the unique expression to derive a profile of users that identify with the advertisement and/or with the product advertised.

Reference is now made to Fig. 16, which is a simplified table, in the context of Fig. 15, showing a database used for the purpose of deriving a user profile, the database including data regarding the whole population of users 1624 and data regarding users who have internalized specific advertisement content 1620, the data obtained in a method described in Figs. 14 and 15. The table describes the distribution of a multiplicity of attributes among the whole population of users 1625, 1626, 1627 and among the users that have been reported to internalize advertisement content 1621, 1622, 1623. The attributes are obtained in a variety of ways, including personal information supplied to toy server 1604 at registration, and data obtained via user interaction with the toy.

Reference is now made to Fig. 17, which is a simplified flowchart in the context of Figs. 15 and 16, showing a profile-deriving functionality wherein the derived profile may be used for the purpose of directing advertising. In Fig. 17, G denotes the group of all users who have been reported as using a particular unique expression embedded in an advertisement. Server 1604 finds among the group an attribute (such as age range, nationality, gender, family income level, etc.) most characteristic of it, namely an attribute with the highest difference between its rate among group G and its rate among the whole population of users. If the difference between these two rates is higher than a pre-defined difference, such as, for example, 20%, server 1604 creates a subgroup of the group G containing users who have been reported using the particular unique expression embedded in the advertisement and who also have the aforementioned attribute. Server 1604 then removes the attribute from the list of attributes to be checked, and repeats the process for the remaining set of attributes. The outcome of the process is a list of attributes which characterizes, most significantly, the types of users who are most likely to be influenced by the particular advertisement. This profile may then be used in order to direct advertisement among populations wider than the population of toy users.
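The iterative attribute-selection process of Fig. 17 can be sketched as a greedy loop. This is a simplification under stated assumptions: users are represented as sets of attribute labels, and the 20% threshold and all names are illustrative:

```python
def derive_profile(group, population, attributes, threshold=0.20):
    # Greedy profile derivation as in Fig. 17: repeatedly pick the
    # attribute whose rate in group G most exceeds its rate in the
    # whole user population, narrow G to users having that attribute,
    # and stop once no margin exceeds the pre-defined threshold.
    profile, remaining = [], list(attributes)
    while remaining and group:
        def margin(attr):
            rate_g = sum(attr in u for u in group) / len(group)
            rate_p = sum(attr in u for u in population) / len(population)
            return rate_g - rate_p
        best = max(remaining, key=margin)
        if margin(best) <= threshold:
            break
        profile.append(best)
        group = [u for u in group if best in u]
        remaining.remove(best)
    return profile

population = [{"age 8-12"}, {"age 8-12", "male"}, {"male"}, {"female"}]
group = [{"age 8-12", "male"}, {"age 8-12"}]
print(derive_profile(group, population, ["age 8-12", "male", "female"]))
# → ['age 8-12']
```

Here "age 8-12" is over-represented in the group (100% versus 50% in the population), so it enters the profile, while "male" occurs at the same rate in both and is rejected.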

It is appreciated that the functionality of Figs. 14, 15, 16 and 17 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; wherein the information is utilized in directing advertising; and wherein the information is utilized in classifying users according to user profiles at least partly derived from the information.

It is appreciated that the functionality of Figs. 14, 15, 16 and 17 taken together is particularly appropriate to a methodology such as the aforementioned and wherein the information includes not only information directly provided by the user but also information derived from user behavior sensed by the at least one interactive toy.

It is appreciated that the functionality of Figs. 14, 15, 16 and 17 taken together is particularly appropriate to a methodology, such as the aforementioned, and wherein the information includes not only information directly provided by the user but also information derived from user behavior sensed by the at least one interactive toy, which behavior is noncommercial behavior.

It is appreciated that the functionality of Figs. 14, 15, 16 and 17, taken together, is particularly appropriate to a methodology, such as the aforementioned, and wherein the information is obtained at least partially by employing speech recognition.

An interactive toy environment providing a methodology for obtaining information and utilizing the information for the purpose of updating criteria employed in speech recognition for disparate cultural groups is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 18, which is a simplified pictorial illustration of a methodology for prompting the user to say certain words, which may be utilized for updating criteria employed in speech recognition in accordance with a preferred embodiment of the present invention. Turning to Fig. 18, it is seen that a toy 2000 requests of a user that the user read to it a specific story, namely the Three Bears story, the text of which is already stored in computer 2002. The user reads the story from a book 2006 to toy 2000. Toy 2000 sends the user's speech to computer 2002. Computer 2002 utilizes the speech to update criteria used in speech recognition, as in methods of training a speech recognizer known in the art. Computer 2002 sends server 2004 the updates made to criteria employed in speech recognition. Server 2004 utilizes the updates to update criteria employed in speech recognition regarding a cultural group to which the user belongs.

It is appreciated that book 2006 may be provided with sensors communicating with the computer via RF, enabling computer 2002 to detect the exact page that is currently being read, thus shortening the speech and text segments compared by computer 2002 and making the comparison more reliable. Alternatively, this may be achieved by marking the pages of book 2006 with special marks, such as squares of different colors, received by a video camera on toy 2000 and identifiable by computer 2002. It is also appreciated that book 2006 may be a picture book, such as an infant book. Such a book allows a user to teach toy 2000 to speak. In that case a user points to a picture of an object in book 2006 and verbalizes its name to toy 2000, and toy 2000 learns to speak gradually, in accordance with the number of words already verbalized to it by the user. It is further appreciated that a computer monitor may assume the function of the book in the aforementioned examples.

Reference is now made to Fig. 19, which is a simplified flowchart of the speech recognition criteria updating functionality of Fig. 18. The user reads the story to toy 2000. Toy 2000 sends the user's speech to computer 2002. Computer 2002 applies speech recognition to the user's speech. Computer 2002 compares the text into which the user's speech has been converted with the text of the story stored in computer 2002. If there are differences between the texts, meaning that speech recognition did not operate correctly, computer 2002 updates the criteria employed in speech recognition, using methods known in the art for training a speech recognizer, such that recognition according to the updated criteria would have yielded more accurate results. Computer 2002 sends information regarding the updates made to the server.
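The comparison step of Fig. 19 can be sketched as a word-by-word alignment of the recognized text against the stored story text; the mismatching positions mark the words whose recognition criteria need updating. This is a deliberate simplification of recognizer training (it ignores insertions and deletions), and all names are illustrative:

```python
def misrecognized_words(recognized, reference):
    # Compare the text produced by speech recognition with the
    # stored text of the story; return (position, recognized word,
    # expected word) for every mismatch. A real recognizer-training
    # procedure would also align insertions and deletions.
    rec = recognized.lower().split()
    ref = reference.lower().split()
    return [(i, r, e) for i, (r, e) in enumerate(zip(rec, ref)) if r != e]

print(misrecognized_words("once upon a tame", "once upon a time"))
# → [(3, 'tame', 'time')]
```

Each mismatch pair tells the trainer which stored word the user's actual pronunciation should be associated with.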

Reference is now made to Fig. 20, which is a simplified flowchart in the context of Fig. 18 showing a methodology for employing speech recognition in obtaining information from disparate cultural groups and utilizing the information for the purpose of updating criteria employed in speech recognition. Server 2004 receives from a plurality of computers, divided according to the disparate cultural groups of their users, information regarding updates made to criteria employed in speech recognition, as in Fig. 19. In the example shown, the plurality of computers send server 2004 information regarding variations in pronunciations of words. Standard speech recognizers utilize information regarding the probabilities of various pronunciations of the same words. In training a speech recognizer, a computer may update such probabilities in order to improve its recognition efficacy relative to a user. In the example shown, computers send server 2004 probabilities for various pronunciations of words that were changed due to the update procedure described in Fig. 19. Server 2004 employs the information to derive word pronunciation models relating to the disparate groups of users, the models including probabilities for different pronunciations of words.

In another preferred embodiment of the present invention, a plurality of computers send server 2004 not only the updates to the criteria employed in speech recognition, but also all words picked up by toys, thus enabling server 2004 to derive a pronunciation model that is statistically more accurate in relation to different cultural groups.

Reference is now made to Fig. 21, which is a simplified flowchart of the updating functionality of Fig. 20. Server 2004 receives from computer 2002 information regarding updates to speech recognition parameters, made in consequence of a failure to correctly recognize the user's speech. Such information may be probabilities of different pronunciations of words not recognized correctly by computer 2002, as in Fig. 20. Server 2004 checks the user's record in a database, the record including personal information supplied to server 2004 at registration. If the record includes the user's cultural identity, server 2004 adds an update to a database of word pronunciations of the user's cultural group. Otherwise, server 2004 estimates the user's cultural identity, based on demographic data and on the user's speech characteristics. Server 2004 may apply, for that purpose, information such as the user's place of residence. Server 2004 may also compare updates to speech recognition parameters made by a user's computer 2002 with previously obtained pronunciation models of different cultural groups. If such estimates yield an outcome with a probability higher than a defined rate, such as 95%, server 2004 sends to computer 2002 a pronunciation model related to the cultural group, thus enabling computer 2002 to utilize a pronunciation model more compatible with its user.
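The fallback estimation step of Fig. 21 — matching a user's pronunciation updates against previously obtained group models — can be sketched as follows. The overlap score, the "word:phonemes" keying and the 95% threshold are illustrative assumptions, not details given in the patent:

```python
def estimate_cultural_group(user_probs, group_models, threshold=0.95):
    # Compare the user's pronunciation probabilities with the stored
    # per-group pronunciation models; return the best-matching group
    # only if its match score reaches the defined rate (e.g. 95%).
    # The score is the probability mass the two distributions share.
    best, best_score = None, 0.0
    for group, model in group_models.items():
        score = sum(min(p, model.get(k, 0.0)) for k, p in user_probs.items())
        if score > best_score:
            best, best_score = group, score
    return best if best_score >= threshold else None

models = {
    "group A": {"tomato:t ah m aa t ow": 1.0},
    "group B": {"tomato:t ah m ey t ow": 1.0},
}
print(estimate_cultural_group({"tomato:t ah m aa t ow": 0.97,
                               "tomato:t ah m ey t ow": 0.03}, models))
# → group A
```

If no group matches confidently enough, the function returns None and the server would fall back on demographic data as described above.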

Reference is now made to Fig. 22, which is a simplified table in the context of Figs. 20 and 21, showing a database of word pronunciation models derived from information obtained from users from disparate cultural groups. The table includes a word column 2010 comprising a plurality of words and two pronunciation columns 2011 and 2012 relating to two disparate cultural groups. The pronunciation columns are transcribed in the ARPAbet phonetic alphabet. The table shows the probabilities of different pronunciations of a word in each cultural group.

Reference is now made to Fig. 23, which is a simplified table in the context of Figs. 20 and 21, showing a database used in a methodology of utilizing information obtained regarding word pronunciation variations of cultural groups for the purpose of updating criteria employed in speech recognition. The table enables a computer 2002 to utilize information regarding a dialect of a cultural group in order to improve speech recognition of the speech of a specific user belonging to that group. It compares the efficacy, for the purpose of speech recognition and relative to the user, of a word pronunciation model derived from the speech of a plurality of users from the group with that of a model derived from the speech of the user. The table includes three columns. A first column 2020 contains words. A second column 2021 contains word pronunciation models for the words of column 2020, such as described in Fig. 22, derived from information obtained from users from a cultural group. A third column 2022 contains word pronunciation models derived from the speech of a specific user. Columns 2021 and 2022 include respectively columns 2024 and 2026 that specify the number of successful uses of each model, relative to each word in the table.

Reference is now made to Fig. 24, which is a simplified flowchart in the context of Fig. 23 showing a methodology for comparing the efficacy of two word pronunciation models in relation to a user. Computer 2002 receives from server 2004 a word pronunciation model, derived from information obtained from a cultural group to which a user belongs, such as described in Fig. 22. Such a model is sent after a user specifies his or her cultural identity, or after server 2004 detects the user's cultural identity from personal information and/or from the user's speech, as described in Fig. 21. Computer 2002 thus has two separate pronunciation models: a first model derived from the user's speech, as described in Fig. 19, and a second model derived from the speech of a plurality of users from a cultural group to which the user belongs. Toy 2000 picks up the user's speech and sends it to computer 2002. Computer 2002 converts the speech to text based on each of the two models, thus creating two text strings: T1 derived from the first model and T2 derived from the second model. Computer 2002 also calculates the probability of each of the two strings being correct, as in standard speech recognition methods, the probabilities designated as P1 and P2 respectively. If the two text strings are not identical, toy 2000 asks the user what he or she said, using one of the strings converted to speech (such as by asking the user, "Did you say 'tomato'?"). Computer 2002 thus detects which of the two text strings matches the user's utterance. Computer 2002 removes the pronunciation models of the words of the mismatching text string from the column from which they were taken, 2021 or 2022. Computer 2002 sends server 2004 information regarding updates to criteria employed in speech recognition. If the two text strings are identical, the computer compares the respective probabilities of the strings. Computer 2002 adds 1 to the successful-uses column, 2024 or 2026, of all the words of the more probable string.

Reference is now made to Fig. 25, which is a simplified flowchart of the speech recognition criteria updating functionality of Fig. 24. Computer 2002 checks the efficacy of the two word pronunciation models relative to a user. Computer 2002 checks the number of successful uses of each word model with respect to the user. If the ratio between the number of successful uses of one model and the number of successful uses of the other model exceeds a defined ratio, such as 1:2, the computer removes the less efficient model from the list. Computer 2002 notifies server 2004 of the updates made to the speech recognition criteria.
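The ratio test of Fig. 25 can be sketched per word as follows; the 1:2 ratio is the example given above, and function and label names are illustrative:

```python
def keep_model(group_successes, user_successes, ratio=2.0):
    # Fig. 25 decision: compare the successful-use counts of the
    # group-derived and user-derived pronunciation models for one
    # word; if one count exceeds the other by more than the defined
    # ratio (e.g. 1:2), keep only the more efficient model.
    if user_successes > ratio * group_successes:
        return "user model"
    if group_successes > ratio * user_successes:
        return "group model"
    return "both models"

print(keep_model(1, 4))   # → user model
print(keep_model(6, 2))   # → group model
print(keep_model(3, 4))   # → both models
```

When neither count dominates, both models remain in the table and continue accumulating successful-use counts.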

It is appreciated that the functionality of Figs. 18, 19, 20, 21, 22, 23, 24 and 25 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is obtained at least partially employing speech recognition.

It is appreciated that the functionality of Figs. 18, 19, 20, 21, 22, 23, 24 and 25 taken together is particularly appropriate to a methodology, such as the aforementioned and wherein the information is obtained at least partially employing speech recognition from disparate cultural groups of users.

It is appreciated that the functionality of Figs. 18, 19, 20, 21, 22, 23, 24 and 25 taken together is particularly appropriate to a methodology, such as the aforementioned, and wherein criteria employed in speech recognition are updated in response to information received indicating the efficacy of the speech recognition.

It is appreciated that the functionality of Figs. 18, 19, 20, 21, 22, 23, 24 and 25 taken together is particularly appropriate to a methodology, such as the aforementioned and wherein the at least one toy is employed at least partially to prompt the user to speak.

It is appreciated that the functionality of Figs. 18, 19, 20, 21, 22, 23, 24 and 25 taken together is particularly appropriate to a methodology, such as the aforementioned, and wherein the at least one toy is employed at least partially to prompt the user to say certain words.

It is appreciated that the functionality of Figs. 18, 19, 20, 21, 22, 23, 24 and 25 taken together is particularly appropriate to a methodology, such as the aforementioned, and wherein the at least one toy is employed at least partially to train a language model.

An interactive toy environment providing a methodology for obtaining information and utilizing the information at least partially as a diagnostic tool for evaluating performance of at least one of a computer and an interactive toy is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 26, which is a simplified partly pictorial partly schematic illustration of a methodology of obtaining and utilizing information as a diagnostic tool for evaluating the performance of a computer in accordance with a preferred embodiment of the present invention. Turning to Fig. 26, it is seen that two interactive toys 2801 and 2805, communicating with computers 2802 and 2806 respectively, suggest to their users that the users might wish to play one of two particular games referred to as game 1 and game 2. In the illustrated embodiment, computer 2802 includes processor model 1, which is sufficient for the normal requirements of communicating with interactive toys, such as the requirements of game 1. Computer 2806 includes a more powerful processor model 2, capable of meeting the higher memory and speech-recognition requirements of game 2. A game request is communicated by computers 2802 and 2806 to server 2804 via Internet 2803. Thus, it may be appreciated that server 2804 is operative to utilize information obtained via game requests as a diagnostic for evaluating the performance of computers comprising processor model 1.

Reference is now made to Fig. 27A, which is a simplified table in the context of Fig. 26 showing a game request message 2810 sent from computers such as computers 2802 and 2806 to server 2804. Turning to Fig. 27A, it is seen that game request 2810 includes processor type indication 2811 and game number 2812.

Reference is now made to Fig. 27B, which is a simplified table in the context of Fig. 26 showing a database record 2815 of accumulated game requests typically stored on server 2804. Turning to Fig. 27B, it is seen that for each processor type x and game number y, database record 2815 provides the total number of requests Tx,y for game y from computers comprising processors of type x.

Reference is now made to Fig. 28A, which is a simplified flowchart of the information obtaining functionality of Fig. 26. A user requests to play one of game 1 and game 2. Computer 2802 or 2806 communicates game request message 2810 to server 2804. Server 2804 updates database record 2815 of game requests. Reference is now made to Fig. 28B, which is a simplified flowchart of the information utilization functionality of Fig. 26. Server 2804 initiates a data processing procedure. The procedure continues only if each one of the four total numbers of accumulated game requests T1,1, T1,2, T2,1 and T2,2 is greater than a predetermined number such as 100. Then, server 2804 checks whether the relative part of requests for game 2 from among the requests for the two types of games, in the case of computers comprising processors of type 1, is considerably smaller than the same relative part in the case of computers comprising processors of type 2, for example at least 1000 times smaller. If so, processor type 1 is considered insufficient for the memory and/or speech recognition requirements of game 2.

Reference is now made to Fig. 29, which is a simplified partly pictorial partly schematic illustration of a methodology of obtaining and utilizing information as a diagnostic tool for evaluating the performance of an interactive toy in accordance with a preferred embodiment of the present invention. Turning to Fig. 29, it is seen that a user moves an arm of an interactive toy 2851 comprising an arm motion sensor 2855. In the illustrated embodiment, toy 2851 is in wireless communication with a computer 2852, which in turn communicates typically via Internet 2853 with a suitable toy server 2854. It is appreciated that server 2854 is operative to obtain information on motion of toy body parts, which information may be utilized as a diagnostic tool for evaluating the performance of an interactive toy.

Reference is now made to Fig. 30A, which is a simplified table in the context of Fig. 29, showing a sensor signal message 2860 sent from computer 2852 to server 2854. As is seen in Fig. 30A, a sensor signal message 2860 includes a toy ID 2861 of toy 2851 and a body-part number 2862 for the motion concerned. It is appreciated that a sensor signal message may be communicated from computer 2852 to server 2854 for any single sensor signal individually or for any number of consecutive sensor signals in one message.

Reference is now made to Fig. 30B, which is a simplified table in the context of Fig. 29 showing a database record 2870 of accumulated body-part motions for a particular toy. As seen in Fig. 30B, for each motion of the motions 1 to N of a particular type of toy, database record 2870 provides the total number of sensor signals for the motion reported by a particular toy of the type concerned.

Reference is now made to Fig. 30C, which is a simplified table in the context of Fig. 29, showing a database record 2880 of accumulated body-part motions for a particular toy type. As seen in Fig. 30C, for each motion of the motions 1 to N of a particular type of toy, database record 2880 provides the total number of sensor signals for the motion reported by all toys of the type concerned.

Reference is now made to Fig. 30D, which is a simplified table in the context of Fig. 29 showing a database record 2890 of malfunctioning body part motion. As seen in Fig. 30D, for each type of toy with a similar body part, database record 2890 provides the number of toys of that type where the body-part became inoperative, and the average number of sensor signals reported by the toys until the body part became inoperative.

Reference is now made to Fig. 31A, which is a simplified flowchart of the information utilization functionality of Fig. 29. Server 2854 initiates a body-part-motion check in order to locate unused body parts in a particular type of toy. Server 2854 retrieves data from database record 2880 of the motions of all toys of a particular toy type 2881. For each of the movable body-parts 1 to N of all toys of type 2881, server 2854 checks whether the ratio of the average number of motions of the body-part in question for a single toy to the total number of motions of all body parts is considerably smaller than the corresponding ratio for the other body parts. If so, the body part in question is registered as rarely used and therefore unnecessary. The body-parts that are found unnecessary may be provided in an immovable version in future models of toys, thereby reducing the cost of such models.
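The rarely-used-part check of Fig. 31A may be sketched as follows. The data layout and the cut-off threshold are illustrative assumptions; the text specifies only that the share must be "considerably smaller":

```python
def rarely_used_parts(motion_totals, threshold=0.01):
    """Identify body parts that are rarely moved across all toys of a type.

    motion_totals[part] is the accumulated sensor-signal count for that
    part over all toys of the type (database record 2880).  A part whose
    share of all motions falls below `threshold` (an assumed cut-off) is
    registered as rarely used and hence a candidate for an immovable
    version in future models.
    """
    total = sum(motion_totals.values())
    return [part for part, count in motion_totals.items()
            if count / total < threshold]
```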

Reference is now made to Fig. 31B, which is a simplified flowchart of another information utilization functionality of Fig. 29. A body-part X of toy 2851 is reported to be malfunctioning. For example, the toy is brought to a toy repair facility, which informs server 2854 of the malfunction in body part X, typically via Internet 2853. Server 2854 retrieves from database record 2870 of toy 2851 the number of sensor signals Tx reported by toy 2851 for body-part X. Server 2854 updates database 2890 of malfunctioning body-parts with the data retrieved from database record 2870, by updating the average number T of motions of the body part concerned over all toys of the type of toy 2851. The updated database 2890 makes it possible to determine which of the toy types with similar body parts allows for a greater number of motions until a body-part becomes inoperative.
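The running-average update of database record 2890 may be sketched as follows. The record layout (a pair of failed-toy count and average motions until failure per toy type) is an illustrative assumption:

```python
def update_malfunction_record(record, toy_type, motions_until_failure):
    """Update database record 2890 when a body part is reported broken.

    record[toy_type] holds (failed_count, average_motions).  The running
    average T is recomputed to include the newly reported toy, so toy
    types can later be compared by how many motions their similar body
    parts sustain before becoming inoperative.
    """
    count, avg = record.get(toy_type, (0, 0.0))
    new_count = count + 1
    new_avg = (avg * count + motions_until_failure) / new_count
    record[toy_type] = (new_count, new_avg)
    return record
```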

It is appreciated that the information utilization procedures of Figs. 31A and 31B include utilizing the information obtained via the user in an application which is not limited to user involvement, such as improving the manufacture of toys, including not only computer-controlled toys but also other types of interactive and/or passive toys.

It is appreciated that the functionality of Figs. 26, 27, 28, 29, 30 and 31 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a diagnostic tool for evaluating performance of at least one of a computer and an interactive toy.

It is appreciated that the functionality of Figs. 26, 27, 28, 29, 30 and 31 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a toy design tool.

An interactive toy environment providing a methodology for obtaining information and utilizing the information at least partially as a diagnostic tool for evaluating performance of at least one user, in accordance with a preferred embodiment of the present invention, is now described. Reference is now made to Fig. 32, which is a simplified partly pictorial partly schematic illustration of a methodology for obtaining and utilizing information as a diagnostic tool for evaluating the performance of a user over time. Turning to Fig. 32, it is seen that an interactive toy 2901 comprising two arm-press sensors 2906 and 2907 requests that a user press the left arm of toy 2901. Toy 2901 may request that the user press either of the two hands of toy 2901, and may refer to the hands as either "my left/right" or "your left/right". Toy 2901 is typically in wireless communication with computer 2902, which receives sensor signals from toy 2901. Computer 2902 also communicates, typically via Internet 2903, with a suitable toy server 2904. Thus, it may be appreciated that server 2904 is operative to evaluate changes in the performance of the user over time.

Reference is now made to Fig. 33A, which is a simplified table in the context of Fig. 32 showing a report message 2910 of user performance typically communicated from computer 2902 to server 2904. As seen in Fig. 33A, report message 2910 includes a user ID 2911 and game result 2912 showing the number x of successful attempts out of the total number y of attempts in the course of a single game session.

Reference is now made to Fig. 33B, which is a simplified table in the context of Fig. 32 showing a database record 2914 of average results of a particular user in the course of a series of N games. Turning to Fig. 33B, it is seen that for each number in the game number row 2915, database record 2914 provides in the average result row 2916 the average result over that number of games.

Reference is now made to Fig. 34, which is a simplified flowchart of the information obtaining functionality of Fig. 32. Toy 2901 suggests to a user that the user might wish to play a reaction game. The user agrees. Computer 2902 randomly chooses one of "my" and "your" and one of "left" and "right", and instructs toy 2901 to fancifully command the user to press on the hands of toy 2901 according to the chosen words. If no sensor signal is received from toy 2901 within half a second, computer 2902 increments by one the number Y of attempts. If a sensor signal received via toy 2901 shows that the user pressed the wrong hand, computer 2902 increments by one the number Y of attempts. If a sensor signal received via toy 2901 shows that the user has pressed the correct hand, computer 2902 increments by one the number X of successful attempts as well as the number Y of attempts. The process is repeated until the user no longer wishes to continue with the game. Then computer 2902 communicates to server 2904 a report message 2910 comprising the user ID and the game result X divided by Y. Upon receipt of report message 2910, server 2904 updates the user's database record 2914 of game results.
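The scoring logic of the flowchart may be sketched as follows. The event representation is an illustrative assumption; each round yields the correct hand and either the hand actually pressed or None if no sensor signal arrived within the half-second window:

```python
def score_session(events):
    """Score a reaction-game session as in the flowchart of Fig. 34.

    Each event is (correct_hand, pressed_hand), where pressed_hand is
    None if no sensor signal was received within half a second.
    Returns (x, y): successful attempts x out of total attempts y, the
    pair carried in report message 2910.
    """
    x = y = 0
    for correct_hand, pressed_hand in events:
        y += 1                       # every round counts as an attempt
        if pressed_hand == correct_hand:
            x += 1                   # correct press within the window
    return x, y
```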

It is appreciated that the functionality of Figs. 32, 33 and 34 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a diagnostic tool for evaluating performance of at least one user.

It is also appreciated that the functionality of Figs. 32, 33 and 34 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially for evaluating changes in the performance of at least one user over time.

An interactive toy environment providing a methodology for obtaining information and utilizing the information at least partially as a diagnostic tool for evaluating performance of content employed by at least one interactive toy is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 35, which is a simplified partly pictorial partly schematic illustration of a methodology for obtaining and utilizing information as a diagnostic tool for evaluating content employed by an interactive toy, in accordance with a preferred embodiment of the present invention. Turning to Fig. 35, it is seen that an interactive toy 3001 suggests to a user that the user might wish to listen to a story. This suggestion may be initiated by a suitable server 3004, which communicates typically via Internet 3003 with a computer 3002 which, in turn, provides content input for toy 3001 by means of typically wireless communication therewith. As also shown in Fig. 35, having told the story, toy 3001 requests that the user rate the story. It may be appreciated that server 3004 is operative to retrieve and process information on one or more stories provided, for example, by publisher 3005.

Reference is now made to Fig. 36A, which is a simplified table in the context of Fig. 35, showing a data report message 3010 typically sent from computer 3002 to server 3004 reporting the response of a user to a particular story. Turning to Fig. 36A, it is seen that a report message 3010 includes a story number 3011 and the user's rating 3012.

Reference is now made to Fig. 36B, which is a simplified table in the context of Fig. 35 showing a database record 3014 for story ratings. Database record 3014 is typically stored on server 3004 and may or may not be reported to publisher 3005. Turning to Fig. 36B, it is seen that database record 3014 provides, for each story on a list, the number of users 3015 who responded to the story and the average rating 3016 of the story.

Reference is now made to Fig. 37, which is a simplified flowchart of the information obtaining and utilizing functionality of Fig. 35. A publisher 3005 sends stories to server 3004 via Internet 3003. Server 3004 sends messages to users via computers such as computer 3002. Computer 3002 instructs toy 3001 to suggest to the user that the user might wish to listen to a free story. If the user agrees, computer 3002 downloads a story from server 3004 and instructs toy 3001 to tell the story to the user. If the user listens to the story, toy 3001 requests that the user rate the story. The user's rating, registered in report message 3010, is communicated by computer 3002 to server 3004 via Internet 3003. Server 3004 updates the rating for the particular story in database record 3014.
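The update of database record 3014 may be sketched as a running average per story. The record layout (responder count and average rating per story number) follows Fig. 36B; the function name is an illustrative assumption:

```python
def register_rating(record, story_number, rating):
    """Fold a user's rating (report message 3010) into database record
    3014, which keeps per-story the number of users who responded and
    the average rating of the story.
    """
    n, avg = record.get(story_number, (0, 0.0))
    record[story_number] = (n + 1, (avg * n + rating) / (n + 1))
    return record
```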

It is appreciated that the functionality of Figs. 35, 36 and 37 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a diagnostic tool for evaluating performance of content employed by the at least one interactive toy.

An interactive toy environment providing a methodology for evaluating the utility of teaching methods and educational methodologies is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 38, which is a simplified pictorial illustration of a methodology for obtaining information and utilizing the information for the purpose of evaluating teaching methods and/or educational methodologies in accordance with a preferred embodiment of the present invention. Turning to Fig. 38, it is seen that toy 3100, aided by a computer monitor 3104, is teaching the Pythagorean theorem to a user. It is also seen that another toy 3101, aided by a monitor 3105, is teaching the theorem to another user. Toys 3100 and 3101 employ disparate teaching methods and disparate educational methodologies. Toy 3100 employs a methodology wherein a user has to discover the theorem himself, and a teaching method based on intuitive visual perception and algebraic knowledge, whereas toy 3101 employs a methodology of direct transmission of knowledge and a teaching method based on geometrical knowledge.

Reference is now made to Fig. 39, which is a simplified flowchart in the context of Fig. 38, showing a teaching methods evaluation functionality and an educational methodologies evaluation functionality. A server 3106 distributes educational toy content to a plurality of computers. The content is divided into four categories, transmitting the same educational content in all combinations of two disparate teaching methods and two disparate educational methodologies, such as described in Fig. 38. The toys actuate the content to their users. A month later, the server distributes to the plurality of computers toy content designed to test the users' command of the educational content. In the context of Fig. 38, such content may test the ability of users to apply the Pythagorean theorem in various contexts. The toys actuate the toy content. The computers evaluate the users' performances and send scores to server 3106. Server 3106 evaluates the utility of the disparate teaching methods and educational methodologies.
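The final evaluation step at server 3106 may be sketched as follows, assuming the scores have been aggregated into average success rates keyed by the (teaching method, educational methodology) combination; the key and score representations are illustrative assumptions:

```python
def best_combination(scores):
    """Given average test success rates keyed by (teaching_method,
    educational_methodology), return the combination that yielded the
    highest success rate, as in the outcome table of Fig. 40.
    """
    return max(scores, key=scores.get)
```

For example, with four combinations tested, the combination whose users scored highest on the follow-up test is selected as the most efficient of those tested.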

Reference is now made to Fig. 40, which is a simplified table of a database in the context of Fig. 39, showing a typical outcome of the evaluation procedure described in Fig. 39. The table includes four columns covering all combinations of teaching methods and educational methodologies evaluated in the methodology described in Fig. 39. Each column specifies the success rate of such a combination as detected by a test delivered to users. As is seen in the table, the combination of teaching method 2 and educational methodology 2 yielded the highest success rate, meaning that these are the most efficient out of those tested.

It is appreciated that the functionality of Figs. 38, 39 and 40 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a diagnostic tool for evaluating utility of teaching methods.

It is appreciated that the functionality of Figs. 38, 39 and 40 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a diagnostic tool for evaluating utility of educational methodologies.

An interactive toy environment providing a methodology for obtaining information for the purpose of game design is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 41, which is a simplified pictorial illustration of a methodology for obtaining information that may be used for the purpose of game design in accordance with a preferred embodiment of the present invention. Turning to Fig. 41, it is seen that a plurality of toys, such as toy 3300, ask users the same riddle. A plurality of computers communicating with the toys, such as computer 3302, send users' answers to toy server 3304. It is appreciated that information obtained in such a manner may then be utilized in game design, for example as a means to select riddles for games such as popular quest and adventure computer games.

Reference is now made to Fig. 42, which is a simplified flowchart of the information obtaining functionality of Fig. 41. A server 3306 distributes riddles to a plurality of computers. Riddles are sent to computers along with their solutions. Toys communicating with the plurality of computers present riddles to users. Toys send to computers the answers of users to the riddles. Computers notify server 3306 whether users solved riddles correctly. Server 3306 rates riddles according to their difficulty, in relation to different user groups.
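The riddle rating step at server 3306 may be sketched as follows. The tuple layout of the accumulated answers and the age-group labels are illustrative assumptions:

```python
def solution_rates(answers):
    """Compute per-riddle, per-age-group solution rates as in Fig. 43.

    `answers` is a list of (riddle_id, age_group, solved) tuples
    accumulated by the server from computers' notifications of whether
    users solved riddles correctly.
    """
    counts = {}
    for riddle_id, age_group, solved in answers:
        total, correct = counts.get((riddle_id, age_group), (0, 0))
        counts[(riddle_id, age_group)] = (total + 1, correct + int(solved))
    return {key: correct / total for key, (total, correct) in counts.items()}
```

A low solution rate for a given age group marks a riddle as difficult for that group, which a game designer could use to place it in a later phase of a game.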

Reference is now made to Fig. 43, which is a simplified table of a database record showing the riddle rating functionality of Fig. 42. The table shows the solution rates of different riddles relative to disparate age groups. It is appreciated that such information may be utilized in game design, for example for selecting riddles for games such as popular quest and adventure computer games. In such cases, information regarding the difficulty of a riddle enables game designers to select the place of a riddle in a game, typically placing more difficult riddles in later phases of a game.

It is appreciated that the functionality of Figs. 41, 42 and 43 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a methodology for obtaining and utilizing information comprising: employing at least one of the plurality of interactive toys to obtain information via the user; utilizing the information obtained via the user in an application which is not limited to user involvement; and wherein the information is utilized at least partially as a game design tool.

A schedule monitoring toy system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 44, which is a simplified pictorial illustration of a schedule monitoring toy system comprising a personal information item learning functionality and a scheduling prompt presentation functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 44, it is seen that an interactive toy 5010 suggests to a child that the child might wish to listen to a particular type of bedtime story referred to as a story about monsters. This suggestion might be initiated by a suitable computer 5012 in response to a child's verbal input that the child is going to bed, which verbal input is received by the computer 5012 via the toy 5010, typically by means of wireless communication therewith.

As also seen in Fig. 44, on another day at a time close to the child's bedtime, the toy 5010 suggests to the child that the child might be tired and wish to go to bed. When the child requests that the toy 5010 tell him a story, the toy 5010 suggests that the child might wish to listen to a particular type of story referred to as a story about dinosaurs.

In the illustrated embodiment, the toy 5010 communicates with the child, based upon instructions received from the computer 5012, which stores personal information about the child obtained in the course of interaction with the child on the previous day. Thus it may be appreciated that the toy 5010 is actuated to present to the child a scheduling prompt based on personal information learned about the child.

Reference is now made to Figs. 45A and 45B, which are simplified flowcharts respectively illustrating the learning and the presentation functionality of Fig. 44. A user, such as a child, bids an interactive toy 5010 good night and tells the toy that he is going to bed. The user input is received by toy 5010, and is communicated to a computer 5012, typically by means of wireless communication therewith. The computer 5012 is typically provided with speech recognizer software operative to recognize the user's speech. For example, the software is operative to recognize keywords and/or key phrases such as "good night" and "go-to-bed". Based on the recognized verbal input, computer 5012 updates a database record with the user's bedtime. In addition, computer 5012 instructs toy 5010 to suggest to the user that the user might wish to hear a bedtime story about monsters. The user responds negatively and requests a story about dinosaurs. The computer 5012 recognizes the verbal input received via the toy 5010. Based on the recognized verbal input, the computer 5012 updates a database record with the user's preferred type of bedtime story. The computer 5012 may then instruct the toy 5010 to verbalize to the user the requested content, possibly provided via a computer network such as the Internet.
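The learning step of Fig. 45A may be sketched as follows. The keyword lists and profile field names are illustrative assumptions; a real embodiment would act on the speech recognizer's full output rather than simple substring matching:

```python
def update_profile(profile, utterance):
    """Update a child's profile from recognized keywords, in the spirit
    of Fig. 45A: register that bedtime was announced, and remember the
    preferred type of bedtime story for use on a later day.
    """
    text = utterance.lower()
    # Bedtime keywords/key phrases, per the "good night" example.
    if "good night" in text or "go to bed" in text:
        profile["bedtime_announced"] = True
    # Story-preference keywords, per the monsters/dinosaurs example.
    for topic in ("monsters", "dinosaurs"):
        if topic in text:
            profile["preferred_story"] = topic
    return profile
```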

On another day, such as the following day, at a time close to the user's registered bedtime, computer 5012 personalizes a message to the user via the toy 5010, suggesting to the user that the user might wish to go to bed. The user requests a bedtime story. Based on the user's request received via the toy 5010 and on the user's registered preferred type of bedtime story, the computer personalizes a message to the user via the toy 5010, suggesting that the user might wish to listen to a story about dinosaurs.

It is appreciated that the functionality of Figs. 44, 45A and 45B, taken together, is particularly appropriate to a schedule monitoring system comprising: an at least partially verbal-input interactive toy operative to learn personal information about a child; and toy content operative to actuate the verbal-input interactive toy to present to the child at least one personalized, verbal scheduling prompt based on at least one item of personal information which the interactive toy has learned about the child; and wherein the toy content includes personalized content which at least partially conforms to at least one personal characteristic of the user, the personal characteristic being learned by the user's toy.

A schedule monitoring toy system in accordance with another preferred embodiment of the present invention is now described. Reference is now made to Fig. 46, which is a simplified pictorial illustration of a schedule monitoring toy system comprising a parental input receiving functionality, a schedule item presentation functionality and an anthropomorphic response functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 46, it is seen that a parent of a user inputs schedule items for the user into a computer 5120 by means of a computer keyboard 5122 and monitor 5124. This parental input receiving functionality is typically provided by means of suitable input receiving software run by the computer 5120, which software preferably includes an authorization functionality, whereby only a parent may provide the input.

As seen in Fig. 46, an interactive toy 5126 communicates both verbally and physically with a child, waking up the child, informing the child of the current time and fancifully shaking the child in a human-like fashion, in response to the verbal response made by the child. Upon presenting the schedule items to the user, the toy 5126 informs a parent of the user that the schedule items have been presented to the user.

In the illustrated embodiment, the toy 5126 communicates, typically wirelessly, with the computer 5120, which in turn communicates with a public phone system. Thus, it may be appreciated that the computer 5120 is operative to actuate the toy 5126 to present to the child one or more schedule items previously inputted by a parent of the user, and to inform the parent that the schedule items have been presented to the user.

Reference is now made to Fig. 47, which is a simplified flowchart of the parental input receiving functionality of Fig. 46. A parent of a user chooses a scheduler option on a menu of suitable software run on computer 5120. The computer notifies the parent that a personal password is required in order to view and/or modify schedule items. The parent provides the personal password by means of computer keyboard 5122. The computer verifies the parent's password. If the password provided by the parent is correct, the computer provides a display of schedule items for the user on monitor 5124. The parent updates the schedule items by means of keyboard 5122.

Reference is now made to Fig. 48, which is a simplified flowchart of the schedule item presentation and the anthropomorphic response functionality of Fig. 46. At a time registered by computer 5120, the computer 5120 instructs an interactive toy 5126 to present a schedule item to the user, waking up the user and verbally informing the user of the current time. Toy 5126 is instructed to repeat the presentation of the schedule item for 15 minutes if no verbal response is received from the user.

A verbal response by the user is received by computer 5120 via toy 5126 and recognized by means of speech recognizer software. In response to a negative verbal response by the user, the computer instructs toy 5126 to fancifully shake the user in a human-like fashion. It is appreciated that "shaking the child" is one of many nonverbal actions which the toy may take to assist the user in waking up. Thus, for example, if a particular toy is not capable of shaking a child it may simply shake by itself in a manner which makes a particularly expressive sound. It is further appreciated that the identification of a negative or a positive response by the toy may be performed according to the context implied by the schedule item. For example, the computer may receive from a toy server lists of keywords and phrases appropriate to common schedule contexts, such as the context of "waking up". These keywords and phrases enable the computer to detect whether a schedule item has been accomplished. For example, in the context of waking up, the computer may expect phrases such as "I want to sleep" and "leave me alone" as indicators that a schedule item has not been accomplished. A particular advantage of toy systems is that if the computer fails to recognize a user's response the toy may react in a playful and unpredictable manner. Thus, for example, a user may consider it amusing that a toy continues to insist on waking the user up even though the user has already indicated his intention of getting up.
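The context-dependent response identification described above may be sketched as follows. The function name and the three-way result labels are illustrative assumptions; the phrase lists would be supplied by the toy server per schedule context:

```python
def classify_response(utterance, negative_phrases, positive_phrases):
    """Classify a user's verbal response to a schedule item, as in the
    waking-up scenario: phrases like "I want to sleep" mark the item as
    not accomplished.  Returns "negative", "positive", or
    "unrecognized"; in the unrecognized case the toy may react in a
    playful and unpredictable manner.
    """
    text = utterance.lower()
    if any(p in text for p in negative_phrases):
        return "negative"
    if any(p in text for p in positive_phrases):
        return "positive"
    return "unrecognized"
```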

The computer may also identify the context of a schedule item by a list of recurrent contexts and of typical phrasings related to these contexts. Alternately, a parent may choose a schedule item from a list of recurrent schedule items associated with specific contexts. It is further appreciated that responses of users to common schedule items are obtained by a toy server, thus enabling the server to update lists of recurrent responses and improve the identification of responses.

After a toy has fancifully shaken the user, and/or in a case where a positive response by the user has been received via toy 5126, the computer 5120 checks whether a motion-detector signal has been received via the toy. If no motion-detector signal has been received, and the current time is no later than 15 minutes past the time appropriate to the registered schedule item, then computer 5120 instructs toy 5126 to repeat the presentation of the schedule item to the user.

Upon receiving a motion sensor signal via toy 5126, computer 5120 instructs the toy to present to the user additional registered schedule items. Such schedule items are preferably presented to the user in the course of interaction between the user and the toy 5126. It is therefore appreciated that the computer may receive via the toy a user response to the presentation of the schedule items. Upon receiving a response from the user via the toy 5126, the computer 5120 dials the parent by means of dialer 5127, and personalizes a voice message to the parent, typically in the voice of the toy 5126, informing the parent that the schedule items have been presented to the user. The parent receives the voice message by means of a mobile communicator 5128.

If a user does not respond to the initial schedule item presentation, or if no motion sensor signal is received within 15 minutes past the time appropriate to the initial schedule item, the computer 5120 dials the parent and personalizes a message informing the parent that the user did not get up.
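The repeat-and-report flow of Fig. 48 can be sketched in outline. All callables here are hypothetical stand-ins for the toy presentation, the motion detector, and the dialer of the embodiment:

```python
# Minimal sketch of the Fig. 48 flow, under assumed callback interfaces:
# repeat the schedule-item presentation once a minute until a motion-sensor
# signal arrives, then notify the parent; report failure when the 15-minute
# window expires without motion.
def wake_up_flow(present, motion_detected, notify_parent, window_minutes=15):
    for minute in range(window_minutes):
        present()                      # toy repeats the schedule item
        if motion_detected(minute):    # motion-sensor signal received via toy
            notify_parent("schedule item presented; child is up")
            return True
    notify_parent("child did not get up")
    return False
```

The one-presentation-per-minute cadence is an assumption; the specification fixes only the 15-minute window.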

It is appreciated that the functionality of Figs. 46, 47 and 48 is particularly appropriate to a schedule monitoring toy system comprising: a verbal-input interactive toy; a parental input receiver operative to recognize a parent and to receive therefrom at least one parental input regarding at least one desired schedule item; and toy content actuating the verbal-input interactive toy to present to a child a timely verbal presentation of the at least one desired schedule item.

It is also appreciated that the functionality of Figs. 46, 47 and 48 is particularly appropriate to a schedule monitoring toy system comprising: a verbal-input interactive toy operative to perform speech recognition; and toy content actuating the verbal-input interactive toy to present to a child: at least one timely, interactive verbal scheduling prompt; and at least one anthropomorphic response to recognized speech content produced by a child responsive to the prompt.

A schedule monitoring toy system in accordance with yet another preferred embodiment of the present invention is now described. Reference is now made to Fig. 49, which is a simplified pictorial illustration of a schedule monitoring toy system comprising child locating functionality and verbal prompt delivery functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 49 it is seen that a mobile interactive toy 5130 searches for a user in a number of rooms and, upon detecting the presence of the user in a particular room, delivers to the user a verbal prompt for a schedule item. This search and delivery operation may be initiated by a suitable computer 5132 which provides instructions to the toy 5130 by means of typically wireless communication therewith, which instructions may be based on one or more schedule items registered by the computer 5132.

As seen in Fig. 49, the toy 5130 moves from one room to another until the presence of the user is sensed. For example, the computer 5132 registers the position of the toy 5130 along a predetermined route passing through a series of rooms, and is therefore capable of instructing toy 5130 to move from one room to another until the toy completes the series of rooms. Preferably, the toy 5130 is equipped with an obstacle avoidance apparatus, such as an ultra-sound transceiver based obstacle avoidance device, which allows the toy 5130 to bypass obstacles and return to its predetermined route.

In the illustrated embodiment, toy 5130 is equipped with an IR receiver 5131, and the child wears a diffuse IR transmitter 5133. When the toy 5130 arrives at the room where the child is present, a diffuse IR signal, transmitted by the IR transmitter 5133 worn by the child, is received by the IR receiver 5131 on board toy 5130. Thus, it may be appreciated that upon physically approaching the child, the toy 5130 is operative to deliver to the child a prompt for a verbal schedule item, which the computer 5132 provides to the toy 5130 by means of wireless communication therewith.

Reference is now made to Fig. 50, which is a simplified flowchart of the child locating and verbal prompt delivery functionality of Fig. 49. A computer 5132 registers a schedule item for a child. At a time appropriate to the schedule item registered by computer 5132, the computer instructs an interactive toy 5130 to commence a search and prompt delivery operation. The computer typically registers the position of the toy 5130 along a predetermined route, passing through a series of rooms in a working area, and is therefore capable of instructing the toy 5130 to move from one room to another until toy 5130 completes the series of rooms.

The child wears a diffuse IR transmitter. If the toy 5130 is located in the same room where the child is present, an IR signal transmitted by the IR transmitter worn by the child is received by an IR receptor on the toy 5130. The IR receptor on the toy 5130 is insensitive to IR signals below a given amplitude. Thus, a diffuse IR signal from another room is typically ignored by the toy 5130. It may therefore be appreciated that the toy 5130 is capable of communicating to the computer 5132 an IR receptor signal if and only if the toy 5130 is located in the same room where the child is present.

Upon receiving an IR receiver signal from the toy 5130 by means of typically wireless communication therewith, the computer 5132 instructs the toy 5130 to deliver to the user a verbal prompt for the schedule item in question. If no IR signal is received from the toy 5130, the computer 5132 instructs the toy 5130 to move to the next room along the predetermined route. If no IR receiver signal has been received from the toy 5130 in the course of moving through the whole series of rooms, the computer 5132 informs a user, such as a parent, that the child has not been located. For example, the computer 5132 dials the number of a parent's mobile phone and informs the parent that the child has not been located.
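The search-and-prompt loop of Figs. 49 and 50 can be sketched as follows. The sensor and messaging callables are assumptions, standing in for the toy's IR receptor and the computer's communication with the toy and the parent:

```python
# Illustrative sketch of the room-by-room search of Figs. 49-50, under
# assumed callback interfaces: the toy visits each room on a predetermined
# route, prompts the child where the diffuse IR signal is picked up, and
# reports failure to the parent if the route is exhausted.
def locate_child(route, ir_signal_in, deliver_prompt, inform_parent):
    for room in route:               # computer instructs toy room by room
        if ir_signal_in(room):       # IR receptor fires only in the child's room
            deliver_prompt(room)     # verbal prompt for the schedule item
            return room
    inform_parent("child has not been located")
    return None
```

The amplitude threshold that confines detection to a single room is modeled here simply as the boolean `ir_signal_in` test.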

In the case that a prompt for a schedule item has been delivered to the child, the computer waits for a child response to be received via the toy 5130. If no response is received in the course of a predetermined time lapse, computer 5132 informs a parent that the child has been located in a specified room and might be hiding. If a child's response via the toy 5130 is received by the computer 5132, the computer 5132 informs the parent of the child that the child has received a prompt for the schedule item in question. It is appreciated that the functionality of Figs. 49 and 50 is particularly appropriate to a schedule monitoring toy system comprising: a mobile, verbal-input interactive toy; a scheduler operative to receive an input regarding at least one schedule item; a child locator operative to locate a child within a predetermined area; and a prompter operative, at a time appropriate to the at least one schedule item, to locate a child and to deliver at least one verbal prompt for the at least one schedule item; and wherein the prompter is operative to physically approach the child.

A schedule monitoring toy system, in accordance with another preferred embodiment of the present invention, is now described. Reference is now made to Fig. 51, which is a simplified flowchart of a schedule monitoring toy system comprising authorized free-time activity prompting functionality and a schedule functionality, in accordance with a preferred embodiment of the present invention. Turning to Fig. 51, it is seen that a user tells toy 5350 that he has finished his homework. Computer 5351 identifies the keywords "finished" and "homework", which imply that the user is free of obligations. Computer 5351 then checks the user's schedule. If the current time was initially reserved on a schedule for homework, or if it was not scheduled for any activity, computer 5351 checks further whether the time remaining to the next scheduled item is longer than 15 minutes. If the time remaining is longer than 15 minutes, computer 5351 initiates a search in a free-time database, in order to recommend a free-time activity to the user. If it is shorter than 15 minutes, computer 5351 reminds the user of his next scheduled task.

It is appreciated that computer 5351 may initiate a search for a free-time activity recommendation without any input from a user, when the user's schedule indicates that during the current time the user has no obligations. In order to find an appropriate free-time activity, computer 5351 checks the free-time database of the user. Computer 5351 retrieves from the database all the activities which are currently feasible. The feasibility of an activity is typically determined by its designated time of performance, by its estimated duration (i.e. whether it is shorter than the current duration of free-time of a user), and by its designated location (i.e. whether it is the user's current location). After retrieving a list of all possible free-time activities, computer 5351 organizes the list according to the urgency of each activity. The urgency is typically determined by the number of future occasions wherein the user would be able to perform the activity. To that purpose, computer 5351 compares the user's schedule with the designated times for each free-time activity on the list. Computer 5351 then counts the coming days wherein a user would possibly be able to perform the activity, as implied by the user's schedule. Starting from the most urgent activity (i.e. the one with the least number of days left for the user to perform it), computer 5351 delivers activity recommendations to the user. Computer 5351 sends a message regarding an activity to toy 5350, and toy 5350 verbalizes the message to the user. If the user accepts a recommendation, computer 5351 retrieves from the database instructions for performing the activity. If a user does not accept a recommendation, computer 5351 moves to the next activity on the list.
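The feasibility-then-urgency selection above can be sketched as a filter followed by a sort. The record fields (`hours`, `duration`, `location`, `occasions_left`) are hypothetical names for the designated times, estimated duration, designated location, and remaining-occasions count described in the text:

```python
# Sketch of the free-time recommendation logic, under assumed field names:
# keep only activities feasible now, then order them most urgent first,
# urgency being the number of future occasions left to perform the activity.
def feasible(activity, now_hour, free_minutes, location):
    return (now_hour in activity["hours"]              # designated time of performance
            and activity["duration"] <= free_minutes   # fits the current free-time
            and activity["location"] == location)      # designated location matches

def recommend(activities, now_hour, free_minutes, location):
    candidates = [a for a in activities
                  if feasible(a, now_hour, free_minutes, location)]
    # fewest remaining occasions first, i.e. most urgent first
    return sorted(candidates, key=lambda a: a["occasions_left"])
```

Computer 5351 would then offer the activities in the returned order, moving to the next one whenever the user declines.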
It is appreciated that the list of possible activities may be prioritized according to several parameters, such as the number of times a user has already performed an activity, the number of times an activity was suggested to a user, and a measure of importance defined by an authorized source such as a parent.

Reference is now made to Figs. 52A and 52B, which are simplified tables respectively illustrating a typical schedule record of a user and a simplified free-time database. Items in a schedule are received from authorized sources, such as a parent of a child user. It is appreciated that a multiplicity of sources may be authorized to input data into a user's schedule. It is further appreciated that a hierarchy of authorizations may be defined for various sources. For example, a parent of a child user may allocate specific times to teachers of the child user so as to enable the teachers to register homework assignments into the user's schedule. A schedule record includes schedule items and the times reserved for them. It includes both daily items 5360, such as school, and one-time items 5361. Fig. 52B is a table showing a simplified free-time database of a user. It includes activities 5362 and specifications for each activity. The specifications include designated times 5363 for performing each activity and a time duration 5364 for each activity. An activity record also indicates whether an activity may be repetitive 5365 (i.e. whether a toy should recommend it after a user has already performed it at least once). The record may also include an indication of a paging location 5366 wherein a toy may deliver a recommendation for an activity. A record also includes instructions 5367 for performing an activity, the instructions being delivered to a user once a user accepts a recommendation. Data in a free-time database are received from an authorized source, such as a parent of a child user, or from a multiplicity of authorized sources.
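One possible in-memory shape for a Fig. 52B activity record, using the fields enumerated above (activity 5362, designated times 5363, duration 5364, repetitive flag 5365, paging location 5366, instructions 5367), might look as follows; the concrete representation and the sample values are assumptions:

```python
# Hypothetical free-time database record mirroring the Fig. 52B columns;
# reference numerals in the comments tie each field to the figure.
free_time_record = {
    "activity": "feed the fish",                      # 5362
    "designated_times": ["07:00", "19:00"],           # 5363
    "duration_minutes": 5,                            # 5364
    "repetitive": True,                               # 5365: may be recommended again
    "paging_location": "living room",                 # 5366
    "instructions": "Two pinches of flakes, then switch off the tank light.",  # 5367
}
```

A schedule record of Fig. 52A would analogously pair each item (daily 5360 or one-time 5361) with its reserved time.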

It is appreciated that the functionality of Figs. 51, 52A and 52B taken together is particularly appropriate to a schedule monitoring toy system comprising: a verbal-input interactive toy; a schedule input receiver operative to receive, from at least one authorized source, information regarding a plurality of schedule items; a free-time database operative to receive, from at least one authorized source, information regarding at least one free-time activity authorized to be performed by a child during his free-time; and toy content actuating the verbal-input interactive toy to present to the child: a timely verbal presentation of each of the plurality of schedule items; and a verbal presentation, presented at a time not occupied by any of the plurality of schedule items, prompting the child to perform at least one of the free-time activities.

A follow-me toy system, in accordance with a preferred embodiment of the present invention, is now described. Reference is now made to Fig. 53, which is a simplified pictorial illustration of a toy that follows a user. The toy includes a detection and navigation unit 5390 which detects a user, and determines the direction of movement of the toy accordingly. The detection and navigation unit 5390 includes two IR receivers 5400 and 5401 operative to receive unique IR signals from transmitters carried by a user or embedded in the user's clothes, typically in a shoe sole. It also includes a wheel or a number of wheels 5405. The unit as a whole rotates in relation to the body of the toy, by the power of a motor 5404 hidden in the body of the toy. The toy also includes a motor 5391 turning the back wheels and moving the toy, and a processing unit 5407 that receives information from IR receivers 5400 and 5401, and accordingly controls the operation of the two motors 5391 and 5404.

Fig. 54 is a simplified diagram of the detection and navigation unit 5390. The unit includes the front wheel or wheels 5405 of a toy, located on a rotating disk connected to a motor 5404, thus enabling a processing unit 5407 to control the direction of movement of the toy. It also includes the two IR receivers 5400 and 5401, located at the end of two identical cone shaped grooves 5402 and 5403 with parallel central axis lines.

Fig. 55 is a simplified diagram illustrating the detection and navigation functionality of the mechanism described in Fig. 54. In the figure, the two IR receivers 5400 and 5401 are located 10 cm from one another. Both receivers are situated at the end of cone shaped grooves with an angle of 6°. The grooves create for each receiver a limited field of vision 5412 and 5413. The two fields of vision thus intersect at a point 5410 approximately 50 cm from the front of the toy. A shoe sole 5414 of a user is equipped with four IR transmitters 5415, transmitting a unique IR signal receivable from any direction around the user. The movement of the toy and the rotation of the detection and navigation unit 5390 are determined by the signals received, in such a way as to constantly bring the toy near the user, so that the user remains at the intersection point 5410 relative to the toy.

Fig. 56 is a simplified flowchart illustrating the navigation functionality of the mechanism of Fig. 54. If both receivers 5400 and 5401 receive an IR signal from a user, then the user is in the overlap area of the receivers' fields of vision 5412 and 5413, which means that the user is more than 50 cm away from the toy, and that the detection and navigation unit 5390 of the toy is facing the general direction of the user. The processing unit 5407 thus orders the toy to move forward in its current direction. If the receivers do not both receive the signal, the processing unit halts the toy and starts a procedure of finding a new direction for the toy. If right receiver 5401 receives the signal, detection and navigation unit 5390 rotates to the left, until left receiver 5400 receives the signal. If right receiver 5401 does not receive the signal, the detection and navigation unit 5390 rotates right. The outcome of the procedure as a whole is that the toy moves when both receivers 5400 and 5401 receive an IR signal from the user, and in any other case the toy halts and checks if a change in direction is needed.
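The Fig. 56 decision rule for one control cycle can be stated compactly. The command names are hypothetical labels for the motor actions of unit 5390:

```python
# Sketch of the per-cycle decision rule of Fig. 56, with assumed command
# names: move forward only when both IR receivers see the user's signal;
# otherwise halt and rotate (left if only the right receiver 5401 fires,
# right if the right receiver does not fire).
def navigation_step(left_sees: bool, right_sees: bool) -> str:
    """Return the command for one control cycle of detection unit 5390."""
    if left_sees and right_sees:
        return "forward"        # user is in the overlapping field of vision
    if right_sees:
        return "rotate_left"    # rotate until left receiver 5400 also fires
    return "rotate_right"       # sweep right until the signal is reacquired
```

As in the flowchart, the toy is implicitly halted in both rotation branches; only the two-receiver case produces forward motion.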

In accordance with another preferred embodiment of the present invention, all the wheels of a toy are connected to the detection and navigation unit through a transmission unit that directs them in the same direction as the IR receivers, thus creating a smoother movement of the toy. In such a case a toy may also keep following a user when the user steps over the toy and proceeds in a direction opposite to the direction the toy is currently facing.

In accordance with another preferred embodiment of the present invention, a toy also has the capability of following a user when he walks beyond obstacles such as a turn in a corridor. In such cases the processing unit also tracks the time since the last signal was received. If that time exceeds a predetermined length, such as 1 second, the detection and navigation unit returns to its state at the time the last signal was received, and the toy moves a predetermined distance in that direction, such as 1 meter, and then starts the procedure of locating the user again.

It is appreciated that the functionality of Figs. 53, 54, 55 and 56, taken together, is particularly appropriate to a follow-me toy system comprising: a mobile toy; and a user- following mechanism tracking the user and guiding the toy to follow the user as the user moves within a working area.

A networked diary toy system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 57, which is a simplified pictorial illustration of a networked diary toy system comprising networked diary data storage functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 57 it is seen that a parent enters an item on a child user's schedule via a computer 5480, at his/her place of work, such as an office, connected to the Internet. A server 5481 delivers schedule updates to the user's computer 5482. 15 minutes before the designated time of the scheduled item, toy 5483 verbalizes a message to the user regarding the scheduled item.

Reference is now made to Fig. 58, which is a simplified flowchart of the network interface connection functionality of Fig. 57. An authorized user, such as a parent of a child user, accesses server 5481 via computer 5480 via the Internet. Server 5481 verifies the identity of the authorized user by a password the user enters via a keyboard. Alternately, verification may be performed by a verbalized password that server 5481 compares with a voice imprint of the user. After verification, server 5481 allows the authorized user access to an on-line schedule of another user, such as a child of the authorized user. The authorized user updates the schedule by data entry to the server via a keyboard. Alternately, data entry may be performed by speech, utilizing a speech recognition mechanism. Server 5481 establishes communication with the user's computer 5482. Server 5481 delivers schedule updates to the user's computer 5482. 15 minutes before a schedule item, the user's computer 5482 sends a message to toy 5483 regarding the schedule item. Typically, toy 5483 verbalizes the message to the user.
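The verify-then-update path of Fig. 58 might be modeled as follows. The class, the password store, and the method names are assumptions; the point is only the ordering of verification before any schedule entry is accepted:

```python
# Hedged sketch of the Fig. 58 update path: an authorized user is verified
# by password, and only then is the entry written to the child's on-line
# schedule for delivery to the child's computer. Names are hypothetical.
class ScheduleServer:
    def __init__(self, passwords):
        self.passwords = passwords   # authorized user -> password
        self.schedules = {}          # child -> list of (time, item)

    def update(self, user, password, child, time, item):
        if self.passwords.get(user) != password:
            return False             # verification failed; entry rejected
        self.schedules.setdefault(child, []).append((time, item))
        return True                  # update ready for delivery to computer 5482
```

Voice-imprint verification and speech-based data entry, mentioned as alternates above, would replace only the password check and the entry step, not the overall ordering.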

It is appreciated that a user's on-line schedule may be updated by a multiplicity of sources via a multiplicity of computers, for example, via a user's computer and/or via a computer of another authorized user. Various rules may be applied to cases of conflict between attempted entries from different sources, such as overriding a prior entry, rejecting an entry conflicting with a prior entry, or accepting an entry from a source higher in a predefined hierarchy of sources.

It is appreciated that the functionality of Figs. 57 and 58 taken together is particularly appropriate to a networked diary toy system comprising: a verbal-input interactive toy; a network interface connecting the verbal-input interactive toy to a computer network including at least one networked computer; a diary database storing at least one diary item for an individual user; and verbal content for presenting diary items from the diary database, wherein at least a portion of the verbal content is stored on the at least one networked computer and arrives at the verbal-input interactive toy via the network interface.

A speech-responsive networked diary toy system, in accordance with a preferred embodiment of the present invention, is now described. Reference is now made to Fig. 59A, which is a simplified partly pictorial partly schematic illustration of a speech responsive networked diary toy system comprising a speech recognition unit residing in a networked computer, and a diary item actuated in response to a user utterance, in accordance with a preferred embodiment of the present invention. Turning to Fig. 59A, it is seen that as a user passes near a toy store, a toy 5590 asks the user where he is going. The user tells toy 5590 that he is going into the store. Toy 5590 sends the user's digitized utterance to a toy server 5593 via the Internet, via a WAP server 5592 and via a cellular network. Alternately, toy 5590 performs part of the processing needed for speech recognition, such as digital signal processing, and sends to toy server 5593 the outcome of the processing. Toy server 5593 processes the user's utterance, and sends a response to toy 5590, reminding the user that he has a prior obligation in a half-hour. It is appreciated that toy server 5593 sends toy 5590 a reminder of the obligation to be verbalized to the user while the user is in the store.

Reference is now made to Fig. 59B, which is a simplified block diagram illustration of the speech recognition and response generation of Fig. 59A. A user's utterance is picked up by a toy 5590. Toy 5590 digitizes the utterance and sends it to toy server 5593, typically with other sensory inputs. A speech recognition unit 5594 in the server 5593 converts the utterance to text. A toy content script processor 5596 processes the text in accordance with data in database 5595 regarding the user's schedule, and produces content for toy 5590. A content processor 5597 converts the toy content to a response and sends the response to toy 5590. Toy 5590 verbalizes the response to the user.

Reference is now made to Fig. 59C, which is a simplified flowchart illustrating the response actuation functionality of Fig. 59A. Toy server 5593 detects that the next schedule item is in a half-hour. Toy server 5593 initiates a procedure for identifying possible disturbances to the schedule. Toy server 5593 tracks the user's location by information received from the cellular phone network. Alternately, toy server 5593 tracks the user's location by some other positioning system such as GPS. If a user is not already at the scheduled location, toy server 5593 checks whether he is near a possible attraction, such as a toy store. A map of such possible attractions is stored on the server. The map is possibly personalized according to a user profile and history. If the user is near a possible attraction, toy server 5593 sends a question to toy 5590 and the toy asks the user where he is going. The user's answer is sent to toy server 5593 and converted to text. Toy server 5593 identifies a pattern in the user's answer, namely "I am going into X", the pattern implying that the user's answer contains the user's destination. Since, in this case, the user's answer does not contain the words "piano" or "lesson", it is assumed that the user plans to go to a place other than that where his scheduled piano lesson is supposed to be. As a response, toy server 5593 produces a schedule reminder and sends it to toy 5590. Toy 5590 verbalizes the reminder to the user.
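The "I am going into X" pattern check above might be sketched with a simple regular expression. The regex and the schedule-keyword matching are illustrative assumptions, not the server's actual recognizer:

```python
# Illustrative destination-conflict check for the Fig. 59C flow, under an
# assumed "I am going (in)to X" pattern: extract the stated destination and
# flag a conflict when none of the scheduled item's keywords appear in it.
import re

def destination_conflicts(utterance, schedule_keywords):
    m = re.search(r"i am going (?:in)?to (?:the )?(.+)", utterance.lower())
    if not m:
        return None                  # pattern not found; no inference made
    destination = m.group(1)
    return not any(k in destination for k in schedule_keywords)
```

A `True` result would trigger the schedule reminder; `None` leaves the server without a basis for inference, as when the user's answer does not match the pattern.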

It is appreciated that the speech recognition and response generation functionality of the system illustrated in Figs. 59A, 59B and 59C is especially useful if the toy does not include speech recognition functionality, and similarly if the ISP does not include speech recognition functionality. It is appreciated that a similar method of providing speech recognition functionality to a toy by a server over a network is especially useful for a system comprising a toy communicating with a computer network via a set-top-box, or via a phone line or a DSL connection.

It is appreciated that the functionality of Figs. 59A, 59B and 59C taken together is particularly appropriate to a speech-responsive networked diary toy system comprising: a toy; a network interface connecting the toy to a computer network including at least one networked computer; a diary database storing at least one diary item for an individual user; a speech-recognition unit residing at least in part in the at least one networked computer and communicating with the toy via the network and the network interface; and toy content actuating the toy to present at least one diary item responsive to user utterances recognized by the speech recognition unit.

A supervised networked organizer system, in accordance with a preferred embodiment of the present invention, is now described. Reference is now made to Fig. 60, which is a simplified pictorial illustration of a supervised networked organizer system comprising an organizer functionality involving multiple users and a supervision functionality. Turning to Fig. 60, it is seen that a first user tells toy 5700 that he would like to go to a movie with a friend of his. Toy 5700 sends the digitized utterance to computer 5701. Computer 5701 sends a request to a toy server 5702 via the Internet. Toy server 5702 notifies a parent of the second user of the first user's request by means of a phone 5705. The parent of the second user authorizes the request. Toy server 5702 sends toy 5703 of the second user a message via the Internet via the second user's computer 5704. Toy 5703 verbalizes a message to the second user, notifying him that the first user would like to go to a movie with him. The second user agrees to go with him. Computer 5704 of the second user sends toy server 5702 a message confirming that the second user agrees to go to a movie with the first user. Toy server 5702 selects a movie appropriate to the first and second users. Toys 5700 and 5703, of the first user and second user respectively, notify the users of the movie. Both users agree to go to the movie. Toy server 5702 notifies the users via toys 5700 and 5703 of a meeting place and time.

Reference is now made to Fig. 61, which is a simplified flowchart of the organizing and supervision functionality of Fig. 60. A first user tells a toy 5700 that he would like to go to a movie with a second user. Computer 5701 recognizes the second user's nickname "Billy". The nickname is registered in the first user's record in an organizer system, and is associated with a user ID, enabling a toy server 5702 to identify and approach the second user.
Computer 5701 also recognizes the keywords "movie" and "go", which imply that the first user wants to go to a movie with the second user. Computer 5701 sends toy server 5702 the first user's request. Toy server 5702 establishes a phone connection with a parent of the second user who is registered as a supervisor in the organizer system. Toy server 5702 delivers to the parent an automated message via phone 5705, notifying him of the first user's organizing request. Alternately, toy server 5702 may deliver the user's request, as picked up by toy 5700, to the parent of the second user, typically by phone. If the parent overrides the request, the first user is notified. If, however, the parent authorizes the request, toy server 5702 sends a message to toy 5703 of the second user, notifying him of the first user's request. If the second user agrees to the request, the toy server 5702 proceeds to select a movie. The toy server includes a database of movies and movie theaters. The toy server selects a list of movie performances appropriate to the first and the second user, in movie theaters near the users' locations. An organizing system utilizes a networked diary functionality, as illustrated in Figs. 57 and 58, in order to select movie performances in accordance with the users' free-time. The system may also utilize the functionality in order to screen out of a selected list the movies which the users have already seen. Alternately, an organizing system utilizes schedule data stored in the respective users' computers in order to select movie performances in accordance with the users' free-time. It is appreciated that an organizing system also utilizes a free-time database functionality, as described in Figs. 51 and 52, in order to select a list of movies appropriate to both users. In such a case, toy server 5702 selects movies that appear in the free-time databases, such as illustrated in Fig. 52, of both users.
It is also appreciated that toy server 5702 utilizes user profiles to select appropriate movies. After selecting a list of appropriate movies, toy server 5702 sends a message to toy 5703 of the second user. Toy 5703 verbalizes the message to the user, suggesting to him a specific performance of a movie. If the user agrees, toy server 5702 sends a message to toy 5700 of the first user, suggesting the performance to the first user. If the first user agrees, toy server 5702 sets a time and a place for a meeting of both users, typically a half-hour before the performance at the appropriate movie theater. Toys 5700 and 5703 tell the users the meeting place and time. If the users do not agree on a film, the toy server moves on to the next film on the selected list. It is appreciated that the functionality of Figs. 60 and 61 taken together is particularly appropriate to a supervised networked organizer system comprising: an organizer subsystem operative to perform at least one organizing function involving multiple individual users; and a supervision subsystem storing at least one supervisor identity and automatically providing to each individual supervisor inputs from the organizer system.
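The movie-selection step described above amounts to intersecting the two users' free-time movie lists and removing films either has seen. A minimal sketch, with hypothetical data shapes:

```python
# Sketch of toy server 5702's candidate selection, under assumed inputs:
# keep only movies in both users' free-time databases, then screen out
# movies either user has already seen. Sorted output stands in for the
# server's ordering of suggestions.
def select_movies(db_user1, db_user2, seen_user1, seen_user2):
    common = set(db_user1) & set(db_user2)
    return sorted(common - set(seen_user1) - set(seen_user2))
```

The server would then offer the candidates one at a time via toys 5703 and 5700, moving to the next film whenever a user declines.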

It is also appreciated that the functionality of Figs. 60 and 61 taken together is particularly appropriate to a supervised networked organizer system as the aforementioned and wherein the organizer subsystem includes multiple interactive toys associated with the multiple individual users;

It is further appreciated that the functionality of Figs. 60 and 61 taken together is particularly appropriate to a supervised networked organizer system as the aforementioned and wherein at least some of the multiple individual users are children and wherein at least one of the individual supervisors is a parent of at least some of the children; and

Furthermore, it is appreciated that the functionality of Figs. 60 and 61 taken together is particularly appropriate to a supervised networked organizer system as the aforementioned and wherein the organizer subsystem includes override functionality which enables the individual supervisor to override inputs received by the organizer subsystem from at least one of the multiple individual users.

A child-messaging toy system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 62, which is a pictorial illustration of a child-messaging toy system comprising proximity indicating functionality and an annunciator requesting that a user come into proximity to a toy, in accordance with a preferred embodiment of the present invention. The toy system also includes timed message-delivery functionality and failure reporting functionality, reporting to a sender of a time-specific message that the message has not been delivered at the specified time. Turning to Fig. 62, it is seen that a parent of a user sends a message to the user via a computer in an office. Toy server 6004 sends the message to the user's computer 6003. At the specified time, computer 6003 detects that the user is not in propinquity with toy 6002. For the propinquity detection, computer 6003 utilizes a watch 6001 worn by the user, comprising an RF transceiver, an IR transceiver and a heat sensor. Toy 6002 calls the user, telling him that he has a message. The user comes into propinquity with toy 6002. Computer 6003 detects the propinquity of the user with toy 6002. Toy 6002 verbalizes the message to the user.

Reference is now made to Fig. 63, which is a simplified flowchart of the message delivery functionality and the failure reporting functionality of Fig. 62. A parent of a user sends an email message to the user, to be delivered at a specified time. Alternately, the message is delivered via a proprietary toy messaging system. Toy server 6004 sends the message to the user's computer 6003. Five minutes before the designated time, computer 6003 checks whether the user is in propinquity with toy 6002. If the user is in propinquity with toy 6002, the computer sends the parent's message to toy 6002 and the toy verbalizes the message to the user. Otherwise, toy 6002 calls the user's name and requests that the user approach the toy and hear a message. Toy 6002 repeats calling the user's name every minute until the user comes into propinquity with toy 6002. If the designated time to deliver the message has passed without the message being delivered, computer 6003 sends toy server 6004 a failure message and toy server 6004 informs the parent by e-mail of the failure to deliver the message on time. Alternately, the toy server informs the parent via a proprietary messaging system or by an automated phone call.
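The timed-delivery loop of Fig. 63 can be sketched as follows: start checking a few minutes before the designated time, retry every minute, and report failure to the sender once the deadline passes. The function and parameter names are assumptions for illustration, not the patent's API; `is_near(minute)` stands in for the propinquity check of Fig. 64.

```python
def deliver_timed_message(message, designated_minute, is_near,
                          lead_minutes=5, retry_minutes=1):
    """Simulate minute-by-minute delivery attempts.

    Returns ("delivered", minute) on success, or
    ("failed", designated_minute) if the deadline passes.
    """
    minute = designated_minute - lead_minutes
    while minute <= designated_minute:
        if is_near(minute):
            return ("delivered", minute)      # toy verbalizes the message
        # toy calls the user's name, then waits before retrying
        minute += retry_minutes
    return ("failed", designated_minute)      # server notifies the parent
```

On the "failed" outcome, the real system would e-mail the parent or place an automated phone call, as described above.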

Reference is now made to Fig. 64, which is a simplified flowchart of the propinquity indication functionality of Figs. 62 and 63. Computer 6003 sends watch 6001 an RF signal ordering watch 6001 to send an acknowledgment signal. If such an acknowledgment signal is not received by computer 6003, it is implied that the user is not within reach, typically not at home. If the signal is received by computer 6003, the computer orders the user's watch 6001, by an RF signal, to check whether watch 6001 is worn by the user. Watch 6001 checks whether it is worn by the user by means of a heat sensor it includes, designed to detect body heat. If watch 6001 responds with a signal indicating that it is worn by the user, computer 6003 orders the watch to emit a unique IR signal. Computer 6003 then queries toy 6002 whether the IR signal was received by toy 6002. If the signal was received, the user is in propinquity with toy 6002. Otherwise, the user is not in propinquity with toy 6002. If watch 6001 is not worn by the user, it is implied that the user is only possibly within reach. Computer 6003 then reverts to other methods of propinquity detection. It is appreciated that a messaging toy system may apply the method described in Fig. 64 with a multiplicity of personal objects comprising RF and IR transceivers and sensors indicating that a user carries them, such as heat sensors or accelerometers, thus increasing the probability that at least one object is carried by the user at any time. Such objects may be a ring, a necklace, eyeglasses, or a garment. In such a case, computer 6003 sequentially accesses each of the multiplicity of personal objects, until an object carried by the user is found. It is also appreciated that by tracking the multiplicity of objects a computer may improve the working of the system, for example by identifying which objects a user tends to carry and which objects a user tends to take when he leaves home.
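The sequential polling of personal objects described above can be sketched as a simple loop. This is a hedged illustration under stated assumptions: each object is modeled as a dictionary, and `toy_sees_ir` stands in for querying the toy about the object's unique IR signature; the real system would issue actual RF and IR queries.

```python
def find_carried_object(objects):
    """Return the first object that answers RF and reports being worn."""
    for obj in objects:
        if not obj.get("rf_ack"):
            continue            # object out of RF reach
        if obj.get("worn"):     # heat sensor / accelerometer says "carried"
            return obj
    return None

def user_near_toy(objects, toy_sees_ir):
    """Three-valued result: True / False / None (undetermined)."""
    obj = find_carried_object(objects)
    if obj is None:
        # no object answered, or none is worn: fall back to other methods
        return None
    # ask the object to emit its unique IR signal, then query the toy
    return toy_sees_ir(obj["ir_id"])
```

The `None` branch corresponds to the case where the computer reverts to other methods of propinquity detection.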

It is appreciated that the functionality of Figs. 62, 63 and 64 taken together is particularly appropriate to a child-messaging toy system comprising: a verbal-input interactive toy including child propinquity indicating functionality; a message database operative to accept at least one message to be delivered to a child whose propinquity to the toy is indicated to exist; a message delivery controller including: an audible annunciator operative to provide a personalized audible output to the child requesting that the child come into propinquity with the toy; and a message output generator, operative in response to an indication of propinquity of the child to the toy for providing at least one message from the message database to the child.

It is also appreciated that the functionality of Figs. 62, 63 and 64 taken together is particularly appropriate to a child-messaging toy system comprising: a verbal-input interactive toy including child propinquity indicating functionality; a timed message database operative to accept at least one time-specific message to be delivered to a child whose propinquity to the toy is indicated to exist at at least one predetermined time; and a message output generator, operative in response to an indication of propinquity of the child to the toy for providing at least one time-specific message from the timed message database to the child.

It is further appreciated that the functionality of Figs. 62, 63 and 64 taken together is particularly appropriate to a child-messaging toy system as the aforementioned and also comprising a message delivery indication that the time-specific message has not been delivered to the child at the predetermined time.

A virtual parenting toy system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 65, which is a simplified pictorial illustration of a virtual parenting toy system comprising a game offered to a user as an incentive to perform a task and a compliance monitor accepting an indication that a user has performed a task, in accordance with a preferred embodiment of the present invention. Turning to Fig. 65, it is seen that a toy 6311 suggests that a user play a riddle game with toy 6311. The user agrees to play the game. Toy 6311 then makes a condition that in order to play the game, the user must first finish a homework task, such as reading a chapter in a history book. A while later, the user tells toy 6311 that he finished reading the chapter. Toy 6311 asks the user a question concerning the chapter; if the user provides a correct answer, it implies that the user has indeed read the chapter. If the user answers correctly, toy 6311 offers to play the riddle game with the user.

Reference is now made to Fig. 66, which is a simplified flowchart of the prompting and compliance monitoring functionality of Fig. 65. Computer 6310 detects that a user is in propinquity with toy 6311. Detection is done by one of a multiplicity of methods: by detecting the touch of the user on the toy, by detecting the user's speech, or by the method described in Fig. 64. Computer 6310 checks a tasks database and retrieves a waiting task. The database is typically managed by a parent of the user. It is appreciated that other sources as well, such as teachers, might be authorized to enter data into the database. Computer 6310 selects a game as an incentive for the user to perform the task. Toy 6311 offers the user to play the game. If the user agrees, toy 6311 conditions playing the game on the user performing the task. A while later, the user tells toy 6311 that he has completed the task. Computer 6310 identifies the keywords "finished" and "play", implying that the user has completed the task and wants to play the game. Toy 6311 retrieves from the tasks database a question associated with the task; a correct answer to the question implies that the user has indeed completed the task. It is appreciated that a toy may be incorporated in the learning and school issues of a user, including receiving regular school task updates. Such tasks may contain homework assignments and associated indicators that the assignments are completed. If the user answers the question correctly, toy 6311 proceeds to play the game with the user.
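The compliance monitor above combines keyword spotting with a task-verification question. A minimal sketch follows; the task record, question and answer are invented for illustration, and the keyword list follows the text ("finished" and "play").

```python
TASKS = {  # illustrative database entry, normally managed by a parent
    "history chapter": {
        "question": "Who crossed the Alps with elephants?",
        "answer": "hannibal",
    },
}

def wants_to_play(utterance):
    """Keyword spotting as in Fig. 66: 'finished' plus 'play'."""
    words = utterance.lower().split()
    return "finished" in words and "play" in words

def may_play(task_name, child_answer):
    """Unlock the game only if the verification question is answered."""
    task = TASKS[task_name]
    return child_answer.strip().lower() == task["answer"]
```

In the described system, a correct answer implies the task was completed, and only then does the toy proceed to the game.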

It is appreciated that the functionality of Figs. 65 and 66 taken together is particularly appropriate to a virtual parenting toy system comprising: a verbal-input interactive toy operative to play at least one game with a child, the verbal-input interactive toy including verbal-input interactive toy content operative to actuate the verbal-input interactive toy to present to the child: at least one verbal prompt to perform at least one task; and at least one verbal offer to play the at least one game with the child once the at least one task is performed; and a compliance monitor operative to accept at least one indication that the at least one task has been performed and, in response to the indication, to actuate the at least one game.

A virtual parenting toy system in accordance with another preferred embodiment of the present invention is now described. Reference is now made to Fig. 67, which is a simplified pictorial illustration of a virtual parenting toy system. The system includes a child-want indication-recognizing functionality, a child-want satisfying functionality partly controlled by inputs received other than from the user and providing the user advertisement content related to the user's wants, a preference eliciting functionality and a transactional functionality, in accordance with a preferred embodiment of the present invention. Turning to Fig. 67, it is seen that, as part of a user interaction with a toy 6500, the user tells toy 6500 that he is hungry. A computer 6501 identifies a user's want. Computer 6501 sends a message to a toy server 6502 regarding the user's want. Toy server 6502 selects possible products that might satisfy the want. Toy server 6502 sends a message to a parent at his place of work, such as at an office, specifying the user's want, and products possibly appropriate to satisfy the want. The parent selects authorized products via a computer 6503 in the office. Toy server 6502 sends the authorized list to the user's computer 6501. Toy 6500 offers the user to select a product or products from the list. After the user selects various products, the user's computer 6501 orders the products via toy server 6502 and performs the transaction involved.

Reference is now made to Fig. 68, which is a simplified flowchart of the want indication recognizing, want satisfying, authorization request and preference eliciting functionality of Fig. 67. During an interaction of the user with toy 6500, computer 6501 identifies an irregular behavior pattern that might be indicative of a user's want. Computer 6501 initiates a want identifying procedure wherein toy 6500 asks the user what the reasons for his behavior are. The user's answers to the question lead to a specific want, identified by computer 6501. Computer 6501 sends a message to toy server 6502 regarding the want. Toy server 6502 selects appropriate products from a database containing information about products, which is updated regularly by suppliers of the products. It is appreciated that the selection of products possibly appropriate to satisfy the user's wants is personalized in accordance with the specific user profile and history of purchases. Toy server 6502 sends a list of products to computer 6503 in the office of the parent, for parent authorization. It is appreciated that a parent may also receive the list via an automated phone call, or by other means known in the art. The parent, in his office, selects products from the list. Toy server 6502 sends the authorized list to the user's computer 6501. Toy 6500 verbalizes the authorized list to the user, offering him to purchase a product or products from the list. The user selects products from the list. Computer 6501 identifies the selected products and sends a message to toy server 6502. Toy server 6502 orders the products from the suppliers and performs the transaction involved via a value account associated with the user or with a parent of the user.

Reference is now made to Fig. 69, which is a simplified block diagram illustration of the want indication-recognizing functionality of Figs. 67 and 68. As seen in the figure, a multiplicity of input data types is employed in order to detect an irregular behavior of a user which is indicative of a possible emotional, cognitive or physiological state of want. When such a behavior is detected, the user's computer 6501 initiates an inquiry procedure that utilizes interactive scripts in order to identify a specific want of the user. Thus the system proceeds from detecting an irregular behavior that might imply a state such as nervousness, frustration, or aggressiveness, to recognizing a specific want emanating from the user's state, such as hunger or boredom. The user's computer 6501 utilizes a multiplicity of input data types in order to detect an irregular behavior during an interaction with the user. Computer 6501 identifies non-verbal vocal inputs 6520, such as crying sounds, which are implicative of distress. It employs techniques such as voice analysis, known in the art, to infer an emotional state such as frustration or anger in the user's voice 6521. Computer 6501 also analyzes speech content 6522 for the same purpose, such as by detecting keywords in the user's speech or by measuring the time duration of the user's utterances. It also uses tactile inputs, indicating, for example, whether a user hugs a toy more than usual. Computer 6501 also analyzes the nature of the interaction of the user with toy 6500, checking, for example, whether the user refuses the toy's suggestions more than usual. When the numerous input data types indicate an irregular behavior implicative of a possible emotional, cognitive or physiological state of want, computer 6501 initiates a conversation of toy 6500 with the user, operated by interactive scripts designed to elicit from the user an indication of a want. During the conversation, computer 6501 utilizes the multiplicity of input data types in order to activate the interactive scripts, such as by detecting whether the user's answers are aggressive or impatient. During the conversation, computer 6501 detects keywords, such as "hungry" or "bored", that indicate a specific want.
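The two stages above — several indicators voting that behavior is irregular, then keyword spotting mapping the child's answers to a specific want — can be sketched as follows. The indicator names, threshold and keyword-to-want mapping are assumptions for illustration only; a real system would use voice analysis and the other input types named above.

```python
WANT_KEYWORDS = {"hungry": "food", "bored": "entertainment"}

def irregular_behavior(indicators, threshold=2):
    """indicators: dict of boolean signals (crying, angry_voice, ...).

    Behavior is flagged as irregular when enough signals agree.
    """
    return sum(1 for v in indicators.values() if v) >= threshold

def detect_want(utterance):
    """Map a keyword in the child's answer to a specific want, if any."""
    for word in utterance.lower().split():
        if word in WANT_KEYWORDS:
            return WANT_KEYWORDS[word]
    return None
```

Only when `irregular_behavior` fires would the inquiry scripts run and `detect_want` be applied to the child's answers.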

It is appreciated that the functionality of Figs. 67, 68 and 69 taken together is particularly appropriate to a virtual parenting toy system comprising: an interactive toy including: a child want indication-recognizing functionality operative to recognize at least one indication of a child want; a child want reporting functionality for providing an output indication of a child want recognized by the child want indication-recognizing functionality; and a child want satisfying functionality operative to satisfy the child want reported by the child want reporting functionality.

It is also appreciated that the functionality of Figs. 67, 68 and 69 taken together is particularly appropriate to a virtual parenting toy system as the aforementioned and wherein the child want satisfying functionality is controlled by a child want satisfying input which may be received other than from the child.

It is further appreciated that the functionality of Figs. 67, 68 and 69 taken together is particularly appropriate to a virtual parenting toy system as the aforementioned wherein the child want satisfying functionality includes: advertisement content responsive to the child want indication and offering a plurality of advertised items; child preference eliciting functionality ascertaining a preference of the child for a given item from among the plurality of advertised items and providing a child preference output; and transactional functionality operative in response to the child preference output for purchasing the given item.

An interactive toy system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 70, which is a simplified pictorial illustration of a toy system comprising free time indication functionality and free time utilization functionality, in accordance with a preferred embodiment of the present invention. Turning to Fig. 70, it is seen that a user requests that toy 6700 play a game with him. Computer 6701 checks the user's schedule. Toy 6700 refuses the user's request, and reminds the user that he is currently supposed to be doing his homework, as appears in the user's schedule. A while later, computer 6701 detects the beginning of a free-time slot. Toy 6700 suggests to the user to play a game with him.

Reference is now made to Fig. 71, which is a table illustrating the free time indication functionality of Fig. 70. The table illustrates a record in a user's schedule database, comprising a daily schedule items column 6710, a current date schedule items column 6711, and a free-time slots column 6712. The free-time slots column 6712 is computed from the first two columns 6710 and 6711, designating times wherein a user is free from scheduled obligations.
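The computation of column 6712 from columns 6710 and 6711 amounts to subtracting all scheduled intervals from the day's waking window. A minimal sketch, assuming times are expressed as minutes since midnight and an 08:00-21:00 waking window (both assumptions, not stated in the figure):

```python
def free_slots(daily, today, day_start=8 * 60, day_end=21 * 60):
    """Return free intervals between day_start and day_end.

    `daily` and `today` are lists of (start, end) busy intervals,
    corresponding to columns 6710 and 6711 respectively.
    """
    busy = sorted(daily + today)
    free, cursor = [], day_start
    for start, end in busy:
        if start > cursor:
            free.append((cursor, start))   # gap before this busy item
        cursor = max(cursor, end)
    if cursor < day_end:
        free.append((cursor, day_end))     # tail of the day is free
    return free
```

The resulting slots are what the toy consults before offering entertainment.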

Reference is now made to Fig. 72, which is a simplified flowchart of the entertainment providing in user's free-time functionality of Fig. 70. The user requests that toy 6700 play a game with him. Computer 6701 checks the user's schedule. If the current time is designated as a free-time slot, toy 6700 plays the requested game with the user. Otherwise, toy 6700 reminds the user of the user's scheduled current obligation. Computer 6701 then registers the requested game, which is to be offered to the user at the beginning of the user's next free-time slot.

Figs. 70, 71 and 73, taken together, illustrate a toy system comprising free time indication and free time utilization functionality wherein the free time indication functionality is responsive to an overriding parent input. Turning to Fig. 73, it is seen that computer 6701 detects the beginning of a free-time slot, during which the user may participate in toy interaction. Computer 6701 checks whether the free time slot was overridden by a parent. If the time slot was not overridden, toy 6700 offers the user an entertainment item.

It is appreciated that the functionality of Figs. 70, 71, 72 and 73 taken together is particularly appropriate to a toy system comprising: an interactive toy including: free time indication functionality designating at least one time slot during which a child has free time and may participate in toy interaction; and free time utilization functionality operative in response to an output from the free time indication functionality for providing entertainment to the child during the at least one time slot.

It is also appreciated that the functionality of Figs. 70, 71, 72 and 73 taken together is particularly appropriate to a toy system as the aforementioned wherein the free time indication functionality includes a schedule input receiver operative to receive schedule information regarding a plurality of schedule items and to compute therefrom the at least one time slot.

It is further appreciated that the functionality of Figs. 70, 71, 72 and 73 taken together is particularly appropriate to a toy system as the aforementioned wherein the free time indication functionality is responsive to an overriding parent input for defining the at least one time slot.

A user-location monitoring toy diary is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 74, which is a simplified partly pictorial, partly schematic illustration of a user-location monitoring toy diary comprising time and coordinates data storage functionality, location tracking functionality and prompter functionality, in accordance with a preferred embodiment of the present invention.

Turning to Fig. 74, it is seen that in a park, an interactive toy 7001, comprising a GPS device 7002, informs a user that the user must leave the park in two minutes in order to arrive on time at a dentist's appointment. The toy also provides the user with directions of how to go to the desired destination.

In the illustrated embodiment, toy 7001 interacts based on instructions received from a suitable toy server 7010, with which it communicates via a cellular antenna 7005, which provides a connection to the Internet 7006. The toy server 7010 includes a schedule database 7011, which stores schedule items for the user, a time for each item, as well as coordinates of a location for each item. Toy server 7010 further includes, and/or has access to, typically via the Internet 7006, a traffic database 7012 providing the routing and timing required for leaving from one location and arriving at another, also taking the given means of transportation into account, within a given traffic area. It may therefore be appreciated that the server 7010 is operative to instruct the toy 7001 to prompt the user to conform to a schedule, based on the current location of the user, and to provide appropriate directions to the user.

Reference is now made to Fig. 75A, which is a simplified table of a typical database record of the schedule database 7011 of Fig. 74. Turning to Fig. 75A, it is seen that for each schedule item, defined by the time thereof (day and hour), the database provides the nature of the item itself and the coordinates X and Y of the location thereof. It also takes into account the means of transportation available to the user and used for his transportation to the location. A parent of the user, for example, may enter a schedule item according to time and location, such as an address, by means of a networked computer terminal. The server 7010 derives the coordinates X and Y of the location by means of, for example, computerized map software.

Reference is now made to Fig. 75B, which is a simplified table of a typical database record of the traffic database 7012 of Fig. 74. Turning to Fig. 75B, it is seen that for any two locations, each defined by coordinates X and Y, the database provides the time required to arrive from the first location at the second location. It takes into account the means of transportation, such as any one of different means including walking, driving, going by bus and going by subway. Thus, for example, T3,3,2 designates the time required to arrive by bus from location 3 to location 2. In addition, the traffic database 7012 preferably provides a possible route for arriving from one location at another by a given means of transportation.
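The traffic database of Fig. 75B is, in effect, a table keyed by origin, destination and means of transportation. A minimal sketch of such a record follows; the key layout and all travel-time values here are invented for illustration, with the figure's T3,3,2 example rendered as the key (3, 2, "bus").

```python
TRAVEL_MINUTES = {
    # (from_location, to_location, means): minutes (illustrative values)
    (3, 2, "bus"): 25,
    (3, 2, "walk"): 70,
    (1, 2, "subway"): 15,
}

def travel_time(frm, to, means):
    """Minutes to get from `frm` to `to` by the given means, if known."""
    return TRAVEL_MINUTES.get((frm, to, means))
```

A fuller record would also carry a suggested route for each key, as the text notes.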

The scenario of Fig. 74 is preferably preceded by a procedure whereby the departure time for a schedule item is determined. For example, toy server 7010 retrieves from traffic database 7012 the walking time required to arrive from the home of the user at the location of a schedule item stored in database 7011. Based on the time of the schedule item stored in database 7011, the required walking time and a predetermined spare amount of time, the server 7010 determines a departure time, and informs the user and/or a parent of the user thereof via phone and/or a toy 7001, with which it may communicate via a networked computer not indicated in Fig. 74.
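The departure-time procedure above reduces to one subtraction: the item's time, minus the travel time, minus the spare margin. A sketch under stated assumptions (times as minutes since midnight; a 10-minute default margin, which the patent leaves unspecified):

```python
def departure_time(item_minute, travel_minutes, spare_minutes=10):
    """When the user should leave to arrive with a spare margin.

    item_minute: time of the schedule item (minutes since midnight).
    travel_minutes: value retrieved from the traffic database.
    """
    return item_minute - travel_minutes - spare_minutes
```

For a 17:00 appointment with a 30-minute walk, the user would be told to leave at 16:20 under the default margin.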

Reference is now made to Fig. 76, which is a simplified flowchart of the data storage, location tracking and prompter functionality of Fig. 74. Toy 7001 communicates with toy server 7010 via a public wireless communication network, which provides an Internet connection. A GPS device 7002 on toy 7001 provides coordinates of the current location of toy 7001 and its user. A GPS reading from the toy 7001 is transmitted to the server 7010. Server 7010 retrieves from traffic database 7012 the amount of time required in order to arrive from the current location of the user at the location of a schedule item, by the means of transportation defined therefor as registered in schedule database 7011. Traffic database 7012 preferably also provides an indication of the nature of the current location of the user, as well as a preferred route for walking from the current location of the user to the location of the schedule item.

If the time required in order to arrive at the location of the schedule item is longer than the time left until the time of the schedule item, the server may send a message to a parent of the user, for example, via a phone system. It is appreciated that, since the location tracking and prompter functionality of the system of Fig. 74 is intended to prevent such a case, server 7010 preferably interprets this case as meaning that the user, accompanied by the toy 7001, is lost. Thus, server 7010 sends a message to the user via the toy 7001, suggesting that the user wait at the current location, and a message to a parent of the user including an indication of the location of the user.

Otherwise, if the time left until the time of the schedule item is less than 5 minutes longer than the time required in order to arrive at the location of the schedule item, server 7010 instructs toy 7001 to prompt the user to conform to the schedule. For example, server 7010 sends a personalized message to the user via the toy 7001, suggesting that the user leave the current site and follow the provided directions in order to arrive on time at the location of the schedule item. It is appreciated that this prompt message to the user may include a reference to the nature of the current location of the user, for example, a park, as retrieved by server 7010 from traffic database 7012 based on the coordinates indicated by GPS device 7002.
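The three-way decision of Fig. 76 described in the two paragraphs above can be sketched as follows; the function name, return labels and default 5-minute margin follow the text, while everything else is an assumption.

```python
def schedule_action(minutes_left, minutes_required, margin=5):
    """Decide what the server tells the toy (and possibly the parent)."""
    if minutes_required > minutes_left:
        # cannot arrive on time: assume the user is lost, alert a parent
        return "wait_and_alert_parent"
    if minutes_left < minutes_required + margin:
        # just enough time: prompt the user to leave now with directions
        return "prompt_to_leave"
    return "no_action"
```

The server would repeat this check on each GPS reading received from the toy.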

It is appreciated that the functionality of Figs. 74, 75A, 75B and 76 is particularly appropriate to a user-location monitoring toy diary comprising: a schedule database storing, for each of a multiplicity of schedule items, a time thereof and coordinates of a location thereof, wherein the multiplicity of locations are represented within a single coordinate system; a user tracker operative to track the current location of the user; and a prompter operative to prompt the user to conform to the schedule if the user's current location does not conform to the stored location of a current schedule item.

A schedule-monitoring toy system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 77, which is a simplified pictorial illustration of a schedule monitoring toy system comprising a scheduling prompt which includes content which is not related to scheduling, in accordance with a preferred embodiment of the present invention. Turning to Fig. 77, it is seen that a toy 7200 tells the user a joke about dentists. Afterwards, toy 7200 asks the user whether he can guess what his next schedule item is. The user guesses correctly that it is a dentist appointment.

Reference is now made to Fig. 78, which is a simplified flowchart of the content selection functionality of Fig. 77. A computer 7202 checks whether the next schedule item is emotionally traumatic for the user. If it is indeed traumatic, computer 7202 selects a type of prompt appropriate for relieving the tension involved in the schedule item. Selection is performed based on data regarding prompts supplied previously in similar contexts, and the measure to which they were correlated with diminishing the user's negative response to similar schedule items. If the prompt is a value credit offered to the user for compliance with the schedule item, the toy reminds the user of the item, and offers the user the value credit in case the user complies with the schedule item. If the prompt type is a content item, such as a joke, a story or a game, computer 7202 selects a content item. The content item is selected in accordance with the schedule item, for example, by keywords in the schedule item. A content item is retrieved from a server 7204 via the Internet. Alternately, content is stored locally on computer 7202. Toy 7200 actuates the content and then verbalizes the schedule item to the user. Computer 7202 evaluates the user's response to the schedule item. Computer 7202 further tracks the user's performance of the schedule item. If the schedule item involves being in a specific location at a designated time, computer 7202 may check whether the user arrived at the location on time. Computer 7202 may also utilize positioning systems such as GPS or cellular communication networks. Computer 7202 receives information regarding the user's location via server 7204. After accomplishment of the schedule item, computer 7202 adds to a database data regarding the user's response to the schedule item and the user's performance of the schedule item. If the response to a verbalized schedule item was negative, or if the user did not perform the schedule item correctly, computer 7202 registers a negative response in a record related to the content type in relation to the schedule item or similar items. Data obtained in such a way further serves to assess the effectiveness of different content types in relieving tensions involved in various schedule contexts.

Reference is now made to Fig. 79, which is a simplified flowchart of the traumatic schedule item detecting functionality of Fig. 78. The method described enables detection of emotionally traumatic items from among a plurality of schedule items input in natural language. It defines groups of words which are indicative of trauma and are used to associate traumatic weighting with various schedule items. As is seen in Fig. 79, computer 7202 creates a group X including all words of the next schedule item, omitting prepositions. If group X already has an associated record in the database, computer 7202 proceeds to deliver the schedule item to the user and to evaluate the user's response to, and performance of, the schedule item. Otherwise, computer 7202 defines a new record for group X. Computer 7202 further creates records for all groups created by an intersection of group X with groups associated with existing records, such that the intersection is a non-empty group and not identical to X or to an existing record. A group created in such a way is a sub-group of group X and of a group associated with an already existing record. The record of the sub-group is then associated with the values associated with the existing group, the values describing the number of occurrences of the existing group and the number of occasions on which the group was involved with a negative response of a user. After creating the new records, toy 7200 reminds the user of the next schedule item and evaluates the user's response to, and performance of, the schedule item. If the response is negative, or if the performance is imperfect, computer 7202 adds 1 to the negative responses column (7221 in Fig. 81 below) of all records R such that word group R is a subgroup of word group X. Computer 7202 also adds 1 to the occurrences column (7222 in Fig. 81 below) of all such records.
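The record-creation and counting steps of Fig. 79 can be sketched with word groups as frozensets. This is a hedged illustration of the described method: the preposition list is abbreviated, counters are stored as `[negatives, occurrences]` pairs, and the sub-group's inheritance of the existing group's counters follows the text.

```python
PREPOSITIONS = {"to", "at", "with", "for", "in", "on"}

def word_group(item):
    """All words of a schedule item, prepositions omitted (group X)."""
    return frozenset(w.lower() for w in item.split()
                     if w.lower() not in PREPOSITIONS)

def add_item(records, item):
    """Create records for the item's group and its new intersections."""
    x = word_group(item)
    if x in records:
        return x
    new_records = {x: [0, 0]}          # [negative responses, occurrences]
    for g, counts in records.items():
        sub = x & g
        if sub and sub != x and sub not in records and sub not in new_records:
            new_records[sub] = list(counts)   # inherit existing counters
    records.update(new_records)
    return x

def register_response(records, x, negative):
    """Update counters of every record that is a subgroup of x."""
    for g, counts in records.items():
        if g <= x:
            counts[1] += 1             # occurrences column (7222)
            if negative:
                counts[0] += 1         # negative responses column (7221)
```

Running the Fig. 81 examples through `add_item` produces the {aunt, Jemima}-style sub-group records automatically.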

A negative user response to a verbalized schedule item is detected by a plurality of means, for example, by parsing the content of the reply and searching for keywords in it, or by analyzing the voice of the user to detect emotions, by methods known in the art. It is appreciated that when a user responds negatively to a verbalized schedule item, or when a user does not perform an item correctly, toy 7200 initiates a conversation with the user in order to elicit from the user the reasons for his attitude towards the item. The user's responses during the conversation are then utilized for the purpose of content selection.

Reference is now made to Fig. 80, which is a simplified flowchart illustrating a procedure of assigning weighting associated with trauma to word groups obtained by the method described in Fig. 79 above. Computer 7202 checks the number of times a word group was used in association with a schedule item. This number is designated in Fig. 80 as X. The number of times it occurred in conjunction with a negative user response is designated as Y. If X is higher than a defined number, such as 3, and the ratio of Y to X is higher than a defined threshold, such as 80%, then the associated word group is marked "traumatic".
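The weighting rule of Fig. 80 reduces to a single predicate. The sketch below uses the example thresholds from the text (X > 3 occurrences, Y/X > 80%); the function name and parameter defaults are illustrative.

```python
# Minimal sketch of the "traumatic" marking rule of Fig. 80.
def is_traumatic(occurrences, negatives, min_occurrences=3, min_ratio=0.8):
    """X = occurrences, Y = negatives; traumatic if X > 3 and Y/X > 80%."""
    return occurrences > min_occurrences and negatives / occurrences > min_ratio
```

Applied to the Fig. 81 example, {aunt, Jemima} with 4 negative responses out of 4 occurrences is marked traumatic, while a group seen only twice is not, however negative the responses.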

Reference is now made to Fig. 81, which is a table illustrating an exemplary database obtained by the methods described in Figs. 79 and 80. As is seen in Fig. 81, the database includes four columns. A word group 7220 column includes groups and sub-groups of the words that appeared in schedule items. An associated negative response 7221 column registers the number of times an appearance of a word group was involved in a negative response of a user. An associated occurrences 7222 column registers the total number of appearances of a word group in schedule items. A status 7223 column registers whether a word group has a traumatic weight. As may be seen, the word group {aunt, Jemima} is marked "traumatic", because all 4 occurrences of it were involved in a negative response of the user, meaning he responds negatively to any schedule item involving Aunt Jemima, for example, "Bring fish to Aunt Jemima" (in the first row) or "Visit Aunt Jemima" (in the second row). It may also be seen that the user responds negatively to the schedule item "Bring fish to Aunt Jackie". Nevertheless, the word group {aunt, Jackie} is not marked "traumatic", since the user did not respond negatively to other items that included it, such as "Visit Aunt Jackie". It may further be seen that the negative response of the user to the item "Bring fish to Aunt Jackie" was probably due to the user's aversion to bringing fish to people rather than to Aunt Jackie herself, since the word group {bring, fish} is marked "traumatic".

Reference is now made to Fig. 82, which is a simplified flowchart of the traumatic schedule item detecting functionality and of the prompt type selection functionality of Figs. 77 and 78. Computer 7202 checks the traumatic weighting of the next schedule item. A schedule item is considered emotionally traumatic if it contains all the words of a word group marked "traumatic" in the database, as described in Fig. 81. The computer selects whether to prompt the user with a content item or with an offer of value credit to be transferred to the user if the user complies with the schedule item. It is appreciated that the use of value credit is determined by the user's parents or by other sources authorized to input items to the user's schedule. Alternatively, it is determined by the effectiveness of this use in diminishing the user's resistance to performing the schedule item, relative to other means such as prompting by content. After offering value credit to the user, computer 7202 checks whether the user has complied with the schedule item. Criteria for compliance are established by the party who determined the use of value credit as a prompt for the schedule item. Such a criterion may be being in a designated location at a designated time. If the prompt type selected is a content item, computer 7202 selects a content type for an emotionally traumatic schedule item based on prior experience with supplying content to relieve tensions due to schedule items. Selection is based on a database described below in Fig. 83. Computer 7202 selects the content type that has proved most efficient in relieving tensions due to prior schedule items containing the same word group marked traumatic. After selecting the content type, the computer retrieves content of the selected type according to the words in the word group marked traumatic.

Reference is now made to Fig. 83, which is a table illustrating a typical database utilized in order to select content, as described in Fig. 82. As is seen in Fig. 83, the database includes records for word groups marked as "traumatic" in a database as described in Fig. 81. The database contains a column for each content type supplied, such as games 7260, stories 7261 and jokes 7262. Each column is divided into two, so that it registers the number of times a content type was supplied in conjunction with a word group, and the number of times the content type was followed by a negative response of a user. When selecting a content type for an item including a word group marked "traumatic", computer 7202 selects the type that has proved most efficient at relieving tensions involving the specific word group. Thus, it selects the content type wherein the ratio of the value in the second column 7264, 7266 or 7268 to that in the first column 7263, 7265 or 7267, respectively, is the lowest. It is appreciated that the functionality of Figs. 77, 78, 79, 80, 81, 82 and 83 taken together is particularly appropriate to a schedule-monitoring toy system comprising: a verbal-input interactive toy operative to interact with a user; a schedule database storing the user's schedule; and a schedule reminder actuating the verbal-input interactive toy to present to the user at least one verbal scheduling prompt which includes content which is not related to scheduling.
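The lowest-ratio selection over the paired columns of Fig. 83 can be sketched as below. The record layout (a mapping from content type to a `(times_supplied, negative_responses)` pair) is an assumption; it presumes each content type has been supplied at least once, since a never-supplied type has no ratio.

```python
# Illustrative sketch of the content-type selection of Fig. 83: pick the type
# with the lowest ratio of negative responses to times supplied.
def select_content_type(record):
    """record maps content type -> (times_supplied, negative_responses),
    e.g. {"games": (5, 1), "stories": (4, 3), "jokes": (6, 0)}."""
    return min(record, key=lambda t: record[t][1] / record[t][0])
```

In the example above, jokes have been supplied six times with no negative response (ratio 0), so jokes would be selected for that word group.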

It is also appreciated that the functionality of Figs. 77, 78, 79, 80, 81, 82 and 83, taken together, is particularly appropriate to a schedule-monitoring toy system, such as the aforementioned wherein the prompt includes at least one game.

It is further appreciated that the functionality of Figs. 77, 78, 79, 80, 81, 82 and 83, taken together, is particularly appropriate to a schedule-monitoring toy system such as the aforementioned, wherein the prompt includes at least one joke.

It is further appreciated that the functionality of Figs. 77, 78, 79, 80, 81, 82 and 83 taken together is particularly appropriate to a schedule-monitoring toy system, such as the aforementioned, wherein the prompt offers the user a value credit for compliance with the prompt, and stores the credit for the user if the user fulfills a compliance criterion.

It is further appreciated that the functionality of Figs. 77, 78, 79, 80, 81, 82 and 83, taken together is particularly appropriate to a schedule-monitoring toy system, such as the aforementioned, wherein the prompt includes content which emotionally prepares the user to cope with an emotionally traumatic item in the schedule database.

A computerized guide system for a blind user is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 84, which is a simplified partly pictorial, partly schematic illustration of a computerized guide system for a blind user comprising a networked portable interactive device, in accordance with a preferred embodiment of the present invention. As shown in Fig. 84, a blind user with an interactive toy 7401 encounters an obstacle 7403 on route-segment 7407. Toy 7401 detects obstacle 7403 by means of obstacle detector 7409, which may be an infrared-based obstacle detector or any other suitable device known in the art, and provides an audio warning message to the user. As is seen further in Fig. 84, another interactive toy 7402 suggests to its user that the user should take an alternative route, turning to route-segment 7408, in order to avoid obstacle 7403 on route-segment 7407. In the illustrated embodiment, interactive toy 7402 interacts based on instructions received from server 7406, with which it communicates by means of wireless communication via a public wireless communication network antenna, which provides connection to the Internet 7405. Server 7406 also communicates with toy 7401, which typically includes a location-tracking device such as a GPS device. Thus, it may be appreciated that server 7406 may instruct toy 7402 to suggest to its user an alternative route based on hazard information received from toy 7401.

Reference is now made to Fig. 85A, which is a simplified table in the context of Fig. 84 showing a destination database 7411 for a particular blind user, such as the user of toy 7401 of Fig. 84. Destination database 7411 may be stored on a personal computer of the user and/or on a suitable server, such as server 7406 of Fig. 84. Turning to Fig. 85A, it is seen that for each keyword designating a user-desired destination, database 7411 provides a suitable route definition. In the illustrated embodiment, a route definition R is provided in the form of a series of route-segments (Xi,Yi), where Xi and Yi represent coordinates of a location on a computerized map of a traffic area. A route-segment may be a segment along a straight line, a portion of a street, a series of stops along a public transport line, and/or any other single motion segment that may be traversed by a blind user without requiring instruction as to turns, changes and the like. A route-segment (Xi,Yi) is typically defined as a possible single motion segment leading from the location whose coordinates are Xi-1,Yi-1 to the location whose coordinates are Xi,Yi. Coordinates X0,Y0 are typically defined as those of the home of the user.

Reference is now made to Fig. 85B, which is a simplified table in the context of Fig. 84 showing a route-definition database 7412. Database 7412 is typically stored on a suitable server, such as server 7406 of Fig. 84. Turning to Fig. 85B, it is seen that for any two locations in a given traffic area, respectively represented by coordinates Xi,Yi and Xj,Yj, database 7412 provides a route definition Ri,j typically comprising a series of route-segments, as is shown in Fig. 85A. Preferably, route-definition database 7412 provides routes avoiding obstacles which may pose a hazard to a blind user. Preferably, database 7412 is continuously updated based on newly reported obstacles.

Reference is now made to Fig. 85C, which is a simplified table in the context of Fig. 84, showing a multiple user guiding database 7413. Database 7413 is typically stored on a suitable server, such as server 7406 of Fig. 84. Turning to Fig. 85C, it is seen that for each user currently being guided by an interactive toy, database 7413 provides a toy ID, the route definition of the route currently traversed, and the current location of the user.

Reference is now made to Fig. 86, which is a simplified flowchart in the context of Fig. 84, showing route definition functionality of a computerized guide system for a blind user. Toy 7401 at the home of a user is typically in wireless communication with a personal computer 7420, which in turn communicates, typically via the Internet 7405, with server 7406. Computer 7420 typically includes a destination database such as destination database 7411 of Fig. 85A. Toy 7401 receives a verbal input from the user. Computer 7420 is operative to perform speech recognition based on a list of keywords designating desired destinations of the user. If computer 7420 recognizes one of the keywords of destination database 7411, computer 7420 communicates to server 7406 the proposed route definition corresponding to the user-desired destination as provided by database 7411. If the proposed route includes newly reported obstacles, server 7406 retrieves an alternative route from route-definition database 7412, based on the location of the user's home and the desired destination. Server 7406 communicates the alternative route to computer 7420, which in turn updates destination database 7411 with the alternative route and instructs toy 7401 to inform the user thereof.
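The route-definition flow of Fig. 86 can be sketched as a lookup with an obstacle-aware fallback. All data structures here are assumptions: routes are lists of (x, y) route-segments as in Fig. 85A, the alternatives stand in for database 7412, and `blocked_segments` stands in for the set of newly reported obstacles.

```python
# Hedged sketch of the route-definition flow of Fig. 86.
def define_route(keyword, destinations, alternatives, blocked_segments):
    """destinations: keyword -> list of (x, y) route-segments (Fig. 85A);
    alternatives: keyword -> obstacle-free route (stands in for Fig. 85B);
    blocked_segments: set of segments with newly reported obstacles."""
    route = destinations.get(keyword)
    if route is None:
        return None                       # ask the user to clarify the input
    if any(seg in blocked_segments for seg in route):
        route = alternatives[keyword]     # newly reported obstacle: reroute
        destinations[keyword] = route     # update the destination database
    return route
```

An unrecognized keyword returns `None`, mirroring the branch where the server asks the user to clarify or confirm the route input.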

If the user's verbal input does not include a keyword from database 7411, computer 7420 communicates to server 7406 the user's route input. Server 7406 may send a message to be verbalized to the user via toy 7401, requesting that the user clarify and/or confirm the route input, for example, by verbally providing an exact address. Server 7406 then retrieves the coordinates of the destination desired by the user from a suitable traffic database. Based on the coordinates of the user's home and those of the desired destination, server 7406 retrieves a route definition from database 7412. Server 7406 then instructs computer 7420 to verbalize a message to the user via toy 7401 informing the user of the proposed route.

Computer 7420 communicates to server 7406 a user confirmation of a proposed route, which confirmation is preferably received by means of verbal input via toy 7401. Server 7406 then updates guiding database 7413 with a new guiding task for the user, registering the toy ID of toy 7401, the route definition and the current location of the user, which is initially that of the home of the user.

Returning to Fig. 84, it is seen that toy 7401 comprises a GPS device and is in communication with server 7406 via public wireless communication network antenna 7404 and Internet 7405. It may therefore be appreciated that server 7406 is operative to continuously update guiding database 7413 with the user's current location based on GPS device readings received from toy 7401. It may also be appreciated that the route definition procedure of Fig. 86 may also be performed while receiving the user route-input from a location other than the home of the user. In such a case, a route definition is retrieved from database 7412, based on the user's current location tracked via the GPS device on toy 7401 and a desired destination received from the user, typically by means of verbal input via toy 7401.

Reference is now made to Fig. 87, which is a simplified flowchart in the context of Fig. 84, showing the audio warning functionality and the hazard information providing functionality of the system of Fig. 84. While performing a guiding task for its user, toy 7401 detects obstacle 7403, for example, by means of infrared-based obstacle detector 7409. Based on the signal of detector 7409, toy 7401 provides an immediate audio warning for the user. Toy 7401 is typically operative to provide such an audio warning even in a case where wireless communication is temporarily or permanently lost.

Toy 7401 communicates to server 7406 the presence of unreported obstacle 7403 on route-segment 7407. Server 7406 sends a message to the user suggesting that the user might wish to receive help. Based on the user's response, received via toy 7401, the user may receive help by means of human intervention.

Server 7406 updates route-definition database 7412 based on the newly reported obstacle 7403. This may result in changing one or more route definitions for future selection of routes. At the same time, server 7406 checks current task database 7413 for users currently traversing a route comprising route-segment 7407, where obstacle 7403 has been reported. The search retrieves users such as the user of toy 7402. Server 7406 retrieves the current location of the user of toy 7402 from database 7413 and/or by means of a GPS reading from toy 7402. Based on the current location and the desired destination of the user of toy 7402, server 7406 retrieves from database 7412 an alternative route for the user that avoids obstacle 7403. Server 7406 sends a message to be verbalized for the user via toy 7402, informing the user of the newly reported obstacle and suggesting that the user should take the alternative route.
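The server-side search for affected users can be sketched as a filter over the guiding database of Fig. 85C. The list-of-dicts layout and the function name are illustrative assumptions; the point is that one obstacle report fans out to every user whose current route contains the blocked segment.

```python
# Simplified server-side sketch of Fig. 87: find every guided user whose
# current route includes the segment where an obstacle was just reported.
def users_to_reroute(guiding_db, obstacle_segment):
    """guiding_db: list of dicts with 'toy_id', 'route' (list of segments)
    and 'location'. Returns toy IDs whose routes include the obstacle."""
    return [task["toy_id"] for task in guiding_db
            if obstacle_segment in task["route"]]
```

Each returned toy ID would then receive a verbalized message offering the alternative route retrieved from database 7412.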

It is appreciated that the functionality of Figs. 84, 85A, 85B, 85C, 86 and 87 taken together is particularly appropriate to a computerized guide system for a blind user, the guide comprising: a portable interactive computerized device including route definition functionality operative to receive route input from a blind user for selecting a user route; hazard detection functionality operative to detect at least one hazard along the user route; and audio warning functionality operative, in response to an output from the hazard detection functionality, to provide the user with an audio warning regarding presence of the hazard.

It is also appreciated that the functionality of Figs. 84, 85A, 85B, 85C, 86 and 87 taken together is particularly appropriate to a computerized guide system for a blind user such as the aforementioned system and wherein: the interactive device is networked with at least one other such device; the interactive device is operative to provide hazard information to the at least one other such device; and the interactive device is operative to broadcast the hazard information in real time.

A parental surrogate toy is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 88, which is a simplified flowchart of a parental surrogate toy comprising a child behavior monitor operative to monitor a selected parameter of child behavior, in accordance with a preferred embodiment of the present invention. A parent selects, via a computer 7502, a behavior parameter to be monitored by a toy 7500. In Fig. 88, the parameter is the use of inappropriate language. Other parameters monitored could be aggressive behavior (detected by voice analysis by methods known in the art) or compliance with bedtime rules (detected by light sensors on the toy). Computer 7502 enables the parent to select a list 7504 of inappropriate expressions to be monitored by toy 7500. The parent may select a pre-defined list from a multiplicity of lists, customize such a list, or define a personal list. Toy 7500 picks up the child's speech and transfers it to computer 7502. Computer 7502 converts the speech to text and searches the text for expressions that appear on the selected list 7504. Computer 7502 registers all uses of expressions from list 7504 by the child. At the parent's request, computer 7502 delivers to the parent a behavior report comprising the inappropriate expressions used by the child and the number of times each of the expressions was used.
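The counting step of Fig. 88 can be sketched as follows, assuming the speech has already been converted to text; `monitor_speech`, the transcript list and the watch list are illustrative names, not part of the original disclosure.

```python
# Minimal sketch of the behavior-monitoring loop of Fig. 88: count each
# watched expression across the child's transcribed speech.
from collections import Counter

def monitor_speech(transcripts, watch_list):
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for expression in watch_list:
            counts[expression] += lowered.count(expression.lower())
    # Behavior report: only expressions that were actually used.
    return {expr: n for expr, n in counts.items() if n > 0}
```

The returned mapping corresponds to the behavior report delivered at the parent's request: each expression used, with its number of uses.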

It is appreciated that the functionality of Fig. 88 is particularly appropriate to a parental surrogate toy comprising: a toy; a child behavior report receiver; and a toy controller comprising: a behavioral report configuration definer allowing a parent to define at least one parameter of child behavior which is of interest; a child monitor operative to monitor the parameter of child behavior and to provide a report relating to the at least one parameter to the child behavior report receiver.

A web browsing system comprising an interactive toy wherein web browsing is provided in the context of a game is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 89, which is a simplified pictorial illustration of a game comprising toy web browsing functionality, in accordance with a preferred embodiment of the present invention. Turning to Fig. 89, it is seen that a user and a toy 8000 are playing a word game wherein web browsing provides an answer to a question posed in the game. As is seen in Fig. 89, each of the participants, at his turn, poses a question comprising a pair of words apparently belonging to distant semantic fields, such as "pea" and "bee" or "fish" and "wall". As an answer, the other participant has to find a connection between the pair of words. Toy 8000 retrieves such word pairs from a toy server 8003 via the Internet. Toy 8000 answers questions posed in the game by the user, utilizing web browsing functionality, by finding a web site containing a pair of words chosen by the user, such as "The Big Fish Wall of Fame" web site. Computer 8001 displays the web site on monitor 8002 and toy 8000 verbalizes an answer to the user's question comprising an explanation of the contents of the web site.

Reference is now made to Fig. 90, which is a simplified flowchart of the web browsing functionality of Fig. 89. The user and toy 8000 play a game as illustrated in Fig. 89. Computer 8001 retrieves a word pair from toy server 8003 via the Internet. Server 8003 includes a database of such word pairs. It is appreciated that such a database is enhanced by pairs enunciated by users during past games. The computer sends a message to toy 8000. Toy 8000 verbalizes the message, asking the user for a connection between the word pair. Toy 8000 picks up the user's utterance and sends it to computer 8001. Computer 8001 converts the utterance to text and checks whether it is a correct answer to the question posed by toy 8000, for example by checking whether the utterance is a grammatically correct phrase containing both words of the pair. If the user's answer is correct, the user now chooses a pair of words and toy 8000 has to find a connection between them. Otherwise, toy 8000 tries to find an answer to the question that the user failed to answer correctly. In either case, computer 8001 browses the web, utilizing a standard search engine, searching for a web site with a title that contains both words of the word pair (posed either by the user or by toy 8000). If such a web site is found, toy 8000 introduces it as an answer to the question now posed. Computer 8001 displays the web site on monitor 8002 and toy 8000 verbalizes an answer explaining the content of the site. Such an answer is based on the text of the site and/or on XML tags of the site, and is produced utilizing text summarization techniques known in the art. If a web site matching the requirements of an answer is not found, computer 8001 proceeds to produce an incorrect answer to the question posed. To that end, computer 8001 replaces one of the words of the pair with a word phonetically close to it, selected utilizing a phonetic dictionary. Computer 8001 repeats the process until a web site is found whose title contains both words of the word pair. Toy 8000 introduces the site as an answer and computer 8001 displays the site on monitor 8002. As the answer is incorrect, the turn now passes to the user.
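The containment part of the answer check above can be sketched as follows. Grammar checking and the web-title search are abstracted away; the function name is illustrative, and a real implementation would also verify that the utterance is a well-formed phrase.

```python
# Illustrative sketch of the answer check of Fig. 90: an utterance counts as
# a candidate correct answer only if it contains both words of the pair.
def is_correct_answer(utterance, word_pair):
    words = set(utterance.lower().split())
    return all(w.lower() in words for w in word_pair)
```

The same containment test applies to a found web-site title: "The Big Fish Wall of Fame" satisfies the pair ("fish", "wall").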

It is appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality.

It is also appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; and a computer which serves as an intermediary between the interactive toy and the Internet.

It is further appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; and wherein the user interface also has non-web browsing functionality.

It is yet further appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; and wherein the user interface provides the web browsing functionality within the context of a game.

Furthermore, it is appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; wherein the user interface provides the web browsing functionality within the context of a game; and wherein in the context of the game the web browsing functionality provides an answer to a question posed in the game.

It is appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; wherein the user interface provides the web browsing functionality within the context of a game; and wherein in the context of the game the web browsing functionality provides non-rational web browsing.

It is also appreciated that the functionality of Figs. 89 and 90 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; and wherein the user interface includes a voice interactive functionality.

A web browsing system comprising a toy and operative to produce toy content is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 91, which is a simplified pictorial illustration of a web browsing system wherein browsing produces content which is subsequently employed by a toy, in accordance with a preferred embodiment of the present invention. Turning to Fig. 91, it is seen that a user introduces a question to a toy 8060, the question relating to the baseball player known as Yogi Berra. Toy 8060 answers the question by utilizing a browsing functionality. For example, a computer 8061 browses the web, utilizing standard search engines, and finds an answer in a web site maintained on a web server 8063. Computer 8061 obtains the content of the web site from web server 8063. Computer 8061 generates toy content from the content of the web site. Some time later, the user watches baseball on a TV 8064. Toy 8060 picks up the TV sound, including speech indicating that the TV program is related to baseball. Toy 8060 introduces to the user a question regarding the baseball player, and provides the user with the answer.

Reference is now made to Fig. 92, which is a simplified flowchart of the content generation functionality of Fig. 91. The user introduces a question to toy 8060. Toy 8060 picks up the user's utterance and sends it to computer 8061. The computer converts the utterance to text and recognizes the question. Toy 8060 may repeat the recognized question to the user in order to verify recognition. Computer 8061 conducts a web search via a standard search engine, utilizing keywords extracted from the question, such as proper names it includes. Computer 8061 then searches for an answer to the user's question in web sites found in the search, utilizing techniques of natural language processing known in the art. If an answer is found, computer 8061 sends the answer to toy 8060 and the toy verbalizes the answer to the user. Computer 8061 then transforms the content of the web site into toy content, utilizing techniques of natural language processing known in the art. Computer 8061 also defines conditions for the employment of the toy content, for example the keywords utilized previously in the web search and the categories to which the keywords belong. At a later time, computer 8061 detects the conditions for employment of the content, for example, upon the toy picking up one or more of the keywords or categories in the user's speech or in speech in the user's surroundings. Computer 8061 then sends the content to toy 8060. Toy 8060 actuates the content.
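The employment-condition check of Fig. 92 can be sketched as a keyword match over stored content items. The store layout loosely follows the database of Fig. 93; the field names and the function are illustrative assumptions.

```python
# Hedged sketch of the employment conditions of Fig. 92: content generated
# from a web site is stored with its search keywords and actuated later when
# one of those keywords is picked up in speech around the user.
def content_to_actuate(store, heard_text):
    """store: list of {'content': ..., 'keywords': [...]}; returns the items
    whose keywords appear in the transcribed speech picked up by the toy."""
    heard = set(heard_text.lower().split())
    return [item["content"] for item in store
            if any(k.lower() in heard for k in item["keywords"])]
```

In the Fig. 91 scenario, TV speech containing "baseball" would trigger the content item generated earlier from the Yogi Berra web search.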

Reference is now made to Fig. 93, which is a simplified table in the context of Fig. 91 showing a database utilized in storing content generated via the web and in profiling a user according to the content. The database includes content items generated via web browsing 8080. Attached to each content item are keywords 8081 pertaining to the item, such as keywords utilized in web searching or keywords defined in the HTML code of the web sites from which the content item was extracted. There are also categories 8082 pertaining to the keywords 8081 attached to each item. The keywords 8081 and categories 8082 are utilized both in profiling a user and in defining conditions for employment of content.

Reference is now made to Fig. 94, which is a simplified pictorial illustration of a web browsing system comprising a toy and providing an interrogation functionality for obtaining information from other interactive toys, in accordance with a preferred embodiment of the present invention. Turning to Fig. 94, it is seen that a user asks a toy 8090 a question relating to baseball. A computer 8092 searches for an answer to the question via web browsing, as illustrated in Fig. 91. If the search fails to produce an adequate answer, computer 8092 sends the question to a toy server 8094, which delivers the question to toy users likely to know the answer, including the user of toy 8091. Toy 8091 verbalizes the question to its user. The user answers the question, and computer 8093 sends the answer to server 8094. Server 8094 sends the answer to computer 8092. Toy 8090 verbalizes the answer to the user.

Reference is now made to Fig. 95, which is a simplified flowchart of the interrogation functionality of Fig. 94. The user introduces a question to toy 8090. Computer 8092 utilizes web-browsing functionality to search for an adequate answer to the question in the method illustrated in Fig. 92. If the search fails to produce an answer, computer 8092 sends the question to toy server 8094. Toy server 8094 selects a list of online users who are likely to know the answer to the question, the list including the user of toy 8091. It is appreciated that server 8094 selects the list according to the user profiles, for example according to a database illustrated in Fig. 93, comprising keywords pertaining to toy content produced via web browsing. Server 8094 sends the question to users on the list until an answer is received. It is appreciated that server 8094 verifies the answer by sending the question to a plurality of users. Server 8094 sends the answer to computer 8092. Computer 8092 sends the answer to toy 8090, and toy 8090 verbalizes the answer to the user.
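The user-selection step of Fig. 95 can be sketched as a profile-keyword match. The profile layout (a set of keywords per toy, as suggested by the Fig. 93 database) and all names here are assumptions for illustration only.

```python
# Illustrative sketch of the candidate selection of Fig. 95: pick the online
# users whose profile keywords overlap the words of the unanswered question.
def candidate_answerers(question, profiles, online):
    """profiles: toy_id -> set of profile keywords; online: set of toy IDs."""
    q_words = set(question.lower().split())
    return sorted(tid for tid in online
                  if profiles.get(tid, set()) & q_words)
```

The server would then send the question to these candidates in turn (or to several at once to cross-check the answer) until a reply is received.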

It is appreciated that the functionality of Figs. 91, 92 and 93 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; wherein the web browsing functionality produces content which is subsequently employed by the toy.

It is also appreciated that the functionality of Figs. 91, 92 and 93 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; wherein the web browsing functionality produces content which is subsequently employed by the toy; and wherein the content is added to a stored user profile.

It is appreciated that the functionality of Figs. 91, 92, 93, 94 and 95 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; and wherein the user interface also includes interrogation functionality for obtaining information from other interactive toys networked therewith.

A web browsing system comprising a toy that employs a user characteristic as an input is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 96, which is a simplified pictorial illustration of a web browsing system that employs as an input a user characteristic derived from an interaction between the user and the toy, in accordance with a preferred embodiment of the present invention. Turning to Fig. 96, it is seen that a user mentions the name of a music band, namely The Five Eagles Band, during an interaction with a toy 8100. At a later time, toy 8100, utilizing a web-browsing functionality, detects that the band will give a concert at a place near the residence of the user. Toy 8100 informs the user of the concert.

Reference is now made to Fig. 97, which is a simplified flowchart of the web browsing functionality of Fig. 96. A computer 8101 browses the web, searching for activities matching the user characteristics and the user location. Computer 8101 utilizes standard search engines, using keywords defining the user characteristics, such as "The Five Eagles Band", as search terms. Computer 8101 also utilizes in its web browsing web indexes and web sites specializing in events and activities. It is appreciated that an updated list of such sites is provided by a toy server 8102. If an activity matching the user characteristics is found, computer 8101 generates a message to the user, utilizing techniques of natural language processing known in the art, the message informing the user of the activity. Toy 8100 verbalizes the message to the user.
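
The matching step described above can be sketched as follows; this is a minimal illustrative sketch, not the disclosed implementation, and the event structure, field names and example data are all assumptions.

```python
# Hypothetical sketch of the activity-matching step of Fig. 97: candidate
# events found on the web are filtered by the user's tracked keywords and
# by the user's location.
def match_activities(events, user_keywords, user_city):
    """Return events near the user that mention a tracked keyword."""
    keywords = [kw.lower() for kw in user_keywords]
    return [
        event for event in events
        if event["city"] == user_city
        and any(kw in event["description"].lower() for kw in keywords)
    ]

events = [
    {"description": "The Five Eagles Band live in concert", "city": "Springfield"},
    {"description": "Annual flower show", "city": "Springfield"},
    {"description": "The Five Eagles Band album signing", "city": "Shelbyville"},
]
print(match_activities(events, ["The Five Eagles Band"], "Springfield"))
```

A real system would obtain the candidate events from search engines and event sites rather than from a literal list.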

Reference is now made to Fig. 98, which is a simplified table in the context of Fig. 97 showing a database record utilized in matching an activity to a user. The table illustrated includes a user's residential address 8120, supplied to server 8102 at registration. The table also includes keywords 8121, detected in the user's interactions with the toy, such as words and phrases detected in the speech of the user. It is appreciated that a record also contains the number of occurrences of each of the keywords. It is also appreciated that computer 8101 tracks the occurrences of keywords from a defined list of keywords provided by toy server 8102, which is updated regularly. It is further appreciated that the list is partially obtained via interactions of a multiplicity of users and toys, for example by detecting recurrent search terms in web browsing, by obtaining lists of proper names recurrent in the speech of users, or by detecting recurrent phrases in the speech of a multiplicity of users. In matching an activity to a user, computer 8101 utilizes the keywords in such a record as search terms. It is appreciated that a record as illustrated may be stored on server 8102, on computer 8101 or on both.
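
The keyword-occurrence record of Fig. 98 can be sketched as a counter updated from each utterance; the tracking list here is an illustrative stand-in for the server-provided list.

```python
# A minimal sketch of the record of Fig. 98: occurrence counts are
# accumulated only for keywords on the server-provided tracking list
# (the list contents below are illustrative assumptions).
from collections import Counter

TRACKED_KEYWORDS = {"the five eagles band", "basketball", "dinosaurs"}

def update_record(record, utterance):
    """Increment the count of each tracked keyword found in an utterance."""
    text = utterance.lower()
    for keyword in TRACKED_KEYWORDS:
        if keyword in text:
            record[keyword] += 1
    return record

record = Counter()
update_record(record, "I really like The Five Eagles Band")
update_record(record, "The Five Eagles Band is playing basketball arenas")
print(record["the five eagles band"], record["basketball"])  # 2 1
```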

It is appreciated that the functionality of Figs. 96, 97 and 98 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; and wherein at least one user characteristic ascertained from earlier interaction between the toy and a user is employed as an input in the web browsing functionality.

It is also appreciated that the functionality of Figs. 96, 97 and 98 taken together is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; wherein at least one user characteristic ascertained from earlier interaction between the toy and a user is employed as an input in the web browsing functionality; and wherein the at least one user characteristic is employed by the web browsing functionality for matching the user with an activity offering functionality.

A web browsing system comprising a toy and providing employment agency functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 99 which is a simplified flowchart of a web browsing system providing employment agency functionality in accordance with a preferred embodiment of the present invention. As is seen in Fig. 99, a user requests toy 8150 to find him an employment offer. Computer 8151, connected to toy 8150, browses the web for appropriate employment offers. Computer 8151 utilizes for browsing standard search engines and web indexes. It is appreciated that computer 8151 also utilizes lists of web sites specializing in employment offers, the lists maintained by toy server 8152 and downloaded to computer 8151 via the Internet. Computer 8151 compares the employment offers found with the user's residence, age and free time, in order to compile a list of offers appropriate to the user. Computer 8151 obtains information regarding the user's free time by tracking user interactions with the toy, the interactions implying that the current time is free time. By tracking the interactions computer 8151 detects times that are regularly free. It is appreciated that computer 8151 may also obtain information regarding the free time of the user via toy-diary functionality as illustrated hereinabove. It is also appreciated that computer 8151 includes a database elaborating employment types appropriate to different ages. Computer 8151 thus builds a list of employment offers appropriate to the user. Computer 8151 sends messages regarding the offers to toy 8150 and toy 8150 verbalizes the messages to the user. If the user accepts an offer, computer 8151 supplies the user with further details regarding the offer.
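
The free-time inference described above can be sketched by binning interaction timestamps into weekly slots; the recurrence threshold is an illustrative assumption.

```python
# A hedged sketch of the free-time inference of Fig. 99: interaction times
# are binned by (weekday, hour), and slots that recur week after week are
# treated as regularly free. The threshold of 3 weeks is an assumption.
from collections import Counter
from datetime import datetime

def regularly_free_slots(interaction_times, min_weeks=3):
    """Return the (weekday, hour) slots seen in at least min_weeks interactions."""
    slots = Counter((t.weekday(), t.hour) for t in interaction_times)
    return {slot for slot, count in slots.items() if count >= min_weeks}

# Four consecutive Mondays at 17:00 imply that slot is regularly free.
times = [datetime(2001, 3, day, 17, 0) for day in (5, 12, 19, 26)]
print(regularly_free_slots(times))  # {(0, 17)}
```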

It is appreciated that the functionality of Fig. 99 is particularly appropriate to a web browsing system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having web browsing functionality; wherein at least one user characteristic ascertained from earlier interaction between the toy and a user is employed as an input in the web browsing functionality; wherein the at least one user characteristic is employed by the web browsing functionality for matching the user with an activity offering functionality; and wherein the activity offering functionality is an employment agency functionality.

A knowledge management system comprising an interactive toy being connectable to the Internet is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 100 which is a simplified pictorial illustration of a knowledge management system comprising information retrieval functionality, information synthesis functionality and information filtering functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 100 it is seen that toy 8500 presents to a user information regarding the mountain gorilla, the information delivered partially via the toy's speech and partially via computer monitor 8502. The information is retrieved from two disparate web sites maintained on web servers 8505 and 8506, one of them supplying the graphical illustration displayed on monitor 8502 and the other supplying the content on which the verbal presentation is based.

Reference is now made to Fig. 101 which is a simplified flowchart of the information retrieval, filtering and synthesis functionality of Fig. 100. The user requests from toy 8500 information on a certain subject, such as mountain gorillas. Toy 8500 picks up the user's utterance and sends it to computer 8501. Computer 8501 converts the utterance to text and identifies the information request. It is appreciated that computer 8501 verifies the identification by sending the identified request, possibly rephrased, to toy 8500, which verbalizes it to the user and asks for his verification. Computer 8501 extracts keywords from the information request, for example by filtering out recurrent words such as prepositions and common verbs. Computer 8501 browses the web for the information requested, based on the keywords. It is appreciated that computer 8501 utilizes for browsing lists of information sources on the web, such as encyclopedias, dictionaries and picture databases, the lists maintained by toy server 8503. It is appreciated that the lists are categorized by information type, by subject and by difficulty level. It is also appreciated that computer 8501 utilizes for browsing a categorized dictionary enabling computer 8501 to detect the categories of keywords in an information request and thus access appropriate information sources from the categorized lists maintained by toy server 8503. Computer 8501 finds textual information on the subject requested in a web site maintained on web server 8504. Computer 8501 retrieves the information. Computer 8501 produces a verbal information presentation, based on the information retrieved. It is appreciated that computer 8501 applies techniques of text summarization known in the art for that purpose. Computer 8501 searches for graphical illustrations appropriate to the verbal presentation, for example by searching for pictures pertaining to the keywords in the information request. Computer 8501 integrates explanations of the graphical illustrations into the verbal presentation. Computer 8501 sends the verbal presentation to toy 8500. The toy verbalizes the presentation. Computer 8501 displays the graphical illustrations on monitor 8502 in their appropriate places, for example, when the toy verbalizes an explanation of such an illustration.
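
The keyword-extraction step described above can be sketched by stop-word filtering; the stop-word list below is a tiny illustrative stand-in for a real one.

```python
# A minimal sketch of the keyword-extraction step of Fig. 101: common words
# such as prepositions and request verbs are filtered out, keeping the
# content-bearing terms of the information request.
STOP_WORDS = {"tell", "me", "about", "the", "a", "an", "of", "on", "in",
              "i", "want", "to", "know", "please", "what", "is"}

def extract_keywords(request):
    """Keep only the content-bearing words of an information request."""
    words = request.lower().rstrip("?!.").split()
    return [w for w in words if w not in STOP_WORDS]

print(extract_keywords("Tell me about the mountain gorilla"))
# ['mountain', 'gorilla']
```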

It is appreciated that the functionality of Figs. 100 and 101 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality.

It is also appreciated that the functionality of Figs. 100 and 101 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the information management functionality includes at least one of information retrieval functionality, information synthesis functionality and information filtering functionality.

It is further appreciated that the functionality of Figs. 100 and 101 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the user interface includes a voice interactive functionality.

A knowledge management system comprising a telephone dialer is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 102 which is a simplified pictorial illustration of a knowledge management system comprising a phone dialer in accordance with a preferred embodiment of the present invention. Turning to Fig. 102 it is seen that a user requests toy 8550 to dial a destination for him, namely Dr. Hutchinson, the destination appearing in the user's diary. Computer 8551 retrieves the phone number of the destination and dials the number. Toy 8550 informs the user that the number was dialed and requests that the user pick up the phone.

Reference is now made to Fig. 103, which is a simplified flowchart of the telephone inquiry functionality of Fig. 102. Computer 8551 detects a proper name in a user diary, such as a toy-diary system illustrated hereinabove, the proper name lacking an associated phone number. Computer 8551 initiates an inquiry procedure in order to obtain the lacking phone number. Computer 8551 browses the web for yellow pages web sites, such as a site maintained on web server 8554. It is appreciated that the URLs of such yellow pages sites are stored on toy server 8553 and downloaded to computer 8551 via the Internet. If a phone number matching the proper name is not found, computer 8551 sends toy 8550 a message requesting the user to obtain the phone number. If a single such phone number is found, computer 8551 adds the number to the user's diary. If more than one number is found, computer 8551 initiates a differentiating procedure in order to establish which is the correct number. Computer 8551 downloads all search results matching the proper name. Computer 8551 then compares each search result with all other results, thus establishing for each result a differentiating detail, such as the residence associated with a phone entry. Computer 8551 then goes through the list of differentiating details in order to establish the correct phone number. For each detail on the list, toy 8550 asks the user whether the detail is valid in relation to the proper name, until the correct phone number is found and added to the diary.
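
The differentiating procedure described above can be sketched as follows; the directory-entry structure and field names are illustrative assumptions.

```python
# A hedged sketch of the differentiating procedure of Fig. 103: for each
# candidate directory entry, pick a detail not shared with any other entry,
# so the toy can ask the user one yes/no question per candidate.
def differentiating_detail(candidate, others, fields=("street", "city")):
    """Return (field, value) distinguishing candidate from all other entries."""
    for field in fields:
        value = candidate.get(field)
        if value is not None and all(o.get(field) != value for o in others):
            return field, value
    return None

entries = [
    {"name": "Dr. Hutchinson", "street": "Oak Lane", "phone": "555-0101"},
    {"name": "Dr. Hutchinson", "street": "Elm Road", "phone": "555-0199"},
]
for i, entry in enumerate(entries):
    rest = entries[:i] + entries[i + 1:]
    print(entry["phone"], differentiating_detail(entry, rest))
```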

Reference is now made to Fig. 104, which is a simplified flowchart of the telephone dialer functionality of Fig. 102. The user asks toy 8550 to dial a destination. Toy 8550 picks up the user's utterance and sends it to a computer 8551. Computer 8551 converts the utterance to text and identifies a destination, for example by recognizing a proper name within the utterance. Computer 8551 checks whether the destination appears in the user's diary, such as a toy-diary system described hereinabove. If the destination appears in the user's diary, computer 8551 retrieves the corresponding telephone number from the diary. Computer 8551 then dials the number on a telephone line to which it and a telephone set 8552 are both connected. After dialing, toy 8550 asks the user to pick up the phone. If computer 8551 does not find a telephone number, toy 8550 requests the number from the user.

It is appreciated that the functionality of Figs. 102, 103 and 104 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the user interface includes a telephone dialer.

It is also appreciated that the functionality of Figs. 102, 103 and 104 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the user interface includes a telephone inquiry functionality.

It is further appreciated that the functionality of Figs. 102, 103 and 104 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the user interface includes a download to diary functionality.

A knowledge management system comprising donor and charity recipient matching functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 105, which is a simplified block diagram illustration of a matching functionality employing the user profile information collected by an interactive toy in accordance with a preferred embodiment of the present invention. As is seen in Fig. 105, a plurality of charity organizations, including organizations 8601, 8602 and 8603, deliver to a toy server 8600 criteria defining profiles of potential donors or volunteers for the organizations. The criteria are defined in terms of signs typically detectable by an interactive toy, such as words and phrases recurrent in the interactions of a user with a toy. Toy server 8600 distributes the criteria to a multiplicity of computers, including computers 8604, 8605 and 8606, in order to match potential donors and volunteers with the plurality of charity organizations.

Reference is now made to Fig. 106, which is a simplified flowchart of the matching functionality of Fig. 105. Server 8600 sends criteria, such as keywords and key phrases, to computer 8604. Computer 8604 tracks via toy 8607 the fulfillment of the criteria by the user. Computer 8604 tracks, for example, whether the user utters keywords in speech picked up by the toy, and the number of occurrences of the keywords. It is appreciated that computer 8604 also tracks the fulfillment of Boolean phrases comprised of such keywords, for example by tracking whether a user utters a word in time propinquity to another word. It is further appreciated that the computer may track other criteria, such as web sites visited by a user accompanied by a toy. If computer 8604 detects fulfillment of one of the profile criteria received from toy server 8600, then computer 8604 reports the fulfillment to server 8600. Server 8600 then sends the computer a message suggesting that the user volunteer for, or donate to, the organization whose profile criteria were fulfilled by the user. Toy 8607 verbalizes the message to the user. It is appreciated that if a user wishes to donate to a charity organization, server 8600 may effect the transaction involved by debiting an account of the user. Alternatively, the donation is effected via a credit card. It is appreciated that if a user wishes to volunteer for such an organization, toy 8607 will supply him with the further information needed.

Reference is now made to Fig. 107, which is a simplified table in the context of Fig. 106 showing a database record utilized by the toy in user profiling. The illustrated database record elaborates the profile criteria utilized by computer 8604 to detect whether a user is a potential volunteer or donor to any of a multiplicity of charity organizations. The first column 8630 of the table details the names of the charity organizations. The second column 8631 of the table elaborates the profile conditions sent to computer 8604 from the organizations via toy server 8600. In the illustrated table the criteria are Boolean phrases to be detected in the interactions of the user with toy 8607, for example in the speech of the user to toy 8607. Computer 8604 tracks the fulfillment of the criteria by the user. For example, computer 8604 tracks whether the user uttered the word "hate" within a defined propinquity to the word "fur", thus fulfilling the criterion "hate" near "fur".
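
The "hate" near "fur" criterion can be sketched as a word-distance test; the window size is an illustrative assumption.

```python
# A minimal sketch of the "near" Boolean criterion of Fig. 107: two keywords
# fulfill a "near" phrase when they occur within a word-distance window.
def near(transcript, word_a, word_b, window=5):
    """True if word_a and word_b occur within `window` words of each other."""
    words = transcript.lower().split()
    positions_a = [i for i, w in enumerate(words) if w == word_a]
    positions_b = [i for i, w in enumerate(words) if w == word_b]
    return any(abs(a - b) <= window for a in positions_a for b in positions_b)

print(near("i really hate coats made of fur", "hate", "fur"))  # True
print(near("i hate waking up early but my cat has soft fur", "hate", "fur"))  # False
```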

It is appreciated that the functionality of Figs. 105, 106 and 107 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the information management functionality includes matching functionality operative to match potential donors with potential charity recipients.

It is also appreciated that the functionality of Figs. 105, 106 and 107 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; wherein the information management functionality includes matching functionality operative to match potential donors with potential charity recipients; and wherein the matching functionality employs user profile information collected by the toy.

It is further appreciated that the functionality of Figs. 105, 106 and 107 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the information management functionality includes matching functionality operative to match potential volunteers with potential charity organizations.

A knowledge management system comprising user status determination functionality and help functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 108, which is a simplified flowchart of user status determination functionality and help functionality provided by a knowledge management system comprising an interactive toy in accordance with a preferred embodiment of the present invention. During an interaction of a user with a toy 8660, a computer 8661 identifies an irregular behavior pattern that might imply a user's illness or emotional distress. Computer 8661 initiates a symptom detecting procedure wherein toy 8660 asks the user for the causes of his behavior. Utilizing interactive scripts, toy 8660 asks whether the user feels physically well. If during the inquiry the user's speech implies symptoms of possible illness, for example if the user complains about pains he suffers, computer 8661 initiates a phone call to a physician, utilizing a dialer functionality as illustrated hereinabove in Figs. 102, 103 and 104. It is appreciated that a telephone number of a family physician is stored on computer 8661. Alternatively, the toy server provides telephone numbers of physicians. If symptoms of physical illness are not implied, computer 8661 checks whether symptoms of emotional distress, such as complaints about depression or sadness, are implied by the user's speech in the inquiry. Computer 8661 stores signs of such symptoms in a database, and then checks the accumulated data in order to detect cumulative symptoms of emotional distress. If such cumulative symptoms, such as recurrent complaints about bad mood, are detected, computer 8661 initiates a phone call to a center of psychological help and diagnosis. It is appreciated that a telephone number of such a center is stored on the toy server.

Reference is now made to Fig. 109, which is a simplified block diagram illustration of the symptom and irregular behavior detection functionality of Fig. 108. As seen in the figure, a multiplicity of input data types is employed in order to detect irregular behavior of a user that implies a possible physical illness or emotional distress. When such behavior is detected, a user's computer 8661 initiates an inquiry procedure that utilizes interactive scripts in order to identify specific physical or mental symptoms of the user. The user's computer 8661 utilizes a multiplicity of input data types in order to detect irregular behavior during an interaction with the user. Computer 8661 identifies non-verbal vocal inputs 8670 such as a crying sound implying distress. It employs techniques of voice analysis known in the art to infer an emotional state such as frustration, anger or depression from the user's voice 8671. Computer 8661 also analyzes speech content 8672 for the same purpose, such as by detecting keywords in the user's speech and by measuring the length of the user's utterances. It also uses tactile inputs, indicating for example whether a user hugs the toy more than usual. Computer 8661 also analyzes the structure of the interaction of the user with toy 8660, checking, for example, whether the user refuses the toy's suggestions more than usual. When the multiplicity of input data types indicates irregular behavior that implies a possible illness or emotional distress, computer 8661 initiates a conversation of toy 8660 with the user, operated by interactive scripts designed to elicit from the user indications of physical or mental symptoms. During the conversation computer 8661 continues to utilize the multiplicity of input data types in order to activate the interactive scripts, such as by detecting whether the user's answers are aggressive or impatient. During the conversation computer 8661 detects keywords and key phrases, such as "I don't feel well", "pain", "sad" and "depressed", that indicate a specific symptom.
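
The fusion of the input data types of Fig. 109 can be sketched as a weighted score; the weights and threshold below are purely illustrative assumptions, not values from the disclosure.

```python
# A hedged sketch of combining the input data types of Fig. 109 into a
# single irregularity score that triggers the inquiry scripts.
def irregularity_score(signals):
    """Weighted sum of distress signals, each scaled to the range 0..1."""
    weights = {
        "crying_sound": 0.40,         # non-verbal vocal input 8670
        "negative_voice_tone": 0.25,  # voice analysis 8671
        "distress_keywords": 0.20,    # speech content 8672
        "extra_hugging": 0.05,        # tactile input
        "refusal_rate": 0.10,         # interaction structure
    }
    return sum(w * float(signals.get(name, 0.0)) for name, w in weights.items())

signals = {"crying_sound": 1.0, "distress_keywords": 1.0, "refusal_rate": 0.5}
score = irregularity_score(signals)
print(round(score, 2), score >= 0.5)  # 0.65 True: would trigger the inquiry
```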

Reference is now made to Fig. 110, which is a simplified table in the context of Fig. 108 showing a database record utilized by computer 8661 in order to detect symptoms of possible illness or emotional distress. The illustrated record lists keywords and key phrases that computer 8661 searches for within the user's speech in an inquiry procedure illustrated in Figs. 108 and 109. The illustrated record also includes time-spanned indications relating to emotional symptoms, defining states wherein the accumulated data require psychological help.

It is appreciated that the functionality of Figs. 108, 109 and 110 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the information management functionality includes user status determination functionality operative to sense a wellness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the wellness status of the user.

It is also appreciated that the functionality of Figs. 108, 109 and 110 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the information management functionality includes user status determination functionality operative to sense a happiness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the happiness status of the user.

It is further appreciated that the functionality of Figs. 108, 109 and 110 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; wherein the information management functionality includes user status determination functionality operative to sense a happiness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the happiness status of the user; and wherein the user status determination functionality includes voice responsive functionality.

A knowledge management system comprising matching functionality operative to match possessions of potential donors with potential charity recipients is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 111 which is a simplified pictorial illustration of potential donors and potential charity recipients matching functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 111 it is seen that toy 8700 suggests to the user that the user donate a bicycle he had 4 years ago to a children's hospital. The user's response is picked up by toy 8700. Computer 8701 recognizes an affirmative response and reports the response to toy server 8702. Toy server 8702, communicating with a plurality of charity organizations, including organizations 8703 and 8704, arranges delivery of the donation to its destination.

Reference is now made to Fig. 112 which is a simplified flowchart illustration in the context of Fig. 111 showing possession reporting functionality. Toy 8700 detects that the user possesses an object. Information regarding possessions of the user is obtained by various means, such as tracking purchases the user effects via the toy or via conversations of the toy with the user. Computer 8701 reports the possession to toy server 8702. Toy server 8702 stores the information regarding the possession in a database containing reports regarding a multiplicity of users. Such information is then utilized in matching potential donors with charity recipients.

Reference is now made to Fig. 113 which is a simplified table in the context of Fig. 112 showing a database utilized in matching potential donors with potential charity recipients. The database illustrated contains information regarding possessions 8721 of a multiplicity of users 8720. For each possession the database registers the year of report 8722 of the possession and the age 8723 of the user at the time of the report. The database enables toy server 8702 to send donation requests to users a long period, typically a number of years, after a possession report, thus increasing the probability of affirmative responses to the donation requests. It is appreciated that the duration of the waiting period between a possession report and a donation request regarding the possession varies according to the type of object involved and to the age of the user. For example, toy server 8702 may send a donation request for a book that a user at the age of 9 had been reported to possess a year ago, but would wait a longer period for a donation request for a book of a user at the age of 15 or for a donation request for a more valuable object such as a bicycle.
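
The waiting-period rule of Fig. 113 can be sketched as follows; the concrete periods and the age threshold are illustrative assumptions only.

```python
# A minimal sketch of the waiting-period rule of Fig. 113: the delay before
# a donation request grows with the value of the object and with the age of
# the user at the time of the possession report.
def waiting_years(object_type, age_at_report):
    """Years to wait after a possession report before requesting a donation."""
    base = {"book": 1, "toy": 2, "bicycle": 4}.get(object_type, 3)
    if age_at_report >= 12:  # older users are assumed to keep items longer
        base += 2
    return base

print(waiting_years("book", 9))     # 1, matching the example in the text
print(waiting_years("book", 15))    # 3, a longer period for an older user
print(waiting_years("bicycle", 9))  # 4, longer for a more valuable object
```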

Reference is now made to Fig. 114 which is a simplified flowchart of the matching functionality of Fig. 111. Toy server 8702 receives donation requests from a plurality of charity organizations, including organizations 8703 and 8704, the requests typically specifying the objects requested. Toy server 8702 utilizes a database as illustrated in Fig. 113 in order to locate potential donors, such as users that have been reported to possess a requested object at least a defined period of time prior to the request. Toy server 8702 sends donation requests to the computers of the users thus located. The toys of the users, such as toy 8700, verbalize the donation requests to the users.

It is appreciated that the functionality of Figs. 111, 112, 113 and 114 taken together is particularly appropriate to a knowledge management system comprising: an interactive toy being connectable to the Internet, the interactive toy including a user interface having information management functionality; and wherein the information management functionality includes matching functionality operative to match possessions of potential donors with potential charity recipients.

An interactive persona system is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 115, which is a simplified partly pictorial partly schematic illustration of an interactive persona system comprising a three-dimensional artificial person, in accordance with a preferred embodiment of the present invention. Turning to Fig. 115, it is seen that a three-dimensional artificial person 9001 communicates, typically wirelessly, with a computer 9002, which, in turn, communicates, typically via the Internet, with an interactive toy server 9003 and a server 9004 which provides medical and health services. As also seen in Fig. 115, computer 9002 also communicates via a telephone network 9006 and a cellular phone network 9007 with a medical and health phone services system 9005. Computer 9002 also communicates with a fax machine 9008, a printer 9009 and a telephone 9010.

Reference is now made to Fig. 116, which is a simplified partly pictorial partly schematic illustration of three-dimensional artificial person 9001 of Fig. 115. Turning to Fig. 116, it is seen that three-dimensional artificial person 9001, providing a visual impression of a physician, includes the following elements: a video camera 9011, a retractable fiber optics probe 9012, magnifying lenses 9013, ECG electrode/s 9014, EEG electrodes 9015, an eye piece 9016, a temperature probe 9017, an electrical pulse amplification system 9018, an ultrasound probe 9019, a sound amplification system 9020, an ultrasound amplification system 9021, blood monitoring sensor/s 9022, urine monitoring sensor/s 9023, a retractable stethoscope 9024 and an inflatable sphygmomanometer cuff 9025.

Reference is now made to Fig. 117, which is a simplified flowchart of the interaction functionality of three-dimensional artificial person 9001 of Figs. 115 and 116. Turning to Fig. 117, it is seen that in the course of verbal interaction between artificial person 9001 and a user, the artificial person connects to an on-line doctor and/or a local doctor via medical and health phone services system 9005 of Fig. 115. In the illustrated embodiment, artificial person 9001 communicates based on instructions received from computer 9002, which instructions are based on analysis of verbal input by the user received via artificial person 9001 and verbal input from the on-line doctor of system 9005 received via phone network 9006.

Reference is now made to Fig. 118, which is a simplified flowchart of another interaction functionality of three-dimensional artificial person 9001 of Figs. 115 and 116. Turning to Fig. 118, it is seen that in the course of verbal interaction between artificial person 9001 and a user, artificial person 9001 receives indications of the condition of the user via any one or more of the probes, such as temperature probe 9017 and fiber optics probe 9012, which indications artificial person 9001 relays typically via the Internet to medical center server 9004. Thereafter, artificial person 9001 continues to communicate with the user based on instructions received from medical server 9004, either via computer 9002 or directly via an Internet connection on artificial person 9001, such as a connection to a cellular phone network.

It is appreciated that the functionality of Figs. 115, 116, 117 and 118 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona.

It is also appreciated that the functionality of Figs. 115, 116, 117 and 118 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona, and wherein the voice responsive interactive functionality employs artificial intelligence.

It is also appreciated that the functionality of Figs. 115, 116, 117 and 118 taken together is particularly appropriate to an interactive persona system comprising a three-dimensional artificial person including: a computer and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona, and wherein the three-dimensional artificial person has at least one of an appearance and voice which is characteristic of the persona.

It is also appreciated that the functionality of Figs. 115, 116, 117 and 118 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including a computer and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona, and wherein the three-dimensional artificial person has a pattern of behavior associated with a physician.

It is also appreciated that the functionality of Figs. 115, 116, 117 and 118 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the three-dimensional artificial person has a pattern of behavior associated with a physician, and physician accouterments including at least one of: blood pressure sensor; heart activity sensor; brain activity sensor; visual sensors; temperature sensor; breathing sensor; metabolite sensor; and medicament dispenser.

Reference is now made to Figs. 119A and 119B, which are simplified flowcharts in the context of Figs. 115, 116, 117 and 118, showing programming and control functionality of the interactive persona system of Figs. 115, 116, 117 and 118. A user chooses, preferably via the keyboard and monitor of computer 9002, one of the three following options of persona system functions: toy functionality, doctor functionality via phone system, and doctor functionality via medical Internet server. According to the first option of toy functionality, three-dimensional artificial person 9001 behaves as a toy, which entertains the user and provides no medical services. According to the second option of doctor functionality via phone system, three-dimensional artificial person 9001 connects to system 9005 of medical phone services as shown in Fig. 117 described hereinabove. This option allows the user to communicate with an online doctor while computer 9002 provides content input to artificial person 9001 based on verbal input by both the user and the online doctor. According to the third option of doctor functionality via Internet system, three-dimensional artificial person 9001 connects to system 9004 of medical Internet services as shown in Fig. 118 described hereinabove. This option allows artificial person 9001 to receive content input from server 9004, which preferably includes a computer having greater computing power than that of computer 9002.
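The three-way mode selection described above can be sketched in Python as a simple dispatch; all names and return values here are illustrative assumptions, not part of the original disclosure:

```python
# Hypothetical sketch of the three persona-system modes of Figs. 119A-119B.
# Mode names and content-source labels are illustrative assumptions.

TOY, PHONE_DOCTOR, INTERNET_DOCTOR = "toy", "phone_doctor", "internet_doctor"

def select_content_source(mode):
    """Return which component drives artificial person 9001 in each mode."""
    if mode == TOY:
        # Entertainment only; local computer 9002 supplies all content.
        return "computer_9002"
    if mode == PHONE_DOCTOR:
        # Computer 9002 mediates between the user and an on-line doctor
        # reached through phone-services system 9005.
        return "computer_9002+phone_system_9005"
    if mode == INTERNET_DOCTOR:
        # Medical Internet server 9004 supplies content directly.
        return "medical_server_9004"
    raise ValueError(f"unknown mode: {mode}")
```

In such a dispatch, the toy itself stays unchanged between modes; only the source feeding it content differs.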

It is appreciated that the functionality of Figs. 115, 116, 117, 118, 119A and 119B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is at least partially programmable.

It is also appreciated that the functionality of Figs. 115, 116, 117, 118, 119A and 119B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is at least partially programmable by a user.

It is also appreciated that the functionality of Figs. 115, 116, 117, 118, 119A and 119B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is at least partially programmable other than by a user.

It is also appreciated that the functionality of Figs. 115, 116, 117, 118, 119A and 119B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is at least partially remotely programmable.

It is also appreciated that the functionality of Figs. 115, 116, 117, 118, 119A and 119B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is at least partially controllable via a computer network.

It is also appreciated that the functionality of Figs. 115, 116, 117, 118, 119A and 119B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is at least partially controllable via a computer network in real time.

An interactive persona system comprising a three-dimensional artificial person having a pattern of behavior of a teacher is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 120, which is a simplified partly pictorial partly schematic illustration of an interactive persona system comprising a toy having a persona of a known non-teacher and a pattern of behavior of a teacher in accordance with a preferred embodiment of the present invention. Turning to Fig. 120, it is seen that a three-dimensional toy 9090 fashioned after King Richard teaches a user about the crusades. Toy 9090 receives teaching content from computer 9091. Computer 9091 may receive content via the Internet from living objects interactive system 9093, in accordance with toy 9090's persona. Alternately, toy 9090 receives content from living objects interactive system 9093 via external receiver 9092, such as a public wireless communication network.

Reference is now made to Fig. 121, which is a simplified flowchart in the context of Fig. 120 showing a teaching functionality provided by a toy having a persona of a famous non-teacher. Fig. 121 illustrates a three-dimensional toy 9100 having the persona of Albert Einstein teaching a physics lesson to a user. The user registers herself on a living objects interactive system 9101, typically identical with living objects interactive system 9093 of Fig. 120. In registration the user supplies system 9101 with personal information such as age and gender. System 9101 sends computer 9102, which is normally in communication with toy 9100, interactive teaching scripts selected in accordance with the user's age and with toy 9100's persona. Computer 9102 sends a message to toy 9100, suggesting to the user to learn about light. Toy 9100 verbalizes the message to the user and picks up the user's response. Computer 9102 converts the user's response to text and recognizes that the user agrees to the suggestion.
Computer 9102 sends a message to toy 9100, delivering instructions to perform a simple experiment. While the user prepares the experiment, toy 9100 picks up the user's voice and sends it to computer 9102. Computer 9102 converts the speech to text. Computer 9102 identifies in the text the keyword "music" and accordingly selects interim content to be actuated while the experiment is being prepared, the content designed in accordance with toy 9100's persona, namely content relating to Einstein's hobby of violin playing. Computer 9102 then recognizes from the user's speech that preparations for the experiment are complete, and sends toy 9100 a message containing further instructions for the experiment. Toy 9100 verbalizes the instructions. It is appreciated that toy 9100 is aided by a computer monitor in explaining the meaning of the experiment thus performed.
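The keyword-driven selection of interim content described above can be sketched as follows; the keyword table, persona keys and content strings are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of the interim-content selection of Fig. 121: a keyword
# detected in the user's transcribed speech triggers filler content matched
# to the toy's persona. The table entries are illustrative assumptions.

PERSONA_INTERIM_CONTENT = {
    ("einstein", "music"): "Einstein's hobby of violin playing",
    ("einstein", "boat"): "Einstein's fondness for sailing",
}

def select_interim_content(persona, transcribed_speech):
    """Pick persona-matched interim content triggered by a keyword in the
    user's transcribed speech, or None if no keyword matches."""
    words = transcribed_speech.lower().split()
    for (p, keyword), content in PERSONA_INTERIM_CONTENT.items():
        if p == persona and keyword in words:
            return content
    return None
```

The same lookup could be driven by scripts downloaded from system 9101, keeping the toy-side logic unchanged across personas.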

It is appreciated that the functionality of Figs. 120 and 121 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is that of a teacher.

It is also appreciated that the functionality of Figs. 120 and 121 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the pattern of behavior is that of a teacher; and wherein the persona is that of a known non-teacher.

An interactive persona system comprising a toy having a pattern of behavior of a coach is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 122, which is a simplified pictorial illustration of an interactive persona system comprising a toy having a persona of a coach in accordance with a preferred embodiment of the present invention. Turning to Fig. 122, it is seen that a user is using a fitness machine 9112 connected to a computer 9111, the machine communicating to the computer data regarding user performance. Toy 9110, which is typically fashioned after a famous coach and is in RF communication with computer 9111, speaks with the user regarding the user's performance and suggests increasing the level of difficulty of machine 9112. The user agrees and computer 9111 increases the level of difficulty.

Reference is now made to Fig. 123, which is a simplified flowchart in the context of Fig. 122 showing coaching functionality of an interactive persona system comprising a toy having a persona of a famous coach. Computer 9111 identifies fitness machine 9112 used by the user. If the machine is located in a public fitness institute, computer 9111 may identify the user by a user ID card inserted into machine 9112. Computer 9111 retrieves the user's training program. The program is stored on computer 9111 or on another computer, such as the user's home computer, communicating with computer 9111 via a computer network such as the Internet. Computer 9111 activates the machine according to the user's training program. Computer 9111 sends toy 9110 a message relating to fitness machine 9112. If the machine is at least partially user-controlled, the message contains parameters for activating the machine according to the user's training program. Toy 9110, which is typically fashioned after a known coach, verbalizes the message to the user.

It is appreciated that toy 9110's voice and speech are designed to mimic the known coach after which toy 9110 is fashioned, for example by the toy repeating rhythmic utterances associated with the known coach. Fitness machine 9112 sends computer 9111 data regarding the user's performance. If the performance concurs with the user's training program, toy 9110 suggests increasing the level of difficulty. If the user agrees, computer 9111 increases the level of difficulty of machine 9112. If the machine is user controlled, toy 9110 verbalizes to the user instructions for increasing the level of difficulty. If the user's performance does not concur with the user's training program, toy 9110 verbalizes an encouragement message to the user.
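The coaching decision logic of Figs. 122 and 123 can be sketched as follows; the numeric comparison of performance against a training-program target is an illustrative assumption, since the disclosure does not specify how concurrence is measured:

```python
# Hypothetical sketch of the coach toy's decision step in Fig. 123.
# Performance and target are assumed to be comparable numbers (e.g. reps
# or watts); the disclosure leaves the concurrence test unspecified.

def coach_response(performance, target, user_agrees=True):
    """Choose the coach toy's next action from machine-reported performance.

    Returns one of: "increase_difficulty", "keep_level", "encourage".
    """
    if performance >= target:
        # Performance concurs with the training program: suggest a harder
        # setting, applied only if the user agrees.
        return "increase_difficulty" if user_agrees else "keep_level"
    # Performance falls short: the toy verbalizes encouragement instead.
    return "encourage"
```

On a user-controlled machine, "increase_difficulty" would translate into verbalized instructions rather than a direct command to the machine.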

It is appreciated that the functionality of Figs. 122 and 123 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is that of a coach.

It is also appreciated that the functionality of Figs. 122 and 123 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the pattern of behavior is that of a coach; and wherein the persona is of a known coach.

An interactive persona system comprising a locomotive three-dimensional artificial person is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 124, which is a simplified schematic illustration in the context of Fig. 122 showing a locomotive toy having a persona of a coach in accordance with a preferred embodiment of the present invention. Turning to Fig. 124, it is seen that a locomotive coach toy 9144, such as the toy illustrated in Figs. 122 and 123, is in a public fitness club 9130 comprising a plurality of fitness machines 9132, 9133, 9134 and 9135. Toy 9144 is equipped with an engine enabling it to track the user as the user moves between different machines. Toy 9144 is equipped with four IR receivers 9140, 9141, 9142 and 9143, positioned on the left and right sides of the toy, on its back and on its front. Each fitness machine is equipped with one or more IR transmitters, such as 9136, 9137, 9138 and 9139. Each of the transmitters transmits a unique IR signal. Computer 9131 identifies the fitness machine a user is operating, for example by a personal magnetic ID card inserted into an appropriate magnetic card reader connected to the machine. Toy 9144 sends computer 9131 the IR signals received by receivers 9140, 9141, 9142 and 9143, and reports to computer 9131 which of the receivers receives which IR signal. Computer 9131 then identifies the location of toy 9144 by an IR signal the toy receives, and identifies the angular attitude of the toy by the IR receiver receiving the signal. Computer 9131 then sends toy 9144 motion and direction commands, calculated according to a virtual map of fitness club 9130 stored on computer 9131, until the toy receives an IR signal from the fitness machine currently operated by the user. Computer 9131 then sends toy 9144 further motion and direction commands, thus turning toy 9144 so that its front IR receiver 9140 receives the signal, indicating that toy 9144 faces the fitness machine.
It is appreciated that at least some of the IR transmitters and/or receivers are positioned at the edges of cone-shaped grooves limiting the vision fields of the transmitters and/or receivers, thus enabling computer 9131 to locate toy 9144 more precisely.
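The orientation step of Fig. 124, in which computer 9131 derives the angular attitude of toy 9144 from the identity of the receiver that picks up a machine's IR signal, can be sketched as follows; the 90-degree spacing of the four receivers is an illustrative assumption:

```python
# Hypothetical sketch of the orientation correction in Fig. 124. The toy
# reports which of its four receivers (front/right/back/left) picked up the
# target machine's unique IR signal; assuming the receivers sit 90 degrees
# apart, the computer can derive the clockwise turn that brings the front
# receiver 9140 to face the machine.

RECEIVER_BEARING = {"front": 0, "right": 90, "back": 180, "left": 270}

def turn_to_face(receiving_side):
    """Degrees to turn the toy clockwise so its front receiver faces the
    IR transmitter whose signal arrived on `receiving_side`."""
    return RECEIVER_BEARING[receiving_side]
```

For example, a signal arriving on the right-side receiver implies the machine is 90 degrees clockwise of the toy's heading, so a 90-degree clockwise turn aligns the front receiver with it.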

It is appreciated that the functionality of Fig. 124 is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the three-dimensional artificial person is locomotive.

An interactive persona system comprising a three-dimensional artificial person having a pattern of behavior of a guide is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 125, which is a simplified partly pictorial partly schematic illustration of a three-dimensional artificial guide, a central computer and the flow of information therebetween in accordance with a preferred embodiment of the present invention. Turning to Fig. 125, it is seen that a three-dimensional artificial person 9151, which may have a persona fashioned after a known guide such as Marco Polo, includes the following elements: a video camera 9152, a pointing arm 9157, an arm tilt sensor 9153, a cellular antenna 9154, a GPS device 9158, a compass 9155 and a microphone and speaker system 9156. Fig. 125 also shows a suitable Internet server 9160 comprising a central computer 9161, a database 9162 of digital pictures and a database record 9163 of places to visit. In the illustrated embodiment, central computer 9161 receives a typically wireless transmission from artificial person 9151 comprising a digital picture 9165 taken by video camera 9152, a location indication by GPS device 9158, a direction indication 9167 by compass 9155, and an arm tilt reading by arm tilt sensor 9153. Computer 9161 also retrieves information from database record 9162 of digital pictures and database record 9163 of places to visit. Thus it may be appreciated that computer 9161 is operative to provide content input to artificial person 9151, which content includes both verbal content to be verbalized to the user via speaker 9156 and instructions for motion of pointing arm 9157 in accordance with the verbal content and the objects in the user's vicinity.

Reference is now made to Fig. 126, which is a simplified flowchart illustration of the functionality of the interactive persona system of Fig. 125. Computer 9161 receives the user's verbal input received via microphone 9156 on artificial person 9151. Computer 9161 derives from the user's verbal input an indication of an object by means of speech recognition software. Based on the indication of the object requested by the user and an indication of the user's location received via GPS 9158, computer 9161 retrieves from database record 9163 the location of the object requested by the user. Based on the retrieved location of the object and the previously received location of the user, computer 9161 calculates the relative direction of the object with respect to the user. Based on the calculated relative direction, the direction of artificial person 9151 received from compass 9155, and the arm tilt reading of sensor 9153, computer 9161 calculates the tilt of pointing arm 9157 required in order to point at the object requested by the user. Let: α be the angle measured by compass 9155, namely the relative direction which artificial person 9151 faces with respect to the magnetic north; β be the relative direction of the requested object with respect to the user; and χ be the angle by which arm 9157 is tilted to the left of the direction which artificial person 9151 faces.

Then the desired angle δ by which pointing arm 9157 needs to be tilted clockwise with respect to its current tilt may be obtained from the following formula: δ = β - α + χ.

Computer 9161 instructs artificial person 9151 to tilt arm 9157 by the desired angle. Upon receiving an arm tilt sensor reading from sensor 9153, which reading shows that arm 9157 is pointing in the required direction, computer 9161 provides verbal content to artificial person 9151, informing the user that the requested object is in the direction in which arm 9157 is pointing.
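The arm tilt computation of Fig. 126 can be sketched as follows; the normalization of the result to the range [0, 360) is an illustrative assumption added so that the command is always expressed as a clockwise angle:

```python
# Hypothetical sketch of the pointing-arm formula delta = beta - alpha + chi:
#   alpha - compass bearing the artificial person faces (degrees from north)
#   beta  - bearing of the requested object with respect to the user
#   chi   - current arm tilt, measured to the left of the facing direction
# The arm's current absolute bearing is alpha - chi, so the clockwise
# correction is beta - (alpha - chi) = beta - alpha + chi.

def arm_tilt_adjustment(alpha, beta, chi):
    """Clockwise tilt (degrees) to apply to pointing arm 9157, normalized
    to [0, 360); normalization is an assumption beyond the text's formula."""
    return (beta - alpha + chi) % 360
```

For example, with the person facing 45 degrees, the object at bearing 90 degrees, and the arm tilted 10 degrees left of the facing direction, the required clockwise correction is 90 - 45 + 10 = 55 degrees.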

Thereafter, computer 9161 receives a digital picture 9165 via video camera 9152, and verbal input from the user, inquiring whether the object pointed to by camera 9152 on artificial person 9151 is the object previously inquired about by the user. Based on the previously assessed location of the user, computer 9161 retrieves from database record 9162 a set of digital pictures of objects in the current vicinity of the user. Computer 9161 matches the set of pictures with digital picture 9165 and chooses picture 9168 of the set of pictures, which is the most likely to show the object shown in picture 9165. Based on the identity of the object shown in picture 9168, as indicated by the database registration of database record 9162, computer 9161 provides verbal content input to artificial person 9151, informing the user whether the object pointed to by camera 9152 is the object inquired about by the user.

It is appreciated that the functionality of Figs. 125 and 126 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is that of a guide.

It is also appreciated that the functionality of Figs. 125 and 126 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the pattern of behavior is that of a guide; and wherein the persona is of a known guide.

Reference is now made to Fig. 127A, which is a block diagram illustration of another functionality of the interactive persona system of Fig. 125. Turning to Fig. 127A, it is seen that computer 9161 of Fig. 125 may receive information from a database record 9171 of places to visit, which may be identical to database record 9163 of Fig. 125, a database record 9172 of user information, not shown in Fig. 125, a weather forecast database 9173, a trip schedule 9174, and a verbal request by the user. Based on these inputs, computer 9161 may provide suggestions of places to visit as well as guidance on visiting a particular site. It is appreciated that database record 9172 of user information and trip schedule 9174 may be stored on another computer, such as a home computer of the user, from which computer the information may be retrieved by central computer 9161.

Reference is now made to Fig. 127B, which is a flowchart in the context of Fig. 125 showing the functionality of Fig. 127A. Turning to Fig. 127B, it is seen that on arriving at a new town, an interactive persona system may suggest places to go based on a request by the user, or the system may come up with suggestions based on information retrieved from database record 9172 of user information, such as the user's interests. Other options whereby the interactive persona system of Fig. 125 may suggest places to visit are described in Fig. 127B in a self-explanatory way.

It is appreciated that the functionality of Figs. 125, 126, 127A and 127B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the pattern of behavior is determined at least in part by at least one user characteristic known to the artificial person.

It is also appreciated that the functionality of Figs. 125, 126, 127A and 127B taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the pattern of behavior is that of a guide; and wherein the pattern of behavior is determined at least in part by at least one user characteristic known to the artificial person.

An interactive persona system comprising a toy having a pattern of behavior associated with a comedian is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 128, which is a simplified pictorial illustration of an interactive toy having a persona of a comedian in accordance with a preferred embodiment of the present invention. Turning to Fig. 128, it is seen that toy 9210, which is typically fashioned after a known comedian, tells a user a context related joke, namely a banker joke selected upon picking up an utterance addressed by the user to another person, the utterance containing the word "Bank".

Reference is now made to Fig. 129, which is a simplified flowchart in the context of Fig. 128, showing joke selection functionality and feedback reception functionality. Computer 9211 retrieves a joke. Selection of the joke is performed according to context, for example according to a keyword recognized in the user's speech, and according to user characteristics obtained previously by toy 9210, such as the user's reactions to jokes previously told by toy 9210. It is appreciated that computer 9211 receives batches of jokes according to the user profile from toy server 9212, and selects a joke from the batches according to context. Alternately, computer 9211 requests a joke from server 9212 and sends the server the requested context of the joke, the context designated by a keyword detected in the user's speech. Toy 9210 suggests telling a joke to the user. If the user agrees, toy 9210 tells the joke. Toy 9210 picks up the user's response to the joke and sends it to computer 9211. Computer 9211 analyzes the response and evaluates whether the user enjoyed the joke. It is appreciated that computer 9211 recognizes whether the user laughed at the joke. It is further appreciated that toy 9210 may ask the user for his opinion of the joke. Computer 9211 sends server 9212 an evaluation of the user's response to the joke. The evaluation is then added to the user profile in order to enhance the joke matching functionality of server 9212 in relation to the user. Server 9212 receives evaluations of user responses to jokes from a multiplicity of computers, including computers 9220 and 9221. The evaluations are then utilized in order to select jokes for users by means of pattern matching techniques known in the art.

Reference is now made to Fig. 130, which is a simplified table in the context of Fig. 129 showing a database record utilized in joke selection. The illustrated record shows the responses of a user to various jokes, the responses designated as negative, indifferent or positive. A multiplicity of records as illustrated enables server 9212 to select jokes according to user characteristics obtained by toys. By means of pattern matching techniques known in the art, server 9212 selects a joke for a user, such that other users who typically enjoy the same jokes as the user also enjoyed the selected joke. It is appreciated that a record as illustrated is enhanced by personal data such as age and gender, enabling better joke matching by server 9212.
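The pattern matching joke selection of Figs. 129 and 130 can be sketched as a simple collaborative filter; the agreement-count similarity measure and the -1/0/+1 coding of negative/indifferent/positive responses are illustrative assumptions, as the disclosure refers only generally to pattern matching techniques known in the art:

```python
# Hypothetical sketch of server 9212's joke matching: score each candidate
# joke by the opinions of other users, weighted by how closely each other
# user's recorded responses agree with this user's. All details beyond
# "pattern matching" are illustrative assumptions.

def recommend_joke(responses, user, candidates):
    """responses: {user_id: {joke_id: -1 | 0 | 1}}.

    Return the candidate joke best liked by the users most in agreement
    with `user` on previously shared jokes."""
    mine = responses[user]
    best, best_score = None, float("-inf")
    for joke in candidates:
        score = 0.0
        for other, theirs in responses.items():
            if other == user or joke not in theirs:
                continue
            # Similarity: +1 per shared joke rated identically, -1 otherwise.
            shared = set(mine) & set(theirs)
            agreement = sum(1 if mine[j] == theirs[j] else -1 for j in shared)
            # Weight the other user's opinion of this joke by that similarity.
            score += agreement * theirs[joke]
        if score > best_score:
            best, best_score = joke, score
    return best
```

Here users who disliked the same jokes as this user count just as much as users who liked the same ones, matching the record structure of Fig. 130.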

It is appreciated that the functionality of Figs. 128, 129 and 130 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and wherein the three-dimensional artificial person has a pattern of behavior associated with a comedian.

It is also appreciated that the functionality of Figs. 128, 129 and 130 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the three-dimensional artificial person has a pattern of behavior associated with a comedian; and wherein the persona is of a known comedian.

It is further appreciated that the functionality of Figs. 128, 129 and 130 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the three-dimensional artificial person has a pattern of behavior associated with a comedian; and wherein the three-dimensional artificial person is operative to provide personalized content to a user based on pre-acquired knowledge of at least one characteristic of the user obtained by the three-dimensional artificial person.

It is yet further appreciated that the functionality of Figs. 128, 129 and 130 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; wherein the three-dimensional artificial person has a pattern of behavior associated with a comedian; and wherein the three-dimensional artificial person is operative to provide at least similar content to a group of users and to receive from at least plural ones of the group of users feedback regarding the content.

An interactive persona system wherein a plurality of toys provides content to a user is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 131 which is a simplified pictorial illustration of a plurality of toys having persona providing content to a user in accordance with a preferred embodiment of the present invention. Turning to Fig. 131 it is seen that toy 9260, having a comedian persona, is adjacent to toy 9261, having a physician persona. Toys 9260 and 9261 sense propinquity via IR transceivers 9262 and 9263. As is seen in the figure, comedian toy 9260 tells physician toy 9261 a doctor joke.

Reference is now made to Fig. 132 which is a simplified flowchart of the content providing functionality of Fig. 131. Toy 9260 receives a unique IR signal from toy 9261. Toy 9260 sends the signal to computer 9270, thus enabling the computer to identify both toys and establish propinquity between them. Computer 9270 sends toy server 9271 the ID codes of toys 9260 and 9261. Server 9271 identifies toys 9260 and 9261 and selects content to be actuated coordinately by the toys. Content is selected based on the respective personae of toys 9260 and 9261; for example, a comedian toy tells a doctor joke to a physician toy. Server 9271 sends the content to computer 9270. Computer 9270 coordinates actuation of the content. Computer 9270 sends the first part of the content to the first toy. Upon receiving from a toy confirmation of the actuation of a previous part of the content, computer 9270 sends the next part of the content to the next toy, until actuation of the content is complete.
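The send-and-await-confirmation loop described above may be sketched, purely illustratively, as follows (the class, method names and joke text are assumptions for illustration, not part of the disclosed system):

```python
# Sketch of the coordinated actuation loop of Fig. 132: the computer sends
# each part of the content to its designated toy and waits for that toy's
# actuation confirmation before dispatching the next part to the next toy.

class Toy:
    """Minimal stand-in for a networked interactive toy."""
    def __init__(self, toy_id):
        self.toy_id = toy_id
        self.spoken = []

    def actuate(self, utterance):
        # In the real system this would play the utterance aloud.
        self.spoken.append(utterance)

    def confirm_actuation(self):
        # In the real system this arrives as a message back to the computer.
        return True

def actuate_content(parts, toys):
    """parts: list of (toy_id, utterance) pairs; toys: dict toy_id -> Toy."""
    log = []
    for toy_id, utterance in parts:
        toy = toys[toy_id]
        toy.actuate(utterance)
        if not toy.confirm_actuation():   # wait for confirmation
            break                          # stop if a toy fails to actuate
        log.append((toy_id, utterance))
    return log

toys = {"comedian": Toy("comedian"), "physician": Toy("physician")}
parts = [("comedian", "Doctor, doctor..."), ("physician", "Ha ha!")]
log = actuate_content(parts, toys)
```

The confirmation step is what makes the actuation coordinated: neither toy speaks over the other, because part n+1 is only released once part n is acknowledged.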

Reference is now made to Fig. 133 which is a simplified table in the context of Fig. 132 showing a database record utilized in content selection for a plurality of toys having persona. The illustrated record contains pointers to content items appropriate to various combinations of toys. Each field in the database correlates to a pair of toys having persona, such that item 1, for example, is appropriate to an encounter of a physician toy with a comedian toy, such as the encounter illustrated in Fig. 131. It is appreciated that selection of content items may also be based on the user profile and on context sensed by the toys.
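Such a record may be modeled, as an illustrative assumption only, as a lookup keyed by an unordered pair of personae, so that either direction of the encounter selects the same content item:

```python
# Hypothetical model of the database record of Fig. 133: each field maps a
# pair of toy personae to a pointer (here, a key) naming a content item
# suitable for an encounter between toys having those personae.

CONTENT_TABLE = {
    frozenset(["comedian", "physician"]): "item_1_doctor_joke",
    frozenset(["comedian", "teacher"]):   "item_2_school_joke",
}

def select_content(persona_a, persona_b):
    """Returns the content-item pointer for a pair of personae, or None."""
    # frozenset makes the pair unordered: (a, b) and (b, a) select the same item.
    return CONTENT_TABLE.get(frozenset([persona_a, persona_b]))
```

In the full system the selection would further be filtered by the user profile and sensed context, as the text notes.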

It is appreciated that the functionality of Figs. 131, 132 and 133 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and comprising a plurality of three dimensional artificial persons.

It is also appreciated that the functionality of Figs. 131, 132 and 133 taken together is particularly appropriate to an interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing the computer, the three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of the persona; and comprising a plurality of three dimensional artificial persons; and wherein the plurality of three dimensional artificial persons cooperate in providing content to at least one user.

An inter-toy communication system is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 134, which is a simplified partly pictorial partly schematic illustration of an inter-toy communication system comprising an interactive toy operative for interaction with a plurality of users in accordance with a preferred embodiment of the present invention. Turning to Fig. 134 it is seen that an interactive toy 10011 receives verbal input from a first user, which verbal input relates to a second user. Interactive toy 10011 then interacts with the second user in a way which is affected by the previous interaction with the first user. In the illustrated embodiment toy 10011 interacts based on instructions received from a computer 10012, which in turn communicates with a suitable server 10014 via Internet 10013. Thus, it may be appreciated that the verbal input of the first user may be processed and that the processed information derived from the verbal input may be utilized in the course of the interaction with the second user.

Reference is now made to Fig. 135, which is a simplified table in the context of Fig. 134, showing a database record 10016 of a user interaction. Turning to Fig. 135 it is seen that database record 10016 provides, for each name of a secondary user, the relationship of the secondary user with the primary user of a toy, an event associated with the secondary user, and the attitude of the secondary user to the event. Database record 10016 may be kept on a computer communicating with an interactive toy, such as computer 10012 in the case of toy 10011 of Fig. 134. The language processing required in order to input information into database record 10016 and/or in order to utilize the information may also be performed, at least partly, by a remote computer such as server 10014 of Fig. 134.
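A minimal sketch of such a record, with illustrative field names and an assumed greeting helper (neither is specified in the disclosure), might look as follows:

```python
# Hypothetical structure of database record 10016 of Fig. 135: for each
# secondary user it stores the relationship to the primary user, an
# associated event, and the secondary user's attitude toward that event.

record_10016 = {
    "Dan": {
        "relationship": "friend",
        "event": "team lost game",
        "attitude": "distressed",
    },
}

def greeting_for(name, record):
    """Chooses a greeting affected by what was previously told to the toy."""
    entry = record.get(name)
    if entry is not None and entry["attitude"] == "distressed":
        # Express regret over the emotionally distressing event.
        return f"Hello {name}, I am sorry about the {entry['event']}."
    return f"Hello {name}!"
```

This illustrates how the interaction with the second user (Dan) can be conditioned on information the first user supplied earlier.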

Reference is now made to Fig. 136, which is a simplified flowchart of the communication functionality of Fig. 134. Toy 10011 interacts with user 1, who may be the user normally interacting with toy 10011. Typically, toy 10011 recognizes user 1, for example, by means of a voiceprint. User 1 tells toy 10011 that another user named Dan is supposed to come and that the favorite team of that user lost a game. The verbal input of user 1 is processed by computer 10012, which communicates with toy 10011 by means of typically wireless communication therewith. Computer 10012 is typically operative to perform speech recognition as well as natural language processing. In the illustrated embodiment, computer 10012 recognizes that "Dan" is a name of a person, that the person is supposed to be present at the site, and that "team lost game" is an event associated with the person, which event is of emotionally distressing content to the user.

The name "Dan" may be retrieved from a database record of the user's friends kept on computer 10012 and/or provided by server 10014 via Internet 10013, in the case where Dan is a toy-user. The natural language processing of the verbal input of the user may be performed, for example, by means of artificial intelligence designed to identify persons and events associated therewith. Such context limiting typically improves the performance of artificial intelligence software. Heavy speech recognition and/or language processing tasks included in the functionality of Fig. 134 may be performed, if required, by server 10014, which provides backup computing power to computer 10012.

The processed information derived from the verbal input of user 1 is inputted into database record 10016, typically on computer 10012.

Thereafter, the user named Dan arrives at the site and interacts with toy 10011. This other user, who may be a user not normally interacting with toy 10011, is recognized by toy 10011, for example by his name, which computer 10012 retrieves from database record 10016. Computer 10012 also retrieves from database record 10016 that an event "team lost game" is of emotionally distressing content to the user named Dan. Computer 10012 provides content input to toy 10011, which includes expressing regret over the event.

It is appreciated that the functionality of Figs. 134, 135 and 136 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and an inter-toy communication system comprising: at least one interactive toy operative for interaction with a plurality of users, wherein the interaction of the at least one interactive toy with at least one of the plurality of users is affected by the interaction of the at least one interactive toy with another one of the plurality of users.

An inter-toy communication system in accordance with another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 137, which is a simplified partly pictorial partly schematic illustration of an inter-toy communication system comprising an interactive toy operative for interaction with a plurality of users in accordance with a preferred embodiment of the present invention. Turning to Fig. 137 it is seen that a first user interacting with an interactive toy 10021 at a site 10027 tells toy 10021 that a second user named "Dan" is supposed to arrive at site 10027. The second user normally interacts with another interactive toy 10022 at another site 10028. Interactive toy 10021 then interacts with the second user in a way which is based on toy 10021 knowing which user it is interacting with and characteristics of the second user. In the illustrated embodiment, toy 10021 interacts based on instructions received from a computer 10023, one of a plurality of computers, which communicates typically via the Internet with a suitable toy server 10025. The plurality of computers includes computer 10024, which is in typically wireless communication at site 10028 with toy 10022 normally interacting with the second user. Thus it may be appreciated that computer 10023 at site 10027 is operative to provide content input to toy 10021 based on knowing which user it is interacting with and characteristics of the user.

Reference is now made to Fig. 138, which is a simplified flowchart of the communication functionality of Fig. 137. Toy 10021 interacts with a first user who is the user normally interacting with toy 10021 at site 10027. The first user tells toy 10021 that a friend named Dan is supposed to arrive at site 10027. Computer 10023, in typically wireless communication with toy 10021, processes the verbal input received from the first user. Computer 10023 finds the name "Dan" on a list of friend users of the first user, and retrieves from a database record of friend users a unique identification number associated with the name "Dan". Based on the identification number retrieved from the database record of friend users, computer 10023 downloads from server 10025 a user-visit-file associated with a user named "Dan". A user-visit-file includes information about a toy-user, which information may be provided to server 10025 by a computer communicating with the toy of the user, such as computer 10024 communicating with toy 10022 of Fig. 137. A user-visit-file typically includes information which a user has declared not private, such as interests of the user, and preferably includes a voiceprint of the user. In the illustrated embodiment, the user-visit-file includes an indication that the user likes football.
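The two-step lookup described above (friend name to identification number, then identification number to user-visit-file on the server) may be sketched as follows; all names, numbers and file fields are illustrative assumptions:

```python
# Sketch of the user-visit-file retrieval of Fig. 138. The local computer
# resolves a spoken friend name to a unique identification number, then
# fetches the corresponding user-visit-file from the toy server.

FRIEND_USERS = {"Dan": 4711}   # database record of friend users: name -> ID

SERVER_VISIT_FILES = {          # server-side store, keyed by identification number
    4711: {"voiceprint": b"\x01\x02", "interests": ["football"]},
}

def download_visit_file(friend_name):
    """Returns the user-visit-file for a named friend, or None if unknown."""
    user_id = FRIEND_USERS.get(friend_name)
    if user_id is None:
        return None                      # name not on the friend-user list
    return SERVER_VISIT_FILES.get(user_id)

visit = download_visit_file("Dan")
```

The voiceprint field is what later lets toy 10021 recognize the visiting user, and the interests field is what the content input is based on.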

When the second user named Dan arrives at site 10027 and interacts with the first user and toy 10021, toy 10021 recognizes the second user by means of the voiceprint included in the user-visit-file downloaded from server 10025. Computer 10023 provides content input to toy 10021, which is based on the second user's interest in football, which is indicated in the downloaded user-visit-file.

It is appreciated that the functionality of Figs. 137 and 138 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and an inter-toy communication system comprising: at least one interactive toy operative for interaction with a plurality of users, wherein the interaction of the at least one interactive toy with at least two of the plurality of users is dependent on knowledge of the toy of which user it is interacting with and characteristics of the user known to the toy.

An inter-toy communication system in accordance with yet another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 139, which is a simplified partly pictorial partly schematic illustration of an inter-toy communication system comprising a plurality of interactive toys operative for interaction with at least one user in accordance with a preferred embodiment of the present invention. Turning to Fig. 139 it is seen that a user interacting with an interactive toy 10032 at site 10038 tells toy 10032 that a team named "Wild Chicken" lost a game and that the user is going to visit a friend named Billy. When the user later arrives at another site 10037 and interacts with another interactive toy 10031, toy 10031 interacts with the user in a way which is affected by the previous interaction between the user and interactive toy 10032. In the illustrated embodiment, toy 10031 interacts based on instructions received from a computer 10033, one of a plurality of computers, which communicate typically via the Internet with a suitable toy server 10035. The plurality of computers includes computer 10034, which is in typically wireless communication at site 10038 with toy 10032 normally interacting with the user. Thus it may be appreciated that the interaction of toy 10031 with the user may be affected by the previous interaction between toy 10032 and the user.

Reference is now made to Figs. 140A and 140B which taken together are a flowchart of the communication functionality of Fig. 139. A user interacting with toy 10032 tells toy 10032 that a team named "Wild Chicken" lost a game. Toy 10032 is in typically wireless communication with a computer 10034, which is operative to perform speech recognition and natural language processing of verbal input received from the user. Based on a database record of user information, computer 10034 recognizes "Wild Chicken" as the favorite team of the user and the phrase "lost the game" as an event of emotionally distressing content to the user. Computer 10034 registers the event in a database record of user information and provides content input to toy 10032, which content input includes regret over the event.

The user then tells toy 10032 that the user is going to visit a friend named Billy. Computer 10034 finds the name "Billy" on a list of friend users of the first user, and retrieves from a database record of friend users a unique identification number associated with the name "Billy". Based on the identification number retrieved from the database record of friend users, computer 10034 communicates to server 10035 that the user is going to visit another user normally interacting with an interactive toy 10031 at site 10037. Computer 10034 may also report to server 10035 the recent event of emotionally distressing content to the user, which event server 10035 may then register in the visit file of the user. Server 10035 downloads to computer 10033 a visit file of the user including a voiceprint of the user and an indication of the recent event of emotionally distressing content to the user. When the user arrives at site 10037 and interacts with the second user and with toy 10031, toy 10031 recognizes the user by means of the voiceprint included in the downloaded visit file and provides to the user content input which includes regret over the event of emotionally distressing content to the user, an indication of which event is also included in the downloaded visit file.

It is appreciated that the functionality of Figs. 139, 140A and 140B is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and an inter-toy communication system comprising: a plurality of interactive toys operative for interaction with at least one user, wherein the interaction of one of the plurality of interactive toys with the at least one user is affected by the interaction of another of the plurality of toys with the at least one user.

A multi-toy communication system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 141, which is a simplified partly pictorial partly schematic illustration of a multi-toy communication system in accordance with a preferred embodiment of the present invention. Turning to Fig. 141 it is seen that an interactive toy 10501 turns its user's attention to the presence of the user of another interactive toy 10502, which in turn interacts with its own user based on the user's location. In the illustrated embodiment the toys 10501 and 10502 communicate by means of RF modules 10503 and 10504 respectively on the toys 10501 and 10502, which RF modules are preferably low-power, short-range RF modules such as the module included in Bluetooth-capable devices. Toy 10501 also communicates with a public wireless communication network via antenna 10505, which public wireless communication network provides in turn communication with a suitable toy server 10507, typically via the Internet. Thus it may be appreciated that toy 10502 may receive content input from server 10507, with which it communicates via toy 10501.

Reference is now made to Fig. 142, which is a simplified flowchart of the communication functionality of Fig. 141. Toy 10501 communicates with server 10507 via a public wireless communication network, such as a cellular communication network with cellular antenna 10505. Toy 10501 and toy 10502 respectively include RF modules 10503 and 10504, which are low-power, short-range RF modules. When RF module 10504 on toy 10502 enters the range of RF module 10503 on toy 10501, wireless communication is established between the RF modules. RF module 10504 communicates its unique number to RF module 10503, which number is then communicated by toy 10501 to server 10507. Based on the unique number of RF module 10504, server 10507 retrieves the toy ID of toy 10502 from a database record, which provides for each RF module number the toy ID associated therewith. Based on the retrieved toy ID of toy 10502, server 10507 checks whether the respective users of toys 10501 and 10502 are acquaintances. If the users are acquaintances, server 10507 instructs toy 10501 to turn its user's attention to the presence of the user of toy 10502, including reference to the name of that user. Server 10507 also instructs toy 10501 to communicate non-verbally with toy 10502, thereby instructing toy 10502 to provide a verbal message to its user including reference to the location of the user, which location may be known, for example, based on the cellular antenna 10505 via which toy 10501 communicates with the cellular communication network. If the users are not acquaintances, server 10507 only instructs toy 10501 to communicate non-verbally with toy 10502 in order to provide a message to the user of toy 10502. It is appreciated that toy 10502 may interact with its user based on stand-alone features, and may receive content input from server 10507 as long as it is within the range of RF module 10503 on toy 10501 and/or any similar RF module on any similar toy.
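The server-side branch of this flow may be sketched as follows; the module numbers, toy IDs and instruction names are illustrative assumptions, not part of the disclosure:

```python
# Sketch of the server-side lookup of Fig. 142: the server resolves the
# reported RF module number to a toy ID, then checks whether the two
# toys' users are acquaintances before deciding which instructions to send.

RF_TO_TOY = {                       # database record: RF module number -> toy ID
    9001: "toy_10501",
    9002: "toy_10502",
    9003: "toy_10599",
}

ACQUAINTANCES = {frozenset(["toy_10501", "toy_10502"])}

def handle_rf_contact(reporting_toy, rf_module_number):
    """Returns the instructions the server sends when one toy reports
    wireless contact with another toy's RF module."""
    other_toy = RF_TO_TOY[rf_module_number]
    if frozenset([reporting_toy, other_toy]) in ACQUAINTANCES:
        # Users know each other: announce presence and relay a location message.
        return ["announce_presence", "relay_location_message"]
    # Users are not acquaintances: only relay a message non-verbally.
    return ["relay_message_only"]
```

Here toy 10501 reports the module number of toy 10502, and the acquaintance check decides whether the announcement is made by name.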

It is appreciated that the functionality of Figs. 141 and 142 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, and a multi-toy communication system comprising: at least one first interactive toy operative for communication with the computer network; and at least one second interactive toy operative for communication with the computer network via the at least one first interactive toy.

An interactive toy system comprising propinquity sensing and toy voice recognition functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 143 which is a simplified pictorial illustration of an interactive toy system comprising propinquity sensing and toy voice recognition functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 143 it is seen that toys 10600 and 10601 are in propinquity with each other. Such propinquity is detected using GPS devices 10602 and 10603 on the toys. Toy server 10612 tracks the locations of the toys, which the toys report to it via the Internet via public wireless communication network 10611. Server 10612 thus detects propinquity between toys 10600 and 10601. Notwithstanding the propinquity, the two toys 10600 and 10601 have no visual contact with each other. Server 10612 detects the lack of visual contact by means of IR transceivers 10604 and 10605 on the toys. Server 10612 thus orders toy 10601 to audibly announce toy 10600. If toy 10600 detects the audible announcement, server 10612 orders toy 10600 to respond in another audible announcement, thus enabling the users to approach each other.

Reference is now made to Fig. 144 which is a simplified flowchart of the propinquity sensing and toy voice recognition functionality of Fig. 143. Server 10612 arranges a meeting between the two toys 10600 and 10601, for example after the two users have scheduled a meeting. The toys utilize GPS devices to report their locations to server 10612. Server 10612 calculates the distance between the toys. If the distance is lower than a defined distance, such as 20 meters, the server proceeds to check the propinquity of toys 10600 and 10601 by more refined measures. Server 10612 checks whether toys 10600 and 10601 are in visual contact with each other. Server 10612 orders toy 10600 to transmit a unique IR signal and then queries toy 10601 whether the signal was received by it. The server may send toy 10601 the IR signal via the Internet, thus enabling the toy to identify the signal when received. Alternately, toy 10600 transmits a generic toy system signal, enabling toy 10601 to identify the signal without further information. If the IR signal was not received by toy 10601, server 10612 proceeds to detect propinquity by toy voice recognition. Server 10612 orders toy 10601 to audibly announce toy 10600, typically using the name of toy 10600. Server 10612 sends toy 10600 information regarding the announcement of toy 10601, thus enabling toy 10600 to determine whether the announcement is detected. Information regarding the announcement includes information regarding the voice of toy 10601, such as the frequencies of the toy's voice. Server 10612 may also send toy 10600 a sound file containing complete information regarding the announcement. Alternately, toy 10601 may embed an ultrasound signal in the announcement, enabling toy 10600 to identify the announcement. Still alternately, identification of the announcement may be performed by server 10612 after receiving sound picked up by toy 10600. If the audible announcement of toy 10601 was picked up by toy 10600, server 10612 orders toy 10600 to respond in another audible announcement.
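The first, coarse step of this flow, comparing the distance between the two GPS fixes against the defined threshold, may be sketched with the standard haversine great-circle formula (the coordinates and the 20-meter threshold below are illustrative; the disclosure does not specify the distance computation):

```python
# Sketch of the "distance lower than a defined distance" test of Fig. 144,
# computed from the GPS fixes the toys report to the server.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (degrees)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

PROPINQUITY_THRESHOLD_M = 20.0  # the "defined distance" of the text

def in_propinquity(fix_a, fix_b):
    """True if the two toys are within the defined distance of each other."""
    return haversine_m(*fix_a, *fix_b) < PROPINQUITY_THRESHOLD_M
```

Only when this coarse test passes does the server move on to the finer IR and voice-recognition checks described in the text.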

It is appreciated that a toy may also apply an RF device in order to detect propinquity to another toy. For example, toys may include low-power, short-range RF modules such as the module included in Bluetooth-capable devices.

It is appreciated that the functionality of Figs. 143 and 144 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network and providing a multi-toy location system comprising: location functionality operative to sense at least predetermined propinquity between at least two of the plurality of interactive toys.

It is also appreciated that the functionality of Figs. 143 and 144 taken together is particularly appropriate to a multi-toy location system as the aforementioned and also comprising: propinquity notification functionality operative in response to an output from the location functionality indicating the sensed at least predetermined propinquity for notifying at least one of the at least two of the plurality of interactive toys of at least the predetermined propinquity.

It is further appreciated that the functionality of Figs. 143 and 144 taken together is particularly appropriate to a multi-toy location system as the aforementioned and wherein the location functionality includes toy voice recognition functionality.

An interactive toy system comprising toy communication and toy recognition functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 145 which is a simplified pictorial illustration of communication establishing functionality of a computer and a toy which is not normally in communication therewith in accordance with a preferred embodiment of the present invention. Turning to Fig. 145 it is seen that a user visits a friend-user, bringing with him a visiting toy 11001. Computer 11002 identifies visiting toy 11001 and establishes communication with it, thus enabling home toy 11000 and visiting toy 11001 to interact. Computer 11002 orders home toy 11000 to welcome visiting toy 11001 and then orders visiting toy 11001 to answer the welcome. Computer 11002 retrieves information regarding visiting toy 11001, such as the toy's name, from server 11004. Alternately, the information is stored in a local database in computer 11002, elaborating details regarding toys with which computer 11002 is authorized to communicate.

Reference is now made to Fig. 146 which is a simplified block diagram illustration of the communication functionality of Fig. 145. Computer 11002 communicates with the toys via base unit 11003. The base unit includes two RF receivers 11004 and 11005 and an RF transmitter 11006. Transmitter 11006 transmits RF signals to toys 11000 and 11001 in two different frequencies, RF1 and RF2. First receiver 11004 receives RF signals in first frequency RF1 from home toy 11000 and second receiver 11005 receives RF signals in second frequency RF2 from visiting toy 11001. Alternately, the base unit includes one receiver operative to switch between frequencies. Receivers on the toys and in base unit 11003 are operative to switch between frequencies.

Reference is now made to Fig. 147 which is a simplified flowchart of the identification and communication establishing functionality of Fig. 146. Computer 11002 communicates with home toy 11000 via transmitter 11006 in base unit 11003 in frequency RF1, dedicated to communication with the toy. Visiting toy 11001 transmits an arrival signal in frequency RF2, which is a generic visiting toy frequency dedicated to establishing communication with visiting toys, the arrival signal including a unique ID code of visiting toy 11001. It is appreciated that transmission of the signal may be turned on and off by the user when switching a toy between home and visiting modes, such as when a user leaves home. When visiting toy 11001 comes into communication propinquity with computer 11002, the computer receives the arrival signal via receiver 11005. Computer 11002 identifies toy 11001 by the ID code in the arrival signal. Computer 11002 checks the authorization status for toy 11001. If computer 11002 is authorized to communicate with visiting toy 11001, computer 11002 selects an available frequency RF3 for communication with visiting toy 11001. Computer 11002 transmits a frequency hop signal in RF2, ordering visiting toy 11001 to switch to frequency RF3 and including the unique toy ID code. Visiting toy 11001 receives the signal from computer 11002 and identifies the ID code in the signal. Visiting toy 11001 switches its receiver and transmitter to the new frequency RF3, and transmits an acknowledgement signal to computer 11002 in RF3. Computer 11002 now communicates with home toy 11000 in RF1 and with visiting toy 11001 in RF3, thus clearing generic visiting toy frequency RF2 for a possible arrival of another visiting toy. It is appreciated that if the number of toys with which computer 11002 communicates exceeds the number of RF frequencies base unit 11003 may receive simultaneously, two or more toys will share the same frequency. In such a case computer 11002 transmits toy signals comprising ID codes, thus enabling the toys to distinguish between signals addressed to them and signals addressed to other toys communicating in the same frequency. Computer 11002 also allocates time segments wherein each of the toys may transmit in the frequency. The computer further allocates a time segment wherein a new visiting toy may transmit.
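The frequency-assignment bookkeeping of this flow may be sketched as follows; the frequency names beyond RF1-RF3 and the toy ID code are illustrative assumptions:

```python
# Sketch of the frequency management of Fig. 147: RF1 stays dedicated to
# the home toy, RF2 stays clear as the generic visiting-toy arrival
# frequency, and each authorized visitor is hopped to the next free frequency.

HOME_FREQ = "RF1"
VISITING_FREQ = "RF2"           # generic arrival channel, always kept clear
FREE_FREQS = ["RF3", "RF4"]     # available frequencies for visiting toys

AUTHORIZED = {"TOY-11001"}      # unique ID codes the computer may talk to

assignments = {"home": HOME_FREQ}

def on_arrival_signal(toy_id):
    """Handles an arrival signal heard on RF2. Returns the frequency-hop
    instruction for an authorized arriving toy, or None otherwise."""
    if toy_id not in AUTHORIZED or not FREE_FREQS:
        return None                      # not authorized, or no frequency free
    freq = FREE_FREQS.pop(0)             # select an available frequency
    assignments[toy_id] = freq
    return {"toy_id": toy_id, "hop_to": freq}

hop = on_arrival_signal("TOY-11001")
```

After the hop, RF2 is free again for the next visiting toy's arrival signal, exactly as the text describes.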

Reference is now made to Fig. 148 which is a simplified table in the context of Fig. 147 showing a database record that enables a user to authorize a computer to communicate with a visiting toy. As is seen, the database record specifies unique ID codes of toys that computer 11002 is authorized to communicate with. It is appreciated that the codes also enable computer 11002 to retrieve from server 11004 information regarding the toys, such as the toys' names. Alternately, the information is stored in computer 11002.

It is appreciated that the functionality of Figs. 145, 146, 147 and 148 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys, at least one of which being normally in interactive communication via a computer with a computer network, the computer including a toy communication functionality comprising: a toy recognition functionality enabling the computer to recognize the identity of a toy which is not normally in interactive communication therewith, when the toy comes into communication propinquity therewith; and a communication establishing functionality operative, following recognition of the identity of a toy which is not normally in interactive communication therewith, when the toy comes into communication propinquity therewith, for establishing interactive communication therewith.

It is also appreciated that the functionality of Figs. 145, 146, 147 and 148 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the communication establishing functionality is operative in response to an authorization received from a user of the at least one toy which is normally in interactive communication with the computer network via the computer.

An interactive toy system providing a multi-toy coordinated activity system is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 149 which is a simplified pictorial illustration of a coordinated activity functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 149 it is seen that two toys 12000 and 12002 sense propinquity to each other when two users encounter each other on the street. Toys 12000 and 12002 sense the propinquity by IR transceivers 12001 and 12003 on them. Toy server 12006 is notified of the sensed propinquity via the Internet via public wireless communication network 12005. Server 12006 identifies toys 12000 and 12002 and selects toy content appropriate to the users of the toys, in the illustrated embodiment the content being a dialogue from Alice in Wonderland by Lewis Carroll. Server 12006 sends the content to toys 12000 and 12002. Toys 12000 and 12002 actuate the content coordinately.

Reference is now made to Fig. 150 which is a simplified flowchart of the coordinated activity functionality of Fig. 149. Toy 12000 transmits a unique IR signal via IR transceiver 12001. Toy 12002 detects the signal via IR transceiver 12003. Toy 12002 sends the detected IR signal to toy server 12006. Server 12006 identifies toy 12000 by the unique IR signal, thus establishing that toys 12000 and 12002 are in propinquity with each other. Server 12006 selects toy content appropriate to the users of toys 12000 and 12002. The content is selected based on user profiles and history, for example by shared likes and dislikes of the users. Server 12006 then sends toys 12000 and 12002 the content and the toys actuate the content coordinately.

Reference is now made to Fig. 151 which is a simplified flowchart of the activity coordination functionality of Fig. 150. Server 12006 sends first toy 12000 first part of content. Toy 12000 actuates the content. Toy 12000 sends server 12006 a signal acknowledging completion of actuation of the content. Server 12006 then sends second toy 12002 a second part of content. Second toy 12002 actuates the content and sends server 12006 a signal acknowledging actuation of content. Alternately, coordination of activity is managed locally by toys 12000 and 12002. For example, toy 12000 receives from server 12006 toy content for itself and for second toy 12002, and relays timely content to second toy 12002 (such as toy 12000 verbalizes an utterance and then sends toy 12002 a response to the utterance, to be verbalized by toy 12002). Still alternately, server 12006 may send toys 12000 and 12002 their respective content parts together with conditions for actuation of any part of the content, for example an utterance to be verbalized after certain words are picked up or after a certain signal (such as IR or ultrasound) is detected, the signal attached to content verbalized by the other toy.
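The server-managed alternative of Fig. 151, in which each content part is dispatched only after the previous toy acknowledges completed actuation, can be sketched as follows; the class and variable names and the sample dialogue are illustrative assumptions.

```python
# Hypothetical sketch of the turn-taking of Fig. 151: the server sends each
# content part in order and waits for an acknowledgment before proceeding.

class Toy:
    def __init__(self, toy_id):
        self.toy_id = toy_id
        self.spoken = []

    def actuate(self, utterance):
        """Actuate one content part and acknowledge completion to the server."""
        self.spoken.append(utterance)
        return "ack"

def coordinate(server_script, toys):
    """Dispatch (toy_id, content) pairs in order, gated on acknowledgments."""
    for toy_id, part in server_script:
        ack = toys[toy_id].actuate(part)
        assert ack == "ack"  # do not send the next part until actuation completes

toys = {12000: Toy(12000), 12002: Toy(12002)}
script = [(12000, "Who are you?"), (12002, "I hardly know, sir.")]
coordinate(script, toys)
```

The locally managed alternative would differ only in where `coordinate` runs: on toy 12000's side rather than on the server.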

It is appreciated that the functionality of Figs. 149, 150 and 151 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a multi-toy coordinated activity system comprising a plurality of interactive toys operative for interaction via a computer network; and a coordinated activity functionality operative via the computer network to cause the plurality of interactive toys to coordinate their actions in a coordinated activity.

An interactive toy system providing a multi-toy coordinated activity system comprising coordinated activity of toys located at disparate locations is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 152 which is a simplified pictorial illustration of a multi-toy coordinated activity system comprising coordinated activity over disparate locations and toy communication which is not in real time in accordance with a preferred embodiment of the present invention. Turning to Fig. 152 it is seen that three users are playing a game of hide and seek accompanied by toys 12050, 12052 and 12054. The toys are equipped with GPS devices 12051, 12053 and 12055 respectively, enabling the toys to report their locations to toy server 12062 via the Internet via public wireless communication network 12061. Server 12062 detects that toy 12052 is nearing toy 12050. Toy 12052 tells its user that the user of toy 12050, who is hiding from the other users in the game, is near. Toy 12050 warns its user that another user is approaching him. At the same time, server 12062 detects that a third user, namely the user of toy 12054, is located at a place where the user of toy 12050 was a few minutes before. Toy 12054 tells its user that the user he is looking for was where he is now five minutes ago.

Reference is now made to Fig. 153 which is a simplified flowchart of the coordination and communication functionality of Fig. 152. A multiplicity of users, including the users of toys 12052 and 12054, are searching for the user of toy 12050. The toys report to server 12062 their locations detected by the GPS devices. Server 12062 stores the locations of toy 12050 in a database 12070 including time indexes indicating the times when toy 12050 was at different locations. Server 12062 calculates the distances of the toys to toy 12050, and then checks for each toy whether the distance is shorter than a defined distance such as 50 meters. If the distance is shorter than 50 meters, server 12062 sends messages to toy 12050 and to the toy reported to be near it. Toy 12050 warns its user that another user is approaching. The approaching toy tells its user that he is approaching the user of toy 12050. For each toy, if the toy is not reported to be near toy 12050, server 12062 further checks whether it is located at a place where toy 12050 was in the last 10 minutes. If server 12062 detects that a toy is located in such a location, the toy informs its user that the user of toy 12050 was where he is now a few minutes ago.

It is appreciated that the functionality of Figs. 152 and 153 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a multi-toy coordinated activity system comprising: a plurality of interactive toys operative for interaction via a computer network; and a coordinated activity functionality operative via the computer network to cause the plurality of interactive toys to coordinate their actions in a coordinated activity; and wherein the plurality of interactive toys are located at disparate locations.
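The distance and location-history checks of Fig. 153 can be sketched as follows, assuming GPS fixes are (latitude, longitude) pairs and the history timestamps are in seconds; the helper names and thresholds follow the 50-meter and 10-minute figures given in the text, but the implementation is otherwise an invented illustration.

```python
# Hypothetical sketch of the Fig. 153 server checks: proximity within 50
# meters, and a "was here in the last 10 minutes" lookup against the stored
# location history of the hiding toy.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_near(seeker, hider, threshold_m=50.0):
    """True if the seeking toy is within the defined distance of the hider."""
    return distance_m(*seeker, *hider) < threshold_m

def was_here_recently(history, position, now, window_s=600, radius_m=50.0):
    """history: (timestamp, (lat, lon)) entries stored for the hiding toy."""
    return any(now - t <= window_s and distance_m(*pos, *position) < radius_m
               for t, pos in history)
```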

It is also appreciated that the functionality of Figs. 152 and 153 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a multi-toy coordinated activity system comprising: a plurality of interactive toys operative for interaction via a computer network; and a coordinated activity functionality operative via the computer network to cause the plurality of interactive toys to coordinate their actions in a coordinated activity; and wherein the coordinated activity functionality causes the plurality of interactive toys to communicate with each other at least partially not in real time.

A communication system providing communication between at least one of multiple toys and at least one toy and at least one user is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 154, which is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with a preferred embodiment of the present invention. Turning to Fig. 154 it is seen that at a site 13007 a user provides a message to another user by means of verbal input to an interactive toy 13001. At a later stage at site 13008 the second user receives via another interactive toy 13002 the message provided by the first user. In the illustrated embodiment toys 13001 and 13002 respectively communicate with computers 13003 and 13004, which in turn communicate with a suitable toy server 13005. Thus it may be appreciated that a message received by toy 13001 at site 13007 may be provided to a user via toy 13002 at site 13008.

Reference is now made to Fig. 155, which is a simplified flowchart of the communications functionality of Fig. 154. A first user tells toy 13001 that the first user has a message for a second user named Dan. Computer 13003 in communication with toy 13001 recognizes the first user's request and retrieves the required information associated with the name "Dan" from a database record of friend users, which record typically provides for each user name a toy ID of a toy normally interacting with that user. Computer 13003 instructs toy 13001 to indicate to the first user that the first user may provide the message. The first user provides the message by means of verbal input to toy 13001. Computer 13003 communicates to server 13005, typically via the Internet, a message package comprising a wave-file of the first user's message and the toy ID of the toy of the second user. According to the communicated toy ID, server 13005 sends to computer 13004 a message package comprising the wave-file of the first user and additional information such as the first user's name. Upon sensing that toy 13002 interacts with the second user, computer 13004 instructs toy 13002 to inform the second user that the second user has received a message from the first user. Computer 13004 then provides the message to the second user via toy 13002 using the voice of the first user.
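The message-package routing of Fig. 155 can be sketched as follows; the friend database, class names, and sample payload are all invented for the illustration.

```python
# Hypothetical sketch of the Fig. 155 relay: a recorded voice message is
# wrapped with the destination toy ID and routed by the toy server to the
# computer serving that toy.

FRIENDS = {"Dan": 13002}  # friend name -> toy ID normally interacting with him

class ToyServer:
    def __init__(self):
        self.inboxes = {}  # toy ID -> pending message packages

    def route(self, package):
        """File the package under the destination toy's ID for download."""
        self.inboxes.setdefault(package["to_toy"], []).append(package)

def send_voice_message(server, sender, friend_name, wave_file):
    """Build the message package of Fig. 155 and hand it to the server."""
    package = {"to_toy": FRIENDS[friend_name],
               "from_user": sender,
               "wave_file": wave_file}  # replayed later in the sender's voice
    server.route(package)

server = ToyServer()
send_voice_message(server, "Gil", "Dan", b"RIFF...hello-dan")
```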

It is appreciated that the functionality of Figs. 154 and 155 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time.

A communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 156, which is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with a preferred embodiment of the present invention. Turning to Fig. 156 it is seen that at a site 13017, a first user requests that an interactive toy 13011 deliver a message to a second user, providing the name of the second user and the content of the message to be delivered thereto. At a later stage at another site 13018, another interactive toy 13012 informs the second user of the content of the message provided by the first user. In the illustrated embodiment toys 13011 and 13012 respectively communicate with computers 13013 and 13014, which in turn communicate typically via the Internet with a suitable toy server 13015. Thus it may be appreciated that a message received by toy 13011 at site 13017 may be provided to a user via toy 13012 at site 13018.

Reference is now made to Fig. 157, which is a simplified flowchart of the communications functionality of Fig. 156. A first user tells toy 13011 that the first user wishes to deliver a message to a second user named Dan, providing the content of the message to be delivered to the second user. Computer 13013, in typically wireless communication with toy 13011, recognizes the first user's request and instructs toy 13011 to verbally so inform the user. Computer 13013 converts the message content part of the first user's request into text. For example, the message is converted into reported speech, the pronoun I being replaced by the first user's name. Based on the name of the second user, computer 13013 retrieves from a database record of friend users an email address of the second user and sends the converted text message to the email address. The text message is received by toy server 13015, which downloads it to computer 13014 based on the email address of the second user. Computer 13014 converts the text message into a voice characteristic of toy 13012 and instructs toy 13012 to verbalize the message to the second user. Alternately, computer 13014 instructs toy 13012 to verbally inform the second user that a message from the first user has been delivered via toy 13011. Then, computer 13014 converts the text message into a voice characteristic of toy 13011 and instructs toy 13012 to verbalize the message to the second user.
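The reported-speech conversion described above, in which the pronoun I is replaced by the first user's name, can be sketched as follows; extending the substitution to "my" is an added assumption for the example, not something the text specifies.

```python
# Minimal sketch of the reported-speech conversion of Fig. 157. Only the
# pronoun substitution is shown; real reported speech would also need verb
# and tense handling, which is out of scope here.
import re

def to_reported_speech(message, sender_name):
    text = re.sub(r"\bI\b", sender_name, message)          # "I" -> sender's name
    text = re.sub(r"\bmy\b", sender_name + "'s", text)     # assumed extension
    return text

print(to_reported_speech("I lost my ball", "Gil"))
```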

Reference is now made to Fig. 158, which is a simplified flowchart in the context of Fig. 156 showing another communications functionality of the communication system of Fig. 156, which communications functionality enables transmission of movement instructions. A first user tells toy 13011 that the first user wishes to deliver a message to a second user named Dan, providing the content of the message to be delivered to the second user. Computer 13013, in typically wireless communication with toy 13011, recognizes the first user's request and instructs toy 13011 to suggest to the first user that the first user might wish to attach movement instructions to the message for the second user. The first user says yes. Computer 13013 converts the message content part of the first user's request into text. For example, the message is converted into reported speech, the pronoun I being replaced by the first user's name. Then, based on the name of the second user, computer 13013 retrieves from a database record of friend users an email address of the second user and the toy ID of toy 13012 of the second user. Based on the toy ID of toy 13012 of the second user, computer 13013 retrieves from server 13015 a file of toy motion features appropriate to the toy type of toy 13012. Based on the downloaded file of motion features, computer 13013 instructs toy 13011 to suggest to the first user the types of movement instructions that may be attached to a message to be sent via toy 13012. The first user chooses a set of motions, possibly to be synchronized with voices typical of toy 13012. Computer 13013 produces a motion file of synchronized motion and sound, attaches the motion file to the previously converted text email message and sends the message to the previously retrieved email address of the second user. The message is received by toy server 13015, which downloads it to computer 13014 based on the email address of the second user. Computer 13014 converts the text message into a voice characteristic of toy 13012 and instructs toy 13012 to perform the required motions and to verbalize the message to the second user.

It is appreciated that the functionality of Figs. 156 and 157 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time, which communications functionality includes a text message to voice conversion functionality.

It is also appreciated that the functionality of Figs. 156 and 157 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time, which communications functionality includes a message to voice conversion functionality, which provides a vocal output having characteristics of at least one of the plurality of interactive toys.

It is further appreciated that the functionality of Figs. 156 and 157 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time, which communications functionality includes an e-mail communication functionality.

It is appreciated that the functionality of Figs. 156 and 158 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time, which communications functionality includes an e-mail communication functionality; and wherein the e-mail communication functionality enables transmission of movement instructions to at least one of the plurality of interactive toys.

A communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with yet another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 159, which is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one toy and at least one user in accordance with a preferred embodiment of the present invention. Turning to Fig. 159 it is seen that at a site 13027, a first user requests that an interactive toy 13021 deliver a message to a second user, providing the name of the second user and the content of the message to be delivered thereto. At a later stage at another site 13028, another interactive toy 13022 informs the second user of the content of the message provided by the first user. In the illustrated embodiment toys 13021 and 13022 respectively communicate with computers 13023 and 13024, which in turn communicate typically via the Internet with a suitable toy server 13025. Thus it may be appreciated that a message received by toy 13021 at site 13027 may be provided to a user via toy 13022 at site 13028.

Reference is now made to Fig. 160, which is a simplified flowchart of the communications functionality of Fig. 159. A first user tells toy 13021 that the first user wishes to deliver a message to a second user named Dan, providing the content of the message to be delivered to the second user. Computer 13023, in typically wireless communication with toy 13021, recognizes the first user's request and instructs toy 13021 to verbally so inform the user. Computer 13023 converts the message content part of the first user's request into text. For example, the message is converted into reported speech, the pronoun I being replaced by the first user's name. Based on the name of the second user, computer 13023 retrieves from a database record of friend users an email address associated with toy 13022 of the second user and sends the converted text message to the email address. The text message is received by toy server 13025, which downloads it to computer 13024 based on the email address of toy 13022. Computer 13024 instructs toy 13022 to inform the second user that toy 13022 has received an email message from toy 13021 of the first user, providing the name of the first user and suggesting that the second user might wish to listen to the content of the message. If the second user says "yes", computer 13024 converts the text message into a voice characteristic of toy 13022 and instructs toy 13022 to verbalize the message to the second user. If the user says "no", computer 13024 sends an email message to toy 13021 of the first user informing the first user via toy 13021 that the second user did not wish to listen to the email message sent by the first user.

It is appreciated that the functionality of Figs. 159 and 160 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least some of the plurality of interactive toys to communicate with each other at least partially not in real time; and wherein at least some of the plurality of interactive toys has an e-mail address which is distinct from that of a user thereof.

An interactive toy system comprising a communication system providing communication between a toy and a user via a telephone link is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 161 which is a simplified pictorial illustration of user and toy telephone communication functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 161 it is seen that a user speaks with a toy 13504 via telephone 13500. The user's speech is converted to text and communicated to home computer 13503 via the Internet via toy server 13502 and via interactive voice response computer 13501. Home computer 13503, which is normally in communication with toy 13504, generates a response in the typical voice of toy 13504. Such response includes content typical of toy 13504, and may include information obtained via prior interaction of toy 13504 with the user.

Reference is now made to Fig. 162 which is a simplified flowchart of the communication functionality of Fig. 161. The user phones interactive voice response computer 13501. After the phone call is answered, the user dials a code of toy 13504. Computer 13501 identifies the toy requested by the code. It is appreciated that the user also dials a password enabling him to access toy 13504. Alternately computer 13501 uses standard voice recognition technology in order to authenticate the user's identity by a verbal annunciation of the user. Computer 13501 retrieves a phone call response of toy 13504 from computer 13503 via toy server 13502 via the Internet. The response, such as "Hello", is verbalized in the typical voice of toy 13504. The user speaks with toy 13504. Computer 13501 converts the user's speech to text, and sends it to home computer 13503 via the Internet via toy server 13502. Home computer 13503 generates a response of toy 13504 in the typical voice of toy 13504 and sends the response to computer 13501, the response comprising content typical of toy 13504.
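The code-and-password dispatch at the start of the Fig. 162 call can be sketched as follows; the toy codes, passwords, and greeting are invented data for the illustration.

```python
# Hypothetical sketch of the IVR dispatch of Fig. 162: a dialed toy code plus
# password selects the toy whose greeting is then verbalized to the caller.

TOYS = {"4711": {"toy_id": 13504, "password": "1234", "greeting": "Hello"}}

def answer_call(dialed_code, dialed_password):
    """Identify the requested toy by its code and check the dialed password."""
    toy = TOYS.get(dialed_code)
    if toy is None or toy["password"] != dialed_password:
        return None               # unknown code or wrong password: access denied
    return toy["greeting"]        # verbalized in the toy's typical voice
```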

It is appreciated that toy 13504 may also call a user over a phone when the toy knows where the user is. For example, if a toy schedule, such as described hereinabove, details the location of a user at a specific time, a toy may call the user at that location and remind him of his next schedule item.

In another preferred embodiment of the present invention server 13502 generates a typical response of toy 13504, thus enabling communication between the user and toy 13504 when home computer 13503 is not connected to the Internet.

Reference is now made to Fig. 163, which is a simplified flowchart in the context of Fig. 161 showing another communication functionality of the communication system of Fig. 161, which communication functionality enables a user to provide instructions to an interactive toy to carry out physical functions. The user phones interactive voice response computer 13501. After the phone call is answered, the user dials the code of toy 13504. Computer 13501 identifies the toy requested by the code. It is appreciated that the user also dials a password enabling him to access toy 13504. An answering machine on computer 13501 instructs the user how to provide a message to be delivered via toy 13504, which message may include a combination of voice message and physical functions. In the illustrated embodiment each motion feature available for the toy type in question has its own DTMF code. In addition, a voice message section begins with a unique DTMF code. The user pushes a series of phone buttons corresponding to the desired series of physical functions and records one or more voice message sections. When the message is completed, computer 13501 produces a message file comprising a series of wave-files and DTMF codes and sends the message to computer 13503 via toy server 13502. Computer 13503 converts the DTMF codes into their corresponding motion commands. When toy 13504 senses the presence of a second user, computer 13503 instructs toy 13504 to deliver to the second user the voice message together with the motions provided by the first user.

It is appreciated that the functionality of Figs. 161 and 162 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via a telephone link.

It is also appreciated that the functionality of Figs. 161 and 162 taken together is particularly appropriate to an interactive toy communication system as the aforementioned wherein the communications functionality includes an interactive voice response computer operative to enable the user to communicate by voice with the at least one of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 161 and 163 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via a telephone link; and wherein the communications functionality enables a user to provide instructions to at least one of the plurality of interactive toys to carry out physical functions.

An interactive toy system comprising a communication system providing communication between a toy and a user via a telephone link in accordance with another preferred embodiment of the present invention is now described. Reference is now made to Fig. 164, which is a simplified partly pictorial partly schematic illustration of an interactive toy communication system providing communications functionality, which enables a user to communicate with an interactive toy and another user via a telephone link in accordance with a preferred embodiment of the present invention. Turning to Fig. 164 it is seen that a first user communicates with an interactive toy 13531 via a phone link, and toy 13531 verbally communicates with a second user informing the second user that the first user wishes to talk to the second user. In the illustrated embodiment the second user phones a voice responsive computer 13534, which is in communication with a toy server 13533, which in turn communicates typically via the Internet with a multiplicity of computers including computer 13532, which provides content input to toy 13531. Thus it may be appreciated that the second user may communicate at the same time both with interactive toy 13531 and with the first user.

Reference is now made to Figs. 165A and 165B, which taken together are a simplified flowchart of the communications functionality of Fig. 164. The second user calls a voice responsive computer 13534 of toy server 13533 and chooses an option of "contact toys and users" on a menu of an answering system. The second user selects a desired toy and/or user by providing a toy ID or a user name. The second user is required to dial a password. If a correct password is dialed, the answering system provides a response in the voice of toy 13531, which response may or may not be accompanied by the actual toy 13531 speaking. The answering system is also operative to identify the first user, namely the user normally interacting with toy 13531, typically by means of a voiceprint. In the embodiment illustrated in Fig. 164, the second user requests to speak with the first user. Server 13533 instructs computer 13532 to check, partly via toy 13531, whether the first user is present at the home site and is available for interaction. For example, computer 13532 checks a schedule database for the first user to determine whether the first user may be at home and may be interrupted. If the first user is available, computer 13532 instructs toy 13531 to suggest to the first user that the first user might wish to talk to the second user, to let toy 13531 talk to the second user, or to request that the second user leave a message for the first user. If the first user chooses to talk or to let toy 13531 talk to the second user, the voice of the second user may be heard by the first user, preferably via toy 13531, and the voice of the first user received via toy 13531 may be heard by the second user via the telephone link. Voice input from both the first and the second user is communicated to server 13533, which provides content input to toy 13531, which content input is communicated both to computer 13532 and to computer 13534, and thus heard by both users.

It is appreciated that the functionality of Figs. 164, 165A and 165B is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via a telephone link; and wherein the communications functionality enables another user to communicate with the at least one user via the telephone link.

An interactive toy system comprising a plurality of interactive toys and providing communication between at least one of multiple toys and at least one user is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 166, which is a simplified partly pictorial partly schematic illustration of a communication system providing communication between at least one of multiple toys and at least one user in accordance with a preferred embodiment of the present invention. Turning to Fig. 166 it is seen that at a site 13707, a first user tells an interactive toy 13701 that he wishes to play with a second user named Dan. Then, at another site 13708, another interactive toy 13702 tells the second user that the first user wishes to play with him. The second user answers toy 13702, and the answer is delivered to the first user by toy 13701 at site 13707. In the illustrated embodiment, toys 13701 and 13702 interact based on instructions respectively received from computers 13703 and 13704, which in turn communicate typically via the Internet with a suitable toy server 13705. Thus, it may be appreciated that toy 13701 at site 13707 may communicate with the second user at site 13708 and that toy 13702 at site 13708 may symmetrically communicate with the first user at site 13707 in real time.

Reference is now made to Fig. 167, which is a simplified flowchart of the communications functionality of Fig. 166. The first user tells toy 13701 that he wishes to play with another user named Dan. Computer 13703 retrieves a toy ID of toy 13702 of the second user from a database record of friend users, which database record provides, for each name of a friend user, the toy ID of the toy thereof. Based on the toy ID of toy 13702, computer 13703 sends a query to server 13705 to check whether toy 13701 may directly communicate with the second user via toy 13702. Typically, server 13705 includes an updateable database record of friend users, where server 13705 checks whether the first user is on the list of the second user, which list is typically updated by the second user via computer 13704. If the direct communication is allowed, computer 13703 converts the message of the first user into text and into reported speech and sends the converted message to server 13705, typically via the Internet. Server 13705 downloads the message for the second user to computer 13704. Computer 13704 checks if the second user is available, that is, sensed by toy 13702 and not engaged in interaction with toy 13702 which may not be interrupted. If the second user is available, computer 13704 instructs toy 13702 to verbally communicate with the second user, delivering to the second user the content of the message of the first user. The second user verbally responds. Computer 13704 converts the verbal response of the second user into a reported speech text message and sends it to server 13705. Server 13705 downloads the message of the second user to computer 13703, which instructs toy 13701 to verbally deliver the message to the first user. It is appreciated that other options of the procedure of Fig. 167, which options do not involve inter-toy communication in real time, are described in Fig. 167 in a self-explanatory way.
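The server-side permission check of Fig. 167, in which direct toy-to-user contact is allowed only if the sender appears on the recipient's friend list, can be sketched as follows; the sample list contents are invented.

```python
# Hypothetical sketch of the Fig. 167 friend-list check performed by the toy
# server before allowing direct communication with a user via that user's toy.

FRIEND_LISTS = {13702: {"Gil", "Maya"}}  # toy ID -> users allowed to contact it

def may_communicate(sender_name, target_toy_id):
    """True only if the sender is on the target user's updateable friend list."""
    return sender_name in FRIEND_LISTS.get(target_toy_id, set())
```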

It is appreciated that the functionality of Figs. 166 and 167 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate with at least one user via at least another one of the interactive toys.

An interactive toy system comprising inter-toy speech and motion communication functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 168, which is a simplified pictorial illustration of speech and motion communication functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 168 it is seen that a first user moves the hands of toy 14000 and speaks in accordance with the character of toy 14000 as it is situated in the context of toy 14002, namely as a jester in front of a king. Toy 14000 picks up the user's speech. Motion sensors 14001 on toy 14000 sense the motion of the hands. Computer 14009 receives the motion and speech signals and converts them to a toy content file. Computer 14009 sends the file to server 14011 via the Internet. Server 14011 adapts the file to toy 14004. Server 14011 sends the file to computer 14010. Computer 14010 sends motion and speech commands to toy 14004. Toy 14004 verbalizes the speech commands and actuates the motion commands by actuators 14005, thus generally replicating the motion of toy 14000. After performance of the toy content by toy 14004, motion sensors 14007 on toy 14006 do not detect motion imparted on it by the second user and toy 14006 does not detect the second user's speech, thus indicating that the second user did not produce a response to the content actuated by toy 14004. Therefore server 14011 generates a response appropriate to toy 14006 and sends it to computer 14010, the response including speech and motion commands. Toy 14006 verbalizes the speech commands and actuates the motion commands by actuators 14008. Server 14011 then adapts the response to toy 14002 and sends the adapted response to computer 14009. Toy 14002 actuates the response in speech and in motion via actuator 14003. Reference is now made to Fig. 169, which is a simplified flowchart of the motion and speech communication functionality of Fig. 168. Toy 14000 picks up the user's speech.
Sensors 14001 on the toy sense motion imparted on toy 14000 by the user. Computer 14009 converts the speech to text. Computer 14009 converts the motion signals received from sensors 14001 to motion commands. Computer 14009 creates a toy content file comprising synchronized motion and speech. After converting the speech to text, computer 14009 utilizes the recorded speech in order to measure the actual length of each spoken word, thus synchronizing the text and motion commands. Computer 14009 sends the toy content file thus generated to server 14011. Server 14011 adapts the file to toy 14004. To that end, server 14011 converts motion commands included in the file to commands appropriate to the type of toy 14004. It is appreciated that server 14011 may replace commands for actuators not provided by toy 14004 with commands for equivalent actuators, for example by converting hand motion to head motion according to a defined schema. Server 14011 sends the file thus converted to computer 14010. Computer 14010 sends motion and speech commands to toy 14004. Toy 14004 verbalizes the speech in the toy's voice, and actuates the motion commands. After actuating the commands, computer 14010 checks whether the second user responds in speech and in motion imparted on toy 14006. If such a response is detected, computer 14010 repeats the process described above, thus communicating motion and speech from toy 14006 to toy 14002. Otherwise, computer 14010 requests server 14011 to generate a response appropriate to toy 14006. Such a response is generated using a toy content database comprising speech and motion content categorized by different types of toys and by different contexts wherein the toys are situated. Server 14011 selects a response based on keywords detected in the speech of toy 14004 and based on the profile of the second user and on the toy history of toy 14006. Alternatively, such a response is generated by means of artificial intelligence technology, such as chatterbot software. Server 14011 sends the response to computer 14010. Computer 14010 sends speech and motion commands to toy 14006. Toy 14006 sends computer 14010 a signal acknowledging execution of the commands. Computer 14010 notifies server 14011 of actuation of the response. Server 14011 adapts the response to toy 14002 and sends the adapted response to computer 14009. Computer 14009 sends speech and motion commands to toy 14002. Toy 14002 actuates the commands.
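The actuator-replacement schema by which the server adapts motion commands to a toy lacking some actuators, for example converting hand motion to head motion, might be sketched as follows; the actuator names and the schema itself are illustrative assumptions:

```python
# Actuators assumed to be present on the receiving toy (hypothetical).
TARGET_ACTUATORS = {"head", "mouth"}

# Defined replacement schema: missing actuator -> equivalent actuator.
EQUIVALENTS = {"left_hand": "head", "right_hand": "head"}

def adapt_commands(commands, available=TARGET_ACTUATORS, schema=EQUIVALENTS):
    """Rewrite each (actuator, action) command for the target toy,
    substituting an equivalent actuator where the original one is
    missing, and dropping commands with no available equivalent."""
    adapted = []
    for actuator, action in commands:
        if actuator in available:
            adapted.append((actuator, action))
        elif actuator in schema and schema[actuator] in available:
            adapted.append((schema[actuator], action))
        # otherwise: no equivalent actuator, so the command is dropped
    return adapted
```

Under this sketch, a waving-hand command sent to a toy that only has head and mouth actuators becomes a head motion, while a tail command is silently dropped.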

It is appreciated that at least one of toy 14002 and toy 14004 in the illustrated embodiment may be replaced by a virtual image of a toy on a monitor, actuating motion commands sent from server 14011 by means of standard animation software, and verbalizing speech content sent from server 14011 via speakers on the computer. It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the another of the plurality of interactive toys generally replicates the motion of the first one of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys; and wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate synchronized motion and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys, wherein the communications functionality employs software instructions to the first one of the plurality of interactive toys for transmission to the another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the another of the plurality of interactive toys generally replicates the motion of the first one of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys, wherein the communications functionality employs software instructions to the first one of the plurality of interactive toys for transmission to the another of the plurality of interactive toys; and wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate synchronized motion and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys, wherein the communications functionality employs information regarding sensed motion of the first one of the plurality of interactive toys for transmission to the another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the another of the plurality of interactive toys generally replicates the motion of the first one of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys, wherein the communications functionality employs information regarding sensed motion of the first one of the plurality of interactive toys for transmission to the another of the plurality of interactive toys; and wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate motions and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

It is appreciated that the functionality of Figs. 168 and 169 taken together is particularly appropriate to an interactive toy environment as the aforementioned and wherein the communications functionality is operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate synchronized motion and speech of at least one of the plurality of interactive toys to at least another of the plurality of interactive toys.

An interactive toy system providing an integrated toy-game functionality wherein a toy plays a game as a player is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 170 which is a simplified pictorial illustration of a toy-game functionality wherein a toy participates in a game as a player in accordance with a preferred embodiment of the present invention. Turning to Fig. 170 it is seen that toy 16000 and a user are playing a word-game wherein a participant has to say a word from a defined category, such as names of animals, the word beginning with the letters ending the word previously spoken by the other participant.

Reference is now made to Fig. 171 which is a flowchart of the gaming functionality of Fig. 170. Toy 16000 begins the game described above by verbalizing a word from the selected category. Computer 16001 then waits 30 seconds for an utterance of the user. If the user does not utter a word within the 30 seconds, toy 16000 announces to the user that he has lost the game. If toy 16000 picks up an utterance within the 30 seconds, computer 16001 converts the utterance to text and sends the text to server 16002. Server 16002 checks whether the utterance is a correct response to the word verbalized previously by toy 16000. Server 16002 performs the check utilizing a dictionary that defines the different categories valid for different words. If the user's utterance is a correct response, computer 16001 determines whether to continue the game according to the level of difficulty defined for the game, the level of difficulty determining, for example, the number of words that toy 16000 is enabled to verbalize within each round of the game. If computer 16001 determines to continue the game, server 16002 selects a word according to the last word spoken by the user and according to the level of difficulty defined for the game. Server 16002 selects such a word using a dictionary as described above. Otherwise, toy 16000 announces to the user that it has lost the game. If the user's utterance is not a correct response, toy 16000 suggests that the user find another word.
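The server-side word-game check may be sketched as below, assuming for illustration that "the letters ending the word" means the last letter, and using a toy dictionary of categorized words; the dictionary contents are hypothetical:

```python
# Dictionary defining which words are valid for which categories
# (illustrative data; a real dictionary would be far larger).
DICTIONARY = {"animals": {"lion", "newt", "tiger", "rat"}}

def is_correct_response(previous_word, utterance, category):
    """A response is correct if it belongs to the category and begins
    with the letter ending the previously spoken word."""
    valid_words = DICTIONARY.get(category, set())
    return utterance in valid_words and utterance[0] == previous_word[-1]

def select_next_word(user_word, category):
    """Server-side selection of the toy's next word, if one exists.

    sorted() makes the choice deterministic; a real implementation could
    also filter by the level of difficulty defined for the game."""
    for word in sorted(DICTIONARY.get(category, set())):
        if word != user_word and word[0] == user_word[-1]:
            return word
    return None
```

For example, after the toy says "lion", the user's "newt" is accepted, while "tiger" is rejected as not starting with "n".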

Reference is now made to Fig. 172 which is a simplified flowchart showing the response to a sensed user characteristic functionality of Fig. 170. A user plays a game with the toy. Computer 16001 evaluates the user's emotional state according to data sensed by toy 16000, such as the user's voice. If computer 16001 detects a negative emotional state such as frustration, aggression, or boredom, computer 16001 selects a type of intervention in the course of the game. Types of intervention include giving the user a gaming tip, changing the level of difficulty of the game, and making a toy comment relating to the user or to the user's gaming activity. Such comments are divided into disparate types, such as challenging comments and encouraging comments. It is appreciated that computer 16001 provides interventions of the types appropriate for each game provided by toy 16000. When a game ends, computer 16001 registers in a database information regarding the game, such as the total length of the game, the user's performance in the game, the user's detected emotional states, and the types of interventions utilized during the game, the information enabling computer 16001 to select interventions more efficiently in the future.

Reference is now made to Fig. 173 which is a simplified block diagram illustration of the emotional state sensing functionality of Fig. 172. The diagram shows the types of sensory data utilized by computer 16001 in order to evaluate a user's emotional state. Computer 16001 analyzes the user's voice 16011 using techniques known in the art in order to identify emotional states such as frustration or enthusiasm. Computer 16001 also analyzes the content 16012 of the user's speech in order to detect such emotional states. Such detection may be performed utilizing lists of key-words indicating an extreme emotional state. Furthermore, computer 16001 analyzes tactile input data 16013, sensed by tactile sensors on the toy. Such data may imply an emotional state such as excitement (for example, when a user hugs the toy) or aggression (for example, when a user applies extreme physical force to the toy).
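The combination of the three sensory channels of Fig. 173 into a coarse emotional-state label can be sketched as follows; the keyword list, thresholds, and labels are illustrative assumptions, and real voice analysis would involve prosody techniques not shown here:

```python
# Illustrative key-word list indicating an extreme emotional state.
NEGATIVE_KEYWORDS = {"stupid", "boring", "hate"}

def evaluate_emotional_state(voice_pitch_ratio, speech_text, touch_force):
    """Return a coarse emotional-state label from the three channels.

    voice_pitch_ratio: pitch relative to the user's baseline (assumed
    to be produced by upstream voice analysis).
    touch_force: normalized 0..1 reading from the tactile sensors."""
    words = set(speech_text.lower().split())
    if touch_force > 0.8:
        return "aggression"    # extreme physical force applied to the toy
    if words & NEGATIVE_KEYWORDS:
        return "frustration"   # key-words indicating an extreme state
    if voice_pitch_ratio > 1.3:
        return "enthusiasm"    # raised voice without negative content
    return "neutral"
```

The fixed priority order (tactile, then speech content, then voice) is itself an assumption; the embodiment leaves the fusion of the channels open.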

Reference is now made to Fig. 174 which is a simplified table in the context of Fig. 172 showing a database record utilized in measuring the effectiveness of different interventions in the course of a game relative to a specific user and a specific type of game. The database illustrated includes data regarding the average length of games wherein different types of intervention have been applied, and the average performance of the user during the games. It is appreciated that a performance measurement tool will be provided for games involving toys. When determining the type of intervention, computer 16001 selects a type that has yielded the best results in both fields, lengthier playing duration and better performance implying a greater measure of enjoyment of a game.
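The selection rule implied by Fig. 174 can be sketched as below; combining the two fields into a single score by normalized averaging is an assumption, as the embodiment states only that the chosen type should have yielded the best results in both fields:

```python
# Hypothetical per-user, per-game database records, as in the table of
# Fig. 174: intervention type -> (average game length in minutes,
# average user performance score 0..100).
EFFECTIVENESS = {
    "gaming_tip":        (12.0, 70.0),
    "change_difficulty": (9.0, 80.0),
    "toy_comment":       (15.0, 65.0),
}

def select_intervention(records=EFFECTIVENESS):
    """Choose the intervention type with the best combined score for
    playing duration and performance, each normalized to its maximum."""
    max_len = max(length for length, _ in records.values())
    max_perf = max(perf for _, perf in records.values())

    def score(item):
        length, perf = item[1]
        return length / max_len + perf / max_perf

    return max(records.items(), key=score)[0]
```

With the sample records above, the long games and acceptable performance of `toy_comment` outweigh the higher raw performance of `change_difficulty`.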

It is appreciated that computer 16001 also measures the direct effectiveness of various types of intervention, by tracking changes in the emotional states of a user after the interventions have been applied.

It is appreciated that the functionality of Figs. 170, 171, 172, 173 and 174 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, and providing an integrated toy-game functionality comprising: a game which may be played by a user; at least one interactive toy containing game-specific functionality which participates in playing the game; and wherein the game-specific functionality enables the interactive toy to play the game as a player; and wherein the game-specific functionality enables the interactive toy to have voice interaction with the user in the course of playing the game; and wherein the game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.

An interactive toy system comprising toy-game functionality wherein a toy assists a user in playing a game is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 175 which is a simplified pictorial illustration of a toy-game functionality wherein a toy assists a user in playing a game in accordance with a preferred embodiment of the present invention. Turning to Fig. 175 it is seen that a user plays a computer adventure game via joystick 16104 and monitor 16103. As is seen in the figure, during the game toy 16100 delivers to the user information about the game scene not seen on monitor 16103. It is appreciated that computer 16101 may deliver such information via toy 16100 in a way that creates for toy 16100 a distinct viewpoint relative to the game scene, for example by the toy delivering information about game events that occur behind the back of a character in the game.

Reference is now made to Fig. 176 which is a simplified pictorial illustration in the context of Fig. 175 showing a voice interaction functionality in a game. During a game, a user asks a toy how many game points he has obtained. Toy 16100 picks up the user's speech and sends it to computer 16101. Computer 16101 converts the speech to text and identifies a user question. Toy 16100 tells the user the amount of game points obtained, and since the amount is within a defined proximity to the user's record in the game, toy 16100 also tells the user that he is approaching his record.
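The record-proximity check behind the toy's reply can be sketched as follows; the 90% threshold used for the "defined proximity" is an illustrative assumption:

```python
# "Within a defined proximity" to the record, taken here as 90% of it.
RECORD_PROXIMITY = 0.9

def points_reply(points, record):
    """Compose the toy's verbal reply about the user's game points,
    adding an encouragement when the user nears his record."""
    reply = f"You have {points} points."
    if record and points >= RECORD_PROXIMITY * record:
        reply += " You are approaching your record!"
    return reply
```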

It is appreciated that toy 16100 in Figs. 175 and 176 may also be responsive to current characteristics of a user as sensed by the toy in the method illustrated in Figs. 172, 173 and 174. Toy 16100 senses inputs from a user indicating his emotional state, such as the user's voice and the content of the user's speech. When the inputs indicate a negative emotional state, such as aggression or frustration, computer 16101 adjusts accordingly the involvement of toy 16100 in the game, for example, by delivering more information and game tips via the toy, or by verbalizing more encouraging or more challenging comments regarding the user and his performance in the game.

It is appreciated that the functionality of Figs. 172, 173, 174, 175 and 176 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, and providing an integrated toy-game functionality comprising: a game which may be played by a user; at least one interactive toy containing game-specific functionality which participates in playing the game; and wherein the game-specific functionality enables the interactive toy to assist the user in playing the game; and wherein the game-specific functionality enables the interactive toy to have voice interaction with the user in the course of playing the game; and wherein the game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.

An interactive toy system comprising toy-game functionality wherein a toy is employed as a user interface in playing a game is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 177 which is a simplified pictorial illustration of a toy-game functionality wherein a toy is employed as a user interface to a game in accordance with a preferred embodiment of the present invention. Turning to Fig. 177 it is seen that two users are playing a fighting game over a network, employing toys 16300 and 16301 as interfaces to the game. As is seen, the users move the toys' body parts and the images of the characters in the game move accordingly. For example, a user reaches the hand of toy 16301 forward, and the game character associated with the toy reaches its hand forward accordingly in images 16303 and 16305. As the hand touches images 16302 and 16304 of the character associated with toy 16300, the motion is considered a successful hit. Therefore toy 16300 verbalizes a pain cry, and speaker 16308 connected to computer 16311 announces a similar pain cry in the voice of the character associated with toy 16300.

Reference is now made to Fig. 178 which is a simplified flowchart of the toy-interface functionality of Fig. 177. Sensors 16320 and 16321 on toys 16300 and 16301 respectively sense body part motions of the toys. Computers 16310 and 16311 send data regarding the body part motions to server 16312. Server 16312 updates a game scene, which elaborates the positions and postures of the characters in the game in relation to a defined game scene. Server 16312 then creates an updated image of the game scene and sends it to computers 16310 and 16311. Computers 16310 and 16311 display the updated image on monitors 16306 and 16307 respectively.

Reference is now made to Fig. 179 which is another simplified flowchart of the toy interface functionality of Fig. 177. Server 16312 updates the game scene as illustrated in Fig. 178. Server 16312 then checks whether a contact occurs between the game characters associated with toys 16300 and 16301. If a limb of a character touches the other character, the toy associated with the other character announces a pain cry and the virtual image of the other character on the computer associated with the other toy announces a similar pain cry via a speaker.
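The contact check of Fig. 179 can be sketched geometrically as below, with the characters reduced to two-dimensional limb tips and body points and an assumed contact radius; coordinates and the threshold are illustrative:

```python
import math

def limbs_touch(limb_tips_a, body_points_b, radius=0.5):
    """Return True if any limb tip of character A comes within `radius`
    of any body point of character B -- counted as a successful hit."""
    for ax, ay in limb_tips_a:
        for bx, by in body_points_b:
            if math.hypot(ax - bx, ay - by) <= radius:
                return True
    return False
```

On a hit, the server would then trigger the pain cry on the struck character's toy and on the speaker of the other computer, as described above.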

Reference is now made to Fig. 180 which is a simplified pictorial illustration in the context of Fig. 177 showing a toy-interface functionality utilizing voice interaction. As is seen in Fig. 180 a user gives toy 16300 a verbal movement command, namely, "Run". Toy 16300 picks up the user's speech and sends it to computer 16310. Computer 16310 converts the speech to text and identifies a movement command. Computer 16310 sends the movement command to server 16312. Server 16312 applies the movement command to game character 16302 associated with toy 16300. Server 16312 sends computer 16310 updates to the game scene according to the movement command and to updates in the positions of other characters 16319 in the game.

It is appreciated that a combination of the functionality of Figs. 177 and 180 enables a user to control a game character via vocal commands and manipulation of the toy's body, thus providing him with command both of local body motions, such as limb motions, and of movements of the character as a whole, such as running and turning in different directions.

Reference is now made to Fig. 181 which is a simplified pictorial illustration in the context of Fig. 177 showing a multi-user game played over a network. The game illustrated is played between two rival groups of users. Each user commands a character in the game via a toy, as illustrated in Figs. 177 and 180. In the illustrated embodiment a first user commands via toy 16300 a character represented in images 16302 and 16304 on monitors 16306 and 16307 respectively. A second user commands via toy 16301 a character represented in images 16303 and 16305 on monitors 16306 and 16307 respectively. As is seen in Fig. 181 the character associated with toy 16300 kneels and peeks over a corner of a wall, while the character associated with toy 16301 stands behind it. Server 16312 receives motion updates from a multiplicity of computers, including computers 16310 and 16311, and updates the game scene accordingly. Server 16312 then sends the multiplicity of computers images of the game scene, each image representing a viewpoint on the game scene dependent on the location within the game scene of the character associated with each computer. Thus, as seen in Fig. 181, the first user and the second user see on monitors 16306 and 16307 a slightly different image of the game scene. The first user, commanding the character associated with toy 16300, sees on monitor 16306 three soldiers 16313 of the rival group, the soldiers unseen by the second user, commanding the character associated with toy 16301. The first user then announces what he sees over the corner, namely the three soldiers of the rival group. Toy 16300 picks up the speech of the first user and sends it to computer 16310. Computer 16310 converts the speech to text and sends it to toy server 16312. Toy server 16312 sends the text to the computers that display the character associated with toy 16300 on their monitors, including computer 16311, thus creating the effect that the speech of a character is heard by characters near it. Computer 16311 verbalizes the text via speaker 16308, in the voice of the character associated with toy 16300.

Reference is now made to Fig. 182 which is a simplified pictorial illustration in the context of Fig. 181 showing toy-mediation functionality in a multi-user game in accordance with a preferred embodiment of the present invention. As is seen in Fig. 182 a character 16302 commanded by a first user via toy 16300 is attacked by two other characters 16314 of a rival group. The first user tells toy 16300 to call for help. Toy 16300 picks up the user's speech and sends it to computer 16310. Computer 16310 converts the speech to text and identifies the keywords "call" and "help", the occurrence of both in a user's utterance in the context of the illustrated game being interpreted to mean that a help request is to be delivered to other members of the user's group. It is appreciated that the game software includes a list of keywords and phrases enabling users to verbally activate various game functions. It is further appreciated that the interpretation of verbal commands is dependent on the current context within the game. For example, when several characters are in proximity to a character from a rival group, an utterance of the user associated with a particular character that contains the words "call" and "help" is interpreted as a help request. Upon identifying the help request, computer 16310 sends the request to server 16312. Server 16312 distributes the help request to the computers of the users of the group of the first user, including computers 16315 and 16316. As is seen in the figure, toys 16317 and 16318 verbalize the help request, detailing the location of character 16302.
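The context-dependent keyword interpretation described above can be sketched as follows; the keyword sets, context flags, and game function names are illustrative assumptions standing in for the game software's list of keywords and phrases:

```python
# Game functions activated by keyword sets, each optionally guarded by a
# context requirement (hypothetical entries).
COMMANDS = [
    # (required keywords, required context flag, game function name)
    ({"call", "help"}, "near_rival", "broadcast_help_request"),
    ({"run"}, None, "move_character_forward"),
]

def interpret(utterance, context_flags):
    """Return the game function of the first command whose keywords all
    occur in the utterance and whose context requirement, if any, is
    satisfied by the current game context; None if nothing matches."""
    words = set(utterance.lower().split())
    for keywords, required_flag, function in COMMANDS:
        if keywords <= words and (required_flag is None
                                  or required_flag in context_flags):
            return function
    return None
```

Thus "call for help" triggers the help broadcast only while rival characters are nearby, matching the context dependence described in the embodiment.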

It is appreciated that toy 16300 in Figs. 177, 180, 181 and 182 may also be responsive to current characteristics of a user as sensed by the toy in the method illustrated in Figs. 172, 173 and 174. Toy 16300 senses inputs from the user indicating his emotional state, such as the user's voice and the content of the user's speech. When the inputs indicate a negative emotional state, such as frustration or boredom, computer 16310 adjusts accordingly the parameters of the game and of the toy's involvement in it. For example, computer 16310 may strengthen the character associated with a toy by providing it with a weapon or by making it more durable. Computer 16310 may also adjust the vocal reaction of a toy to hits that its associated character receives from other characters.

It is appreciated that the functionality of Figs. 172, 173, 174, 177, 178, 179, 180, 181 and 182 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, and providing an integrated toy-game functionality comprising: a game which may be played by a user; at least one interactive toy containing game-specific functionality which participates in playing the game; and wherein the game-specific functionality enables the interactive toy to be employed by the user as a user interface in playing the game; and wherein the game-specific functionality enables the interactive toy to have voice interaction with the user in the course of playing the game; and wherein the game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.

It is also appreciated that the functionality of Figs. 177, 178, 179, 180, 181 and 182 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, and providing an integrated toy-game functionality comprising: a game which may be played by a user; at least one interactive toy containing game-specific functionality which participates in playing the game; and wherein the game is a multi-user game which may be played over a network; and wherein the game-specific functionality is operative to mediate between at least two users playing the game.

An interpersonal interaction communication system operative to produce conversations between users is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 183 which is a simplified pictorial illustration of an interactive toy system comprising an interpersonal interaction communication system operative to produce conversations between users in accordance with a preferred embodiment of the present invention. Turning to Fig. 183 it is seen that two users encounter each other on the street, the users being unfamiliar to each other. Toys 16500 and 16501 may identify each other using IR transceivers 16502 and 16503, by sending and receiving unique IR signals. Toys 16500 and 16501 initiate a conversation on a subject of interest shared by both users, as inferred from user profiles based on past interactions with the toys. Toys 16500 and 16501 also embed the users' names in the conversation as a stimulus for the users to participate in the conversation.

Reference is now made to Fig. 184 which is a simplified flowchart of the conversation producing functionality of Fig. 183. Toy 16500 receives via IR transceiver 16502 a unique IR signal transmitted by toy 16501 via IR transceiver 16503. Toy 16500 sends the signal to toy server 16506 via public wireless communication network 16505. Toy server 16506 identifies toy 16501 by the signal, thus concluding that toys 16500 and 16501 are in propinquity with each other. Toy server 16506 then checks whether compatibility exists between the users of toys 16500 and 16501. If such compatibility exists, toy server 16506 sends toys 16500 and 16501 toy content designed to produce a conversation between the users, the content selected from a database based on the compatibility detected between the users. Toys 16500 and 16501 actuate the selected content. It is appreciated that the coordination of the actuation of content may be handled by server 16506, by sending each toy a part of the content after receiving from the other toy a confirmation of the actuation of the preceding part of the content. Alternatively the coordination is handled locally by toys 16500 and 16501, for example, by sending content to one of the toys, the toy delivering parts of the content to the other toy with correct timing. Alternatively both toys receive their respective parts of the content together with conditions for actuation of each part, for example after a specific word is verbalized by the other toy.
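The server-side coordination variant, in which each content part is released only after the other toy confirms actuation of the preceding part, may be sketched as follows. The toy identifiers and the sample content are illustrative, and the network calls are stood in by plain callables.

```python
# Illustrative sketch of server-side coordination: the server alternates
# content parts between two toys, gating each part on confirmation of
# the previous one. Toy names and content are hypothetical.

def coordinate_conversation(parts, send, await_confirmation):
    """Alternate content parts between two toys; stop if a toy fails
    to confirm actuation of its part."""
    toys = ("toy_16500", "toy_16501")
    for i, part in enumerate(parts):
        toy = toys[i % 2]            # alternate the speaking toy
        send(toy, part)              # deliver the next content part
        if not await_confirmation(toy):
            break                    # no confirmation: halt delivery

# Minimal in-memory stand-ins for the network operations:
log = []
coordinate_conversation(
    ["Hello there, I know your user likes baseball!", "Mine too!"],
    send=lambda toy, part: log.append((toy, part)),
    await_confirmation=lambda toy: True,
)
```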

Reference is now made to Fig. 185A which is a simplified table in the context of Fig. 184, showing a database record utilized in detecting compatibility between users. The database record illustrated elaborates the number of occurrences of words from a pre-defined list of keywords in interactions of a user with a toy. Such a list is designed for the purpose of creating a user profile that might be utilized in detecting compatibility between users. It may include, for example, names of famous people, names of cultural and media products, and words designating public issues or fields of interest. It is appreciated that such a list may be interactively designed utilizing information obtained via toys. The database record illustrated shows the number of occurrences of words from such a list in different realms of the interaction between a user and a toy: in conversation with the toy, in web searches conducted via the toy, as described hereinabove, and in web sites a user browses via the toy. When checking the compatibility between users, toy server 16506 searches for shared keywords recurrent in the interactions of the users with their toys.
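The compatibility check described above may be sketched as follows. The record layout (per-keyword counts over the three interaction realms) follows Fig. 185A, while the recurrence threshold is an assumption introduced for illustration.

```python
# Illustrative sketch: find keywords recurrent in both users'
# interaction records. The min_count threshold is an assumption.

def shared_recurrent_keywords(profile_a, profile_b, min_count=2):
    """Return keywords whose total occurrence count (summed over
    conversation, web-search and browsing realms) meets the threshold
    in both users' profiles."""
    def totals(profile):
        return {kw: sum(realms.values()) for kw, realms in profile.items()}
    a, b = totals(profile_a), totals(profile_b)
    return {kw for kw in a.keys() & b.keys()
            if a[kw] >= min_count and b[kw] >= min_count}

# Hypothetical records in the shape of Fig. 185A:
user1 = {"Yankees": {"speech": 5, "search": 2, "web": 1},
         "dinosaurs": {"speech": 1, "search": 0, "web": 0}}
user2 = {"Yankees": {"speech": 3, "search": 0, "web": 4},
         "space": {"speech": 6, "search": 1, "web": 0}}
```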

Reference is now made to Fig. 185B which is a simplified table in the context of Fig. 184 showing a database record utilized in selecting conversation stimulating content for toys. The record illustrated shows keywords, such as the keywords described in Fig. 185A, and content items related to the keywords, such as the content item illustrated in Fig. 183, possibly related to the keyword "Yankees". It is appreciated that toy content is produced in accordance with a list of keywords described in Fig. 185A.

It is appreciated that the functionality of Figs. 183, 184, 185A and 185B taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce personal interaction between respective users thereof.

It is also appreciated that the functionality of Figs. 183, 184, 185A and 185B taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein the content is operative to produce interactions between interactive toys which are in physical propinquity therebetween.

It is further appreciated that the functionality of Figs. 183, 184, 185A and 185B taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein the content is operative to produce conversations between respective users thereof and employs at least some personal information about the respective users based on accumulated past interactions therewith.

An interpersonal interaction communication system wherein toys have a persona which shares personal characteristics with an identifiable person is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 186 which is a simplified pictorial illustration of an interpersonal interaction communication system wherein toys have a persona which shares personal characteristics with an identifiable person in accordance with a preferred embodiment of the present invention. Turning to Fig. 186 it is seen that two users encounter each other on the street, each user carrying a toy having a persona identifiable by its hat as a baseball fan. Toys 16550 and 16551 identify each other via IR transceivers 16552 and 16553. Toys 16550 and 16551 initiate a conversation appropriate to their persona and to personal characteristics of their users that are shared by toys 16550 and 16551.

Reference is now made to Fig. 187 which is a simplified flowchart of the conversation functionality of Fig. 186. Toy 16550 receives via IR transceiver 16552 a unique IR signal transmitted by toy 16551 via IR transceiver 16553. Toy 16550 sends the signal to toy server 16556 via public wireless communication network 16555. Toy server 16556 identifies toy 16551 by the signal, thus concluding that toys 16550 and 16551 are in propinquity with each other. Toy server 16556 retrieves from the content database a toy-conversation according to the persona of toys 16550 and 16551 and according to the user profiles. Toy server 16556 sends the conversation to toys 16550 and 16551. Toys 16550 and 16551 actuate the conversation. It is appreciated that the coordination of the actuation of content may be handled by server 16556, by sending each toy a part of the content after receiving from the other toy a confirmation of the actuation of the preceding part of the content. Alternatively the coordination is handled locally by toys 16550 and 16551, for example, by sending content to one of the toys, the toy delivering parts of the content to the other toy with correct timing. Alternatively both toys receive their respective parts of the content together with conditions for actuation of each part, for example after a specific word is verbalized by the other toy.

Reference is now made to Fig. 188 which is a simplified table in the context of Fig. 186 showing a database record utilized in selection of content according to toy persona and to a characteristic of a user. The illustrated record includes content items appropriate to various encounters of toys having a persona. The illustrated record lists content items appropriate to a toy with a persona designated as Persona 1, such as a baseball fan, the toy belonging to an 8-year-old user. The record lists content items appropriate to the toy in encounters with toys having various personas and belonging to users of various ages. It is appreciated that content items appropriate to a specific age include speech characteristic of users of that age. Thus, for example, a toy with a baseball fan persona, belonging to an 8-year-old user, will talk like an 8-year-old baseball fan.
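A lookup into a record of the kind illustrated in Fig. 188 may be sketched as follows. Keying content items by the pair of personas and the pair of user ages is taken from the figure's description; the specific keys and content strings are hypothetical.

```python
# Illustrative sketch of content selection keyed by both toys'
# personas and user ages (Fig. 188). All entries are hypothetical.

content_db = {
    ("baseball_fan", 8, "baseball_fan", 8): "Hey, did you catch the game?",
    ("baseball_fan", 8, "rock_star", 10): "Your music is cool, but baseball rules!",
}

def select_content(persona_a, age_a, persona_b, age_b):
    """Retrieve a content item matching both toys' personas and user
    ages, falling back to the symmetric key for the reverse encounter."""
    return (content_db.get((persona_a, age_a, persona_b, age_b))
            or content_db.get((persona_b, age_b, persona_a, age_a)))
```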

It is appreciated that the functionality of Figs. 186, 187 and 188 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein at least some of the interactive toys have a persona which shares at least one personal characteristic with an identifiable person.

It is further appreciated that the functionality of Figs. 186, 187 and 188 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein at least some of the interactive toys have a persona which shares at least one personal characteristic with an identifiable person; and wherein the identifiable person is the user of a given toy.

An interpersonal interaction communication system operative to produce a personal meeting between users is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 189 which is a simplified partly pictorial partly diagrammatic illustration of an interpersonal interaction communication system operative to produce a personal meeting between users in accordance with a preferred embodiment of the present invention. Turning to Fig. 189 it is seen that two users are within a pre-defined distance from each other. Toy server 16606 detects meeting compatibility between the two users. Toys 16600 and 16601 suggest to their respective users to meet each other. After the users accept the suggestion, the toys inform the users of a meeting place.

Reference is now made to Fig. 190 which is a simplified flowchart of the meeting producing functionality of Fig. 189. Toy server 16606 detects a multiplicity of toys, including toys 16600 and 16601, within a defined area, such as a square 100 meters on a side. Server 16606 detects the multiplicity of toys via GPS devices on the toys, such as devices 16602 and 16603, reporting their location to server 16606 via public wireless communication system 16605. Server 16606 checks compatibility among the users of the multiplicity of toys. Server 16606 detects compatibility between the users of toys 16600 and 16601. Server 16606 generates messages to toys 16600 and 16601 suggesting that the users meet, based on the detected compatibility. Server 16606 sends the message to toy 16600. Toy 16600 verbalizes the message to its user, suggesting that he meet the user of toy 16601. If the user agrees, server 16606 sends a similar message to toy 16601. If the second user also agrees, the server sends messages to toys 16600 and 16601 designating a defined meeting point for the users. It is appreciated that server 16606 also tracks the advancement of the users toward the meeting point, thus verifying its execution.
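The proximity scan described above may be sketched as follows. Grouping reported positions into 100 m grid cells is one simple way to realize the "defined area" test; a flat local metric frame is assumed for simplicity, whereas real GPS coordinates would require a geodesic distance computation.

```python
# Illustrative sketch: group toys whose reported GPS positions fall
# within the same 100 m grid cell. Coordinates are assumed to be in a
# local metric (x, y) frame, an assumption made for simplicity.

def toys_in_same_cell(positions, cell_size_m=100.0):
    """positions: {toy_id: (x_m, y_m)}. Returns only cells containing
    two or more toys, i.e. candidate groups for a meeting."""
    cells = {}
    for toy_id, (x, y) in positions.items():
        key = (int(x // cell_size_m), int(y // cell_size_m))
        cells.setdefault(key, []).append(toy_id)
    return {k: v for k, v in cells.items() if len(v) >= 2}
```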

Reference is now made to Fig. 191A which is a simplified table in the context of Fig. 190 showing a database record utilized in checking compatibility between users. The illustrated database record details a profile of users whom a user would like to meet. The user inputs in such a record characteristics of users whom he would like to meet. A user inputs a description 16620 of the users. In conjunction with the description 16620 a user inputs features that are supposed to characterize the users, the features being typically detectable by toys via interaction with users. Such features include words 16621 detected in speech of a user, products 16622 bought by a user via a toy, and web sites 16623 visited by a user via a toy, as described hereinabove. It is appreciated that a plurality of records as illustrated enables toy server 16606 to extract meaningful features for profiling users. For example, it enables toy server 16606 to detect products the purchasing of which has a meaning in the eyes of many users, and is translatable to a description in terms of personality.

Reference is now made to Fig. 191B which is a simplified table in the context of Fig. 191A showing a database record utilized in checking compatibility of users. The illustrated record shows a profile of a user obtained via the user's interaction with a toy and categorized in terms obtained via manipulation of records illustrated in Fig. 191A. The illustrated record elaborates the number of occurrences of certain keywords 16624 in a user's speech from among a defined list of keywords. The list of keywords may be obtained via records as illustrated in Fig. 191A elaborating characteristics specified by users of users that they would like to meet. The illustrated record also includes products 16625 bought by a user via a toy, and web sites 16626 visited by a user via a toy. When checking meeting-compatibility of two users, toy server 16606 compares a record such as illustrated in Fig. 191A with a record such as illustrated in Fig. 191B.
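The comparison between a "users I would like to meet" record (Fig. 191A) and an observed user profile (Fig. 191B) may be sketched as follows. Scoring the match as the fraction of requested features present in the profile is an assumption; the embodiment does not specify a scoring rule.

```python
# Illustrative sketch: compare a wanted-profile record (keywords,
# products, web sites) against an observed user profile. The scoring
# rule (fraction of wanted features present) is an assumption.

def profile_match_score(wanted, observed):
    """wanted/observed: {"keywords": set, "products": set, "sites": set}.
    Returns the fraction of wanted features found in the profile."""
    total = hits = 0
    for field in ("keywords", "products", "sites"):
        want = wanted.get(field, set())
        total += len(want)
        hits += len(want & observed.get(field, set()))
    return hits / total if total else 0.0
```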

Reference is now made to Fig. 191C which is a simplified table in the context of Fig. 191A showing a database record utilized in profiling users. The database record illustrated is obtained by manipulating a multiplicity of records illustrated in Fig. 191A. It includes descriptions 16627 coined by users, with features given by the users to the descriptions, the features being typically detectable by a toy via an interaction with a user, as illustrated in Fig. 191A. The illustrated record enables a toy server to translate a description of a user to features typically detectable by a toy, thus enabling users to input a description of other users whom they would like to meet.

It is appreciated that the functionality of Figs. 189, 190, 191A, 191B and 191C taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein the content is operative to produce a personal meeting between respective users thereof.

An interpersonal interaction communication system operative to produce pseudo-accidental meetings is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 192 which is a simplified pictorial illustration of an interpersonal interaction communication system operative to produce pseudo-accidental meetings in accordance with a preferred embodiment of the present invention. Turning to Fig. 192 it is seen that toy 16650 guides a user to a certain place wherein the user encounters an old friend of his. As is seen in the figure, toy 16650 does not tell the user the purpose of the directions given.

Reference is now made to Fig. 193 which is a simplified flowchart of the meeting producing functionality of Fig. 192. Toy server 16656 detects a multiplicity of toys, including toys 16650 and 16651, in a defined area, such as a square 100 meters on a side. Toy server 16656 detects the multiplicity of toys utilizing GPS devices on the toys, such as devices 16652 and 16653, reporting their location to server 16656 via public wireless communication system 16655. Server 16656 checks compatibility for an accidental meeting among the users of the multiplicity of toys. Server 16656 detects that the users of toys 16650 and 16651 are compatible for a pseudo-accidental meeting, namely that there is a high probability that they will recognize each other in an encounter. Toy server 16656 sends toy 16650 a message containing directions leading the user to the location of toy 16651. Toy 16650 verbalizes the directions.

Reference is now made to Fig. 194 which is a simplified table in the context of Fig. 193 showing a database record utilized in producing a pseudo-accidental meeting of users. The illustrated record contains data regarding the life history of a user. It details information regarding places, organizations and institutions wherein a user has been in different years. For each of these items the measure of identification it possesses regarding the user is also detailed, the measure decreasing in proportion to the number of people bearing the same name as the user in that place during those years. It is appreciated that a part of the information is supplied to toy server 16656 at registration. It is also appreciated that a toy asks a user during its interaction with the user for details enhancing such a record. It is appreciated that such a record may also include other types of information such as nicknames and friends of a user. When arranging a pseudo-accidental meeting, toy server 16656 utilizes such records in order to detect overlaps in the life histories of users, thus implying a probability that the users would identify each other in an encounter.
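The life-history overlap check described above may be sketched as follows. Representing the identification measure as the reciprocal of the number of same-named people, and combining the two users' measures by multiplication, are assumptions made for illustration; the embodiment specifies only that the measure decreases with the number of same-named people.

```python
# Illustrative sketch of the overlap check on life-history records
# (Fig. 194). Each entry: (place, first_year, last_year, id_measure),
# where id_measure is assumed to be 1 / number of same-named people.

def recognition_probability(history_a, history_b):
    """Return the best combined identification measure over entries
    sharing a place with overlapping year ranges. Combining the two
    measures by product is an assumption."""
    best = 0.0
    for place_a, a0, a1, m_a in history_a:
        for place_b, b0, b1, m_b in history_b:
            if place_a == place_b and a0 <= b1 and b0 <= a1:
                best = max(best, m_a * m_b)
    return best
```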

Reference is now made to Fig. 195 which is a simplified pictorial illustration of an interpersonal interaction communication system operative to produce pseudo-accidental meetings in accordance with another preferred embodiment of the present invention. Turning to Fig. 195 it is seen that toy 16680 suggests to a first user to meet a second user he has requested to meet. Server 16686 detects a certain degree of proximity, such as a distance lower than 100 meters, between toys 16680 and 16681. Server 16686 utilizes for this purpose GPS devices 16682 and 16683 reporting to server 16686 the locations of toys 16680 and 16681 via a public wireless communication network 16685. Server 16686 detects that to a defined degree of probability, such as 90%, the user of toy 16681 is one of the users that the user of toy 16680 had requested to meet. Server 16686 sends toy 16680 a message comprising a suggestion to the user to meet the user of toy 16681. The user agrees, and toy 16680 gives the first user directions leading him to the location of the second user. A meeting is thus produced without the second user knowing that the meeting has been produced.

Reference is now made to Fig. 196 which is a simplified table in the context of Fig. 195 showing a database record utilized in producing a pseudo-accidental meeting. The illustrated record details information regarding users whom a user would like to meet. After detecting a defined measure of proximity of a first user to a second user, toy server 16686 compares the illustrated record, associated with the first user, with a record such as illustrated in Fig. 194 and associated with the second user. Server 16686 thus calculates the probability that the second user is one of the users whom the first user would like to meet. If the probability exceeds a defined rate, server 16686 initiates a pseudo-accidental meeting procedure as illustrated in Fig. 195. Server 16686 sends toy 16680 a message comprising directions leading the user to the location of toy 16681. It is appreciated that if the degree of probability is lower than the defined rate, toy 16681 asks its user for further identifying details to be added to a life history record of the user, such as illustrated in Fig. 194. It is further appreciated that in such a case toy 16681 employs information from the record illustrated in Fig. 196 and associated with the first user in order to ask the second user for more details regarding the user's life history. For example, in the illustrated embodiment the first user specifies that he would like to meet Jane from Athens High School in Alabama. It might be that in a life history record of a user named Jane from the school the name Jane is identifying for only 50% in relation to the school, meaning there are two persons named Jane in the school. A toy of the user may then ask the user whether she plays in the Golden Eagles Brass Band, as mentioned in relation to Jane in the record illustrated in Fig. 196.

Reference is now made to Fig. 197 which is a simplified pictorial illustration of an interpersonal interaction communication system operative to produce a meeting between users in the context of a game in accordance with yet another preferred embodiment of the present invention. Turning to Fig. 197 it is seen that toys 16700 and 16701 suggest to their respective users an adventure. Both users agree. Toys 16700 and 16701 thus instruct the users to go to the amusement park at 20:00. At some later time both users are at the amusement park. Toys 16700 and 16701 instruct the users to look for someone they would have liked to meet, namely each other.

Reference is now made to Fig. 198 which is a simplified flowchart of the gaming functionality of Fig. 197. Server 16706 detects that the users of toys 16700 and 16701 have reciprocally expressed a wish to meet each other. Such detection is achieved in a method as illustrated in Figs. 194, 195 and 196. The server thus sends toy 16700 of the first user a message via computer 16707, the message suggesting an adventure to the user. If the user accepts the suggestion, server 16706 sends toy 16701 of the second user a similar message via computer 16708. If the second user also accepts the suggestion, the server sends toys 16700 and 16701 messages instructing the users to be at a certain place at a designated time. At some later time, typically at the designated time, server 16706 detects that toys 16700 and 16701 are in the designated place, via GPS devices 16702 and 16703 on the toys, reporting their location to server 16706 via public wireless communication network 16705. Server 16706 sends toys 16700 and 16701 messages via public wireless communication network 16705, the messages instructing the users to look for someone they are interested in meeting.

It is appreciated that the functionality of Figs. 192, 193, 194, 195 and 196 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein the content is operative to produce pseudo-accidental meetings between respective users thereof.

It is also appreciated that the functionality of Figs. 192, 193, 194, 195, 196, 197 and 198 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein the content is operative in the context of a game.

It is further appreciated that the functionality of Figs. 192, 193, 194, 195, 196, 197 and 198 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network, and providing an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of the plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via the computer network to cause at least one of the plurality of interactive toys to communicate content with at least another of the plurality of interactive toys, the content being operative to produce interaction between respective users thereof; and wherein the content is operative to produce personal meetings between respective users thereof at predetermined locations.

An interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 199, which is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 199 it is seen that while a doll-like interactive toy 17001 repeats a difficult word pronounced by a first user and performs an associated physical motion, a squirrel-like toy 17002 fails to repeat the same difficult word as the word is pronounced by a second user. The second user then requests that the behavior pattern of doll-like toy 17001 should be transferred to squirrel-like toy 17002. A behavior pattern transferring procedure is then performed, following which toy 17002 is capable of repeating the difficult word pronounced by the second user and of performing an associated physical motion.

Reference is now made to Fig. 200, which is a simplified block diagram illustration in the context of Fig. 199 showing databases involved in the toy personality storage functionality of Fig. 199 and the flow of information involved in the toy personality development functionality of Fig. 199. As seen in Fig. 200, a database of toy personality 17010 is associated with toy 17001 and includes database 17011 of toy life history and database 17012 of toy behavior patterns. Database 17010 may be stored on a personal computer such as computer 17003 of Fig. 199 and/or on a suitable server such as server 17005 of Fig. 199. As further shown in Fig. 200, the system of Fig. 199 includes a language database 17013 shared by all toys of the toy type of toy 17001, which is a doll type. Database 17013 is typically stored on a suitable server such as server 17005 of Fig. 199. Database 17012 of toy behavior patterns of toy 17001 is updated based on inputs received from language database 17013 and the interaction between toy 17001 and one or more users. Thus it may be appreciated that the personality of a toy such as toy 17001 of Fig. 199 may develop based on the toy type of toy 17001 as well as the unique interaction between toy 17001 and its user. The personality development functionality of Figs. 199 and 200 is illustrated in greater detail in Fig. 202 described hereinbelow. Personality development functionality based on interaction between an interactive toy and another interactive toy is illustrated in Fig. 205 described further hereinbelow.

Reference is now made to Fig. 201A, which is a simplified table of a database record 17021 of database 17012 of Fig. 200. Turning to Fig. 201A it is seen that database record 17021 of a behavior pattern of toy 17001 describes a motion pattern associated with a particular word given in text-to-speech form. Thus it may be appreciated that based on record 17021 of database 17012, a computer such as computer 17003 of Fig. 199 may instruct toy 17001 to pronounce the word "Abracadabra" in a particular voice and to perform associated motions.

Reference is now made to Fig. 201B, which is a simplified table in the context of Fig. 200 showing a database 17022 of toy mood expressions. Turning to Fig. 201B it is seen that for each toy type and each mood category, database 17022 provides the physical motion characterization appropriate to the toy type in order to express the mood category. Thus, while doll type Marbie expresses excitement by lifting its arms, squirrel type Hansy expresses the same mood by lifting its tail.
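The record and table structures of Figs. 201A and 201B may be sketched, purely by way of illustration, as follows; all names and values here are hypothetical stand-ins for the database contents described above.

```python
# Illustrative behavior-pattern record in the manner of record 17021:
# a word, its text-to-speech form, and an associated motion.
BEHAVIOR_RECORD = {
    "word": "Abracadabra",
    "tts": "ah-brah-kah-DAH-brah",   # hypothetical phonetic description
    "motion": "lift arms",
}

# Analogue of mood expression database 17022: one physical motion
# per (toy type, mood category) pair.
MOOD_EXPRESSIONS = {
    ("doll", "excitement"): "lift arms",
    ("squirrel", "excitement"): "lift tail",
}

def motion_for(toy_type: str, mood: str) -> str:
    """Return the physical motion a toy of the given type uses for a mood."""
    return MOOD_EXPRESSIONS[(toy_type, mood)]
```

Thus the same mood category resolves to different motions for different toy types, as in the Marbie and Hansy example above.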

Reference is now made to Fig. 202, which is a flowchart of the toy personality development functionality of Fig. 199. The first and/or second user requests that toy 17001 and/or toy 17002 say the word "Abracadabra". Computer 17003 extracts spectral features from voice input received via toy 17001 and/or toy 17002. Computer 17003 communicates the extracted features to server 17005 together with the toy ID of the toy or toys in question. Server 17005 retrieves the toy type based on the toy ID received from computer 17003. Based on the retrieved toy type, server 17005 checks whether the extracted features match a word in the language database of the toy type of the toy or toys in question. In the case of toy 17001 (doll), the features match those of the word "Abracadabra" in language database 17013. Server 17005 creates a database record 17021 of the behavior pattern and updates, for example via the Internet, pattern database 17012 of toy 17001 with the record 17021. Toy 17001 therefore pronounces the word "Abracadabra" correctly and performs the associated motion according to database record 17021.

In the case of toy 17002 (squirrel), the extracted features match no word in the language database appropriate to the toy type of toy 17002. Server 17005 converts the features to text based on a phonetic description. The converted text is downloaded to computer 17003, which instructs toy 17002 to verbalize the word. The pattern database of toy 17002 is updated with the new word based on approval of the second user.
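The matching-or-fallback decision of Fig. 202 may be sketched as follows. This is an illustrative simplification in which a plain word string stands in for the extracted spectral features, and all names are hypothetical.

```python
# Hypothetical per-toy-type word lists in the manner of language
# database 17013 (here only the doll type knows "abracadabra").
LANGUAGE_DB = {"doll": {"abracadabra"}}

def develop_pattern(toy_type, features, approved_by_user=False):
    """Match features against the toy type's language database, or fall
    back to a phonetic text conversion pending user approval."""
    word = features.lower()               # stand-in for feature matching
    if word in LANGUAGE_DB.get(toy_type, set()):
        return {"word": word, "source": "language database"}
    if approved_by_user:                  # squirrel case: phonetic fallback
        return {"word": word, "source": "phonetic conversion"}
    return None                           # pattern database left unchanged
```

In this sketch the doll's request resolves through the language database, while the squirrel's word is added only after the second user's approval, mirroring the two branches described above.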

Reference is now made to Fig. 203, which is a flowchart illustration of the toy personality transferring functionality of Fig. 199. The second user or a parent thereof requests to transfer the behavior pattern of toy 17001 into the behavior pattern of toy 17002. This request is typically entered by means of a computer monitor such as monitor 17004 of Fig. 199 and communicated to server 17005, typically via the Internet. Server 17005 requests the toy IDs and passwords of the toys involved. The toy IDs of toys 17001 and 17002 are typically received by computer 17003 by means of typically wireless communication therewith and communicated to server 17005. The passwords of toy 17001 and toy 17002 are typically respectively provided by the first and the second user or a parent thereof by means of a computer keyboard. Computer 17003 communicates the requested passwords to server 17005. Server 17005 verifies the cloning data. Server 17005 checks that the provided passwords match the communicated toy IDs and that the toy IDs match the description of the requested cloning operation, namely doll into squirrel.

Once the cloning data are verified a cloning procedure commences. For each item on pattern database 17012 of toy 17001, server 17005 creates a copy record adapted to the features of toy 17002, and updates the pattern database of toy 17002 with the newly created record. Thus, in the case of record 17021, the server retrieves from mood expression database 17022 the equivalent for toy 17002 of the "lift arms" motion of toy 17001. The resulting pattern record includes the word "Abracadabra" together with the "lift tail" motion. The pattern database of toy 17002 is then updated with this record. The same procedure is repeated for the next item until the whole pattern database of toy 17001 is copied for toy 17002.
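The per-record adaptation loop of Fig. 203 may be sketched as follows, assuming a hypothetical mood-expression table in the manner of Fig. 201B; the reverse lookup from motion to mood is an illustrative simplification.

```python
# Analogue of mood expression database 17022.
MOOD_EXPRESSIONS = {
    ("doll", "excitement"): "lift arms",
    ("squirrel", "excitement"): "lift tail",
}

def clone_patterns(records, target_type):
    """Copy each behavior-pattern record, re-mapping its motion through
    the mood-expression table to suit the target toy type."""
    cloned = []
    for rec in records:
        # Find the mood the source motion expresses, then translate it.
        mood = next(m for (t, m), motion in MOOD_EXPRESSIONS.items()
                    if motion == rec["motion"])
        cloned.append({"word": rec["word"],
                       "motion": MOOD_EXPRESSIONS[(target_type, mood)]})
    return cloned
```

Under this sketch, cloning the doll's "Abracadabra"/"lift arms" record for a squirrel yields an "Abracadabra"/"lift tail" record, as in the procedure described above.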

It is appreciated that pattern database 17012 of toy 17001 may include patterns of behavior which are prima facie incompatible with the persona of toy 17002. Thus, for example, in the course of the personality transferring procedure, server 17005 may send a suggestion to the second user via computer 17003 that the second user might wish to include in the new personality of toy 17002 the singing of the song "I'm a Marbie Girl." The second user or a parent thereof then decides whether to include this particular pattern of behavior in the cloning procedure of Fig. 203.

It is appreciated that the functionality of Figs. 199, 200, 201, 202 and 203 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; and transferring at least a portion of the interactive toy personality to at least one clone.

It is also appreciated that the functionality of Figs. 199, 200, 201, 202 and 203 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein the toy personality is stored in a database.

It is further appreciated that the functionality of Figs. 199, 200, 201, 202 and 203 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein the at least one clone includes a toy.

It is further appreciated that the functionality of Figs. 199, 200, 201, 202 and 203 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; wherein the at least one clone includes a toy; and wherein the toy has a persona which is prima facie incompatible with the toy personality.

It is further appreciated that the functionality of Figs. 199, 200, 201, 202 and 203 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and associating at least one physical feature of the interactive toy with the clone.

An interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 204, which is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 204 it is seen that an interactive toy 17101, approaching a famous tourism site 17102 together with its user, suggests to the user that the user might wish to visit the site. Toy 17101 interacts based on instructions received from a suitable server 17105, with which it communicates by means of wireless communication via antenna 17103 with public wireless communication network 17104, which in turn provides an Internet connection. Server 17105 includes tourism-site database 17106 as well as life history database 17107 of toy 17101. Thus it may be appreciated that server 17105 may provide toy personality development functionality based on the response of the user to the suggestion verbalized via toy 17101.

Reference is now made to Fig. 205, which is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention. Turning to Fig. 205 it is seen that an interactive toy 17111, which may be identical to toy 17101 of Fig. 204, interacts with another interactive toy 17112. Interactive toys 17111 and 17112 interact based on instructions received from computer 17113, which in turn communicates, typically via the Internet, with server 17114, which includes life history database 17115 of toy 17111, which database may be similar in organization to database 17107 of Fig. 204. Thus it may be appreciated that server 17114 may provide toy personality development functionality based on the interaction between toys 17111 and 17112.

Reference is now made to Fig. 206, which is a simplified table of the life history database of Figs. 204 and 205. Turning to Fig. 206 it is seen that database 17121, which may be identical to database 17107 of Fig. 204 or to database 17115 of Fig. 205, lists according to dates the sites that toy 17122 visited and the toys that toy 17122 met. For each toy listed, database 17121 provides both a toy ID and a name reported by the other toy, which may have been given to the other toy by its user. Thus, it is appreciated that database 17121 is based on interaction between toy 17122 and other interactive toys. It is also appreciated that the data registered in database 17121 may be arranged in other ways, such as according to sites in alphabetical order or according to toy IDs of listed toys rather than according to dates.

Reference is now made to Fig. 207A, which is a simplified flowchart of the personality development functionality of Fig. 204. Server 17105 receives a transmission from toy 17101 via antenna 17103 of public wireless communication network 17104. Based on the location of antenna 17103, server 17105 retrieves from tourism database 17106 a list of tourism sites in the location of the user. Server 17105 sends a message to the user via toy 17101 that the user might wish to visit site 17102 together with toy 17101. If computer 17108 at site 17102 reports the presence of toy 17101, server 17105 updates life history database 17107 of toy 17101 with the visit to site 17102.

Reference is now made to Fig. 207B, which is a simplified flowchart of the personality development functionality of Fig. 205. Computer 17113, normally in communication with toy 17111, senses the presence of toy 17112 by means of typically wireless communication therewith, which includes transmission by toy 17112 to computer 17113 of its toy ID. Computer 17113 sends a query to server 17114 inquiring whether toy 17111 has met toy 17112. Server 17114 retrieves the result from life history database 17115 of toy 17111. If the toy ID of toy 17112 does not appear in database 17115, computer 17113 instructs toy 17111 to address toy 17112, requesting to know the name of toy 17112. Based on the response of toy 17112 communicated via computer 17113, server 17114 updates database 17115 with both the toy ID and the name of toy 17112. If the toy ID of toy 17112 does appear in database 17115, server 17114 retrieves the name of toy 17112 therefrom and instructs toy 17111 to address toy 17112 by name.
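The query-and-update flow of Fig. 207B may be sketched as follows; the life history database is represented, purely for illustration, as a mapping from toy IDs to reported names, and all names are hypothetical.

```python
def greet_or_ask(life_history, other_id, reported_name=None):
    """Greet a previously met toy by its stored name; otherwise ask for
    its name and, once a name is reported, record it in the database."""
    if other_id in life_history:
        return f"Hello, {life_history[other_id]}!"
    if reported_name is not None:         # response relayed via the computer
        life_history[other_id] = reported_name
        return f"Nice to meet you, {reported_name}!"
    return "What is your name?"
```

On a later encounter the same toy ID resolves to the stored name, so the toy is addressed by name, as in the flowchart described above.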

It is appreciated that the toy personality development functionality of Figs. 204, 205, 206 and 207 may be used in conjunction with toy personality transferring functionality such as the functionality of Figs. 199, 200, 201, 202 and 203.

It is appreciated that the functionality of Figs. 204, 205, 206 and 207 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein the interactive toy personality includes a toy life history.

It is also appreciated that the functionality of Figs. 204, 205, 206 and 207 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; wherein the interactive toy personality includes a toy life history; and wherein the toy life history is stored in a database.

Reference is now made to Fig. 208, which is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention. Turning to Fig. 208 it is seen that a user interacts with an animated virtual character 17154 on computer monitor 17159, which animated character behaves generally similarly to interactive toy 17151, with which the user normally interacts. Animated character 17154 interacts based on instructions received from computer 17153, which communicates, typically via the Internet, with a suitable interactive toy server 17155. Server 17155 communicates with a multiplicity of computers including computer 17152, which communicates typically wirelessly with toy 17151 and includes a database storing the interactive toy personality thereof. Thus it may be appreciated that at least a portion of the personality of interactive toy 17151 may be cloned to animated character 17154.

Reference is now made to Fig. 209, which is a simplified flowchart of the personality transferring functionality of Fig. 208. A user sends a request to server 17155 via computer 17153 to clone toy 17151 to an animated character. Server 17155 requests an appropriate password, which the user provides by means of a keyboard 17156 of computer 17153. Server 17155 verifies the password and retrieves personality data of toy 17151 from computer 17152, which is normally in wireless communication with toy 17151. Toy 17151 itself may or may not be in communication with computer 17152 or any other computer interconnected therewith in the course of the cloning procedure in question. Alternately, personality data associated with toy 17151 are stored in a database on server 17155, and only private data, such as data personally related to the user, are stored in a private database on computer 17152. In such a case, server 17155 may allow the user to choose between partial cloning, where only that portion of the personality of toy 17151 which is stored on server 17155 is transferred to a clone, and complete cloning, where the entire personality of toy 17151 including the portion stored on computer 17152 is transferred to the clone. Once the required personality data of toy 17151 are retrieved, server 17155 downloads to computer 17153 graphical software, which allows display of animated character 17154 on monitor 17159. The user then verbally interacts with animated character 17154 via microphone 17157 and speaker 17158. Based on the cloning option selected by the user, animated character 17154 behaves in a way generally similar to that of toy 17151. Thus, for example, if toy 17151 normally responds to the user if called by a specific name, then animated character 17154 responds in a similar way on being called by the same name. Toy 17151 may be kept in a dormant mode while the user interacts with animated character 17154.
Alternately, toy 17151 may regularly interact with one or more other users independently of the interaction between the first user and animated character 17154.

It is appreciated that the functionality of Figs. 208 and 209 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein the at least one clone includes an animated virtual character.

An interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention is now described.

Reference is now made to Figs. 210A and 210B, which are simplified partly pictorial partly schematic illustrations of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 210A it is seen that a first user interacts with an interactive toy 17201, which is a clone of another interactive toy 17202, teaching toy 17201 a new song. When later a second user interacts with toy 17202, requesting that toy 17202 sing a song, toy 17202 sings the same song that the first user previously taught toy 17201. In the illustrated embodiment, toys 17201 and 17202 interact based on instructions received from computer 17203, which provides access to a single database of toy personality. Thus it may be appreciated that the personality of toys 17201 and 17202 develops generally identically, so that toy 17202 may sing the same song toy 17201 is used to singing.

Turning to Fig. 210B it is seen that a first user interacts with an interactive toy 17211, which is a clone of another interactive toy 17212, teaching toy 17211 a new song. When later a second user interacts with toy 17212, requesting that toy 17212 sing a song, toy 17212 suggests to the second user that the second user might wish to listen to a song about football, which the second user previously reported to toy 17212 to be the favorite game of the second user. In the illustrated embodiment, toys 17211 and 17212 interact based on instructions received from computer 17213, which provides access to two separate databases of toy personality. Thus it may be appreciated that the personality of toy 17211 develops at least partly independently of the personality of toy 17212, of which toy 17211 is a clone, so that toy 17212 may suggest singing a song different from the song toy 17211 is used to singing.

Reference is now made to Figs. 211A and 211B, which are block diagrams respectively illustrating the toy personality storage and development functionality of Figs. 210A and 210B. Turning to Fig. 211A, it is seen that personality database 17221, which is initially the personality database of toy 17202 of Fig. 210A, receives updates based on both the interaction between toy 17201 and a user and the interaction between toy 17202 and a user. At the same time, the way both toy 17201 and toy 17202 interact is based on the personality data stored in database 17221. Turning to Fig. 211B, it is seen that personality database 17223 of toy 17211 of Fig. 210B is initially created as a copy of personality database 17222 of toy 17212. Later, however, databases 17222 and 17223 receive updates respectively from the interaction between toy 17212 and a user and the interaction between toy 17211 and a user. Thus, databases 17222 and 17223 receive updates at least partially independently of one another.

It is appreciated that the functionality of Figs. 210A and 211A taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein following the transferring, the interactive toy personality continues to develop generally identically both in the interactive toy and in the clone.

It is appreciated that the functionality of Figs. 210B and 211B taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein following the transferring, the interactive toy personality continues to develop at least partially independently both in the interactive toy and in the clone.

Reference is now made to Fig. 211C, which is a simplified block diagram illustration in the context of Figs. 210A, 210B, 211A and 211B showing personality development and personality transferring functionality in accordance with another preferred embodiment of the present invention. Turning to Fig. 211C it is seen that personality databases 17221 and 17222, respectively of toys 17211 and 17212 of Fig. 210B, which databases develop independently of one another, are both cloned to a single personality database 17224 of a clone comprising a third interactive toy 17214 not shown in Fig. 210B. Thus, it may be appreciated that the toy personality of toy 17214 is based on the toy personality of both toy 17211 and toy 17212.

Reference is now made to Fig. 212, which is a simplified flowchart of the toy personality transferring functionality of Fig. 211C. As seen in Fig. 212, toy personality database 17221 of toy 17211 is initially cloned to toy personality database 17224 of toy 17214. Then, for any item on the behavior pattern database of toy personality database 17222 of toy 17212, the item in question is cloned into the behavior pattern database of toy personality database 17224 of toy 17214 only if the item is not already included therein. Then, for any item on the life history database of toy personality database 17222 of toy 17212, the item in question is cloned into the life history database of toy personality database 17224 of toy 17214 only if the item is not incompatible with another item already included therein. Thus, if the life history database of database 17222 includes an item "June 1, Paris", and the life history database of database 17221 includes an item "June 1, Rome", then the item "June 1, Paris" is not cloned into the life history database of database 17224.
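The merge order of Fig. 212 may be sketched as follows for the life history portion, assuming for illustration that two items are incompatible when they record different places for the same date; all names are hypothetical.

```python
def merge_life_histories(first, second):
    """Clone the first toy's life history wholesale, then add items from
    the second toy's history only when no entry for that date exists."""
    merged = dict(first)                  # initial wholesale clone
    for date, place in second.items():
        if date not in merged:            # skip incompatible same-date items
            merged[date] = place
    return merged
```

Under this sketch, a "June 1, Rome" item cloned first blocks a later "June 1, Paris" item, while items for other dates are merged in, as in the example above.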

It is appreciated that the functionality of Figs. 211C and 212 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and wherein the interactive toy personality incorporates features based on multiple toy life histories.

An interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 213, which is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing a toy cloning functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 213 it is seen that a first user normally interacting with an interactive toy 17301 and a second user normally interacting with an interactive toy 17302 decide to exchange between them the toys 17301 and 17302 with which they respectively interact. As seen in Fig. 213, the first user then interacts with toy 17302, which addresses the first user by his name, and the second user interacts with toy 17301, which addresses the second user by his name. In the illustrated embodiment, the toys 17301 and 17302 interact based on instructions respectively received from computers 17304 and 17303, which computers communicate typically via the Internet with a suitable toy server 17305. And while the users exchange toys 17301 and 17302 between them, toys 17301 and 17302 may exchange their respective toy personality databases between them. Thus it may be appreciated that toy 17301, which normally interacts with the first user, may personally interact with the second user, and toy 17302, which normally interacts with the second user, may personally interact with the first user.

Reference is now made to Fig. 214, which is a simplified flowchart of the personality transferring functionality of Fig. 213. Server 17305 receives via computer 17303 a request to exchange toys 17301 and 17302 between their respective users. Computer 17303 communicates to server 17305 the toy IDs of toys 17301 and 17302 and the corresponding passwords provided by their respective users. If the passwords are correct, server 17305 retrieves from computer 17303 database 17306 of toy 17301 and downloads it to computer 17304. Server 17305 then retrieves from computer 17304 database 17307 of toy 17302 and downloads it to computer 17303. This completes the cloning procedure. When computer 17303 senses the presence of toy 17302 by means of typically wireless communication therewith, computer 17303 provides to toy 17302 content input based on database 17307, which content input is therefore appropriate to toy 17302 and the first user. When computer 17304 senses the presence of toy 17301 by means of typically wireless communication therewith, computer 17304 provides to toy 17301 content input based on database 17306, which content input is therefore appropriate to toy 17301 and the second user.

It is appreciated that the functionality of Figs. 213 and 214 taken together is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone; and also comprising transferring at least one clone personality from the at least one clone to the interactive toy.

An interactive toy environment comprising a plurality of interactive toys interconnected via a network providing toy personality functionality is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 215, which is a simplified partly pictorial partly schematic illustration of an interactive toy environment comprising a plurality of interactive toys interconnected via a network providing toy personality functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 215 it is seen that a first user, who is the user normally interacting with an interactive toy 17501, verbally interacts with toy 17501, requesting that toy 17501 entertain other users who are not normally in communication with toy 17501. When the other users in question arrive and communicate with toy 17501, toy 17501 interacts with the other users in a way which is different from the way toy 17501 interacts with the first user. In the illustrated embodiment toy 17501 interacts based on instructions received from a computer 17502, which is in communication, typically via the Internet, with a suitable toy server 17503. Both computer 17502 and server 17503 may provide a plurality of selectable databases for storing toy personality for toy 17501. Thus, it may be appreciated that toy 17501 may interact with different users exhibiting a plurality of different toy personalities.

Reference is now made to Fig. 216, which is a simplified flowchart of the selectable personality exhibiting mechanism functionality of Fig. 215. Computer 17502 receives the user voice via toy 17501. If computer 17502 recognizes, for example by means of a voiceprint, the voice of the user normally in interaction with toy 17501, computer 17502 provides content input to toy 17501 based on toy personality database 17504 (not shown in Fig. 215) and stores information gained in the course of the interaction with the user in database 17504. Otherwise, computer 17502 provides content input to toy 17501 based on toy personality database 17505 (not shown in Fig. 215) and stores information gained in the course of the interaction with the user in database 17505. Databases 17504 and 17505 may be stored on computer 17502 or on server 17503.

It is appreciated that the decision functionality of Fig. 216 as to which personality database to employ at a given time may be based on recognizing at least one of a predetermined list of user voices rather than a single user voice.

It is also appreciated that the decision functionality of Fig. 216 as to which personality database to employ at a given time may be based on recognizing at least one of a predetermined list of words and/or phrases rather than one of a predetermined list of user voices.

It is further appreciated that the decision functionality of Fig. 216 as to which personality database to employ at a given time may be based on recognizing at least one of a predetermined list of toys, for example by means of toy ID, rather than users.
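The decision functionality of Fig. 216 and its variants described above may be sketched as follows; the database labels "primary" and "guest" and all parameter names are hypothetical stand-ins for databases 17504 and 17505 and the recognition inputs.

```python
def select_database(known_voices, voiceprint=None, phrase=None,
                    toy_id=None, known_phrases=(), known_toys=()):
    """Choose the personality database by recognizing a voiceprint,
    a listed word or phrase, or a listed toy ID."""
    if voiceprint in known_voices:
        return "primary"                  # e.g. database 17504
    if phrase in known_phrases or toy_id in known_toys:
        return "primary"
    return "guest"                        # e.g. database 17505
```

Any of the three recognition routes selects the primary database; otherwise the guest database is used, so the two personalities develop independently of one another.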

It is appreciated that the functionality of Figs. 215 and 216 allows for the toy personality of toy 17501, which personality develops in the course of interaction between toy 17501 and the user normally interacting therewith, to develop independently of interactions between toy 17501 and other users.

It is appreciated that the functionality of Figs. 215 and 216 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing toy personality functionality comprising: developing a plurality of interactive toy personalities based on interactions between an interactive toy and at least one of another interactive toy and a user; and endowing at least one interactive toy with at least two of the plurality of interactive toy personalities and with a mechanism for exhibiting at least one selectable personality at a given time.

It is appreciated that the functionality of Figs. 215 and 216 is particularly appropriate to an interactive toy environment comprising a plurality of interactive toys interconnected via a network and providing toy personality functionality comprising: developing a plurality of interactive toy personalities based on interactions between an interactive toy and at least one of another interactive toy and a user; endowing at least one interactive toy with at least two of the plurality of interactive toy personalities and with a mechanism for exhibiting at least one selectable personality at a given time; and wherein the at least one interactive toy exhibits at least one selectable personality in accordance with a toy-perceived personality of a corresponding user.

An interactive toy system comprising user assistance functionality is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 217, which is a simplified partly pictorial partly schematic illustration of an interactive toy system comprising user assistance functionality including tooth brushing functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 217 it is seen that an interactive toy 18001 comprising a fanciful toothbrush 18003 demonstrates a tooth brushing motion to a user. In the illustrated embodiment, toy 18001 interacts with the user based on instructions received from a computer 18002, which provides content input to toy 18001 by means of typically wireless communication therewith. Thus it may be appreciated that toy 18001 may assist the user in tooth brushing based on verbal input by the user received via toy 18001 as well as a diffused infrared signal emitted by transmitter 18005 at a bathroom 18006 and received via infrared receptor 18004 on toy 18001.

Reference is now made to Fig. 218, which is a simplified flowchart of the user assistance functionality of Fig. 217. Computer 18002 initiates tooth brushing functionality by instructing toy 18001 to suggest to the user that the user should brush his/her teeth. If the user does not agree, toy 18001 suggests to the user that the user might wish to go to the bathroom together with toy 18001 and later listen to a bedtime story by toy 18001. The presence of the user and toy 18001 at bathroom 18006 may be sensed by means of a diffused infrared signal emitted by transmitter 18005 at bathroom 18006, which signal is received by an infrared receptor 18004 on toy 18001. If an infrared receptor signal via receptor 18004 on toy 18001 is not received by computer 18002 within a predetermined time lapse, such as 1 minute, computer 18002 instructs toy 18001 to urge the user to take toy 18001 to bathroom 18006. If the receptor signal is still not received, this procedure may repeat itself for a predetermined number of times. Upon reception of the infrared receptor signal via receptor 18004 on toy 18001, computer 18002 instructs toy 18001 to perform a fanciful tooth brushing motion and explain to the user how to brush his/her teeth.
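The wait-and-urge cycle described above can be sketched in Python. This is a minimal illustration, not the patent's implementation; `signal_received` and the string outcomes are hypothetical stand-ins for the actual IR-receptor wait and toy actions.

```python
def tooth_brushing_session(signal_received, max_retries=3):
    # Hypothetical sketch of the Fig. 218 flow: wait for the bathroom's
    # diffused-IR signal; if it is not received within the time lapse,
    # urge the user again, up to a predetermined number of retries.
    # signal_received(attempt) stands in for a timed wait on receptor 18004.
    for attempt in range(max_retries + 1):
        if signal_received(attempt):
            return "demonstrate_brushing"  # toy performs the brushing motion
        # here the toy would urge the user to take it to the bathroom
    return "give_up"  # retries exhausted without an IR signal

# Simulated run: the IR signal arrives on the second attempt.
outcome = tooth_brushing_session(lambda attempt: attempt == 1)
```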

It is appreciated that the functionality of Figs. 217 and 218 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including user assistance functionality providing an output to the user which assists the user in user functioning.

It is also appreciated that the functionality of Figs. 217 and 218 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including at least partially verbal user assistance functionality providing an output to the user which assists the user in user functioning.

It is also appreciated that the functionality of Figs. 217 and 218 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including at least partially mechanical user assistance functionality providing an output to the user which assists the user in user functioning.

It is also appreciated that the functionality of Figs. 217 and 218 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including user assistance functionality providing an output to the user which assists the user in user functioning; and wherein the interactive toy functionality includes tooth brushing functionality.

It is also appreciated that the functionality of Figs. 217 and 218 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including user assistance functionality providing an output to the user which assists the user in user functioning; and wherein the user assistance functionality is at least partially visible.

An interactive toy system comprising user assistance functionality in accordance with another preferred embodiment of the present invention is now described.

Reference is now made to Fig. 219, which is a simplified partly pictorial partly schematic illustration of an interactive toy system comprising user assistance functionality including a guide functionality in accordance with a preferred embodiment of the present invention. As shown in Fig. 219, a blind user carrying interactive toy 18101 encounters an obstacle 18103 on route-segment 18107. Toy 18101 detects obstacle 18103 by means of obstacle detector 18109, which may be an infrared-based obstacle detector, and provides an audio warning message to the user. As further seen in Fig. 219, another interactive toy 18102 suggests to its user that the user should take an alternative route turning to route-segment 18108 in order to avoid obstacle 18103 on route-segment 18107. In the illustrated embodiment, interactive toy 18102 interacts based on instructions received from server 18106, with which it communicates by means of wireless communication via public wireless communication network antenna 18104, which provides connection to Internet 18105. Server 18106 also communicates with toy 18101, which typically includes a location-tracking device such as a GPS device. Thus it may be appreciated that server 18106 may instruct toy 18102 to suggest to its user an alternative route based on hazard information received from toy 18101.

Reference is now made to Fig. 220A, which is a simplified table in the context of Fig. 219 showing a destination database 18111 for a particular blind user such as the user of toy 18101 of Fig. 219. Destination database 18111 may be stored on a personal computer of the user and/or on a suitable server such as server 18106 of Fig. 219. Turning to Fig. 220A it is seen that for each keyword designating a user-desired destination, database 18111 provides a suitable route definition. In the illustrated embodiment, a route-definition R is provided in the form of a series of route-segments (Xi,Yi) where Xi and Yi represent coordinates of a location on a computerized map of a traffic area. A route-segment may be a segment along a straight line, a portion of a street, a series of stops along a public transport line and/or any other single motion segment that may be traversed by a blind user without the user requiring instruction as to turns, changes and the like. A route segment (Xi,Yi) is typically defined as a possible single motion segment leading from the location whose coordinates are Xi-1,Yi-1 to the location whose coordinates are Xi,Yi. Coordinates X0,Y0 are typically defined as those of the home of a user.
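The keyword-to-route mapping of database 18111 can be sketched as a simple lookup table. The keywords, coordinates, and the `route_for` helper below are hypothetical illustrations of the structure just described, not data from the patent.

```python
# Hypothetical sketch of destination database 18111.  Each keyword maps
# to a route definition: a list of route-segment endpoints (Xi, Yi).
# (X0, Y0), the user's home, is the implicit start of every route.
HOME = (0, 0)

destination_db = {
    "library": [(0, 3), (4, 3)],  # two single-motion segments
    "grocery": [(2, 0)],          # one straight-line segment
}

def route_for(keyword):
    """Return the full coordinate sequence, home included, or None."""
    segments = destination_db.get(keyword)
    return None if segments is None else [HOME] + segments
```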

Reference is now made to Fig. 220B, which is a simplified table in the context of Fig. 219 showing a route-definition database 18112. Database 18112 is typically stored on a suitable server such as server 18106 of Fig. 219. Turning to Fig. 220B it is seen that for any two locations in a given traffic area respectively represented by coordinates Xi,Yi and Xj,Yj, database 18112 provides a route-definition Rij typically comprising a series of route segments as shown in Fig. 220A. Preferably, route-definition database 18112 provides routes avoiding obstacles, which may pose a hazard to a blind user. Preferably, database 18112 is continuously updated based on newly reported obstacles.

Reference is now made to Fig. 220C, which is a simplified table in the context of Fig. 219 showing a multiple user guiding database 18113. Database 18113 is typically stored on a suitable server such as server 18106 of Fig. 219. Turning to Fig. 220C it is seen that for each user currently being guided by an interactive toy, database 18113 provides a toy ID, the route-definition of the route currently traversed, and the current location of the user.

Reference is now made to Fig. 221, which is a simplified flowchart in the context of Fig. 219 showing route definition functionality of a computerized guide system for a blind user. Toy 18101 at a home of a user is in typically wireless communication with a personal computer 18120, which in turn communicates typically via Internet 18105 with server 18106. Computer 18120 typically includes a destination database such as destination database 18111 of Fig. 220A. Toy 18101 receives a verbal input from the user. Computer 18120 is operative to perform speech recognition based on a list of keywords designating desired destinations of the user. If computer 18120 recognizes one of the keywords of destination database 18111, computer 18120 communicates to server 18106 the proposed route definition corresponding to the user-desired destination as provided by database 18111. If the proposed route includes newly reported obstacles, server 18106 retrieves an alternative route from route-definition database 18112 based on the location of the user's home and the desired destination. Server 18106 communicates the alternative route to computer 18120, which in turn updates destination database 18111 with the alternative route and instructs toy 18101 to inform the user thereof.

If the user's verbal input includes no keyword from database 18111, computer 18120 communicates to server 18106 the user's route input. Server 18106 may send a message to be verbalized to the user via toy 18101, requesting that the user clarify and/or confirm the route input, for example, by verbally providing an exact address. Server 18106 then retrieves the coordinates of the destination desired by the user from a suitable traffic database. Based on the coordinates of the user's home and those of the desired destination, server 18106 retrieves a route definition from database 18112. Server 18106 then instructs computer 18120 to verbalize a message to the user via toy 18101 informing the user of the proposed route.
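The two-tier route definition of Fig. 221 — try the local keyword database first, then fall back to the server — can be sketched like this. `server_lookup` is a hypothetical stand-in for server 18106's traffic-database query.

```python
def define_route(utterance, destination_db, server_lookup):
    # Hypothetical sketch of the Fig. 221 decision: if a destination
    # keyword is recognized in the user's utterance, use the locally
    # stored route; otherwise defer the route retrieval to the server.
    for keyword, route in destination_db.items():
        if keyword in utterance:
            return ("local", route)
    return ("server", server_lookup(utterance))

db = {"library": [(0, 3), (4, 3)]}
source, route = define_route("take me to the library", db, lambda text: None)
```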

Computer 18120 communicates to server 18106 a user confirmation of a proposed route, which confirmation is preferably received by means of verbal input via toy 18101. Server 18106 then updates guiding database 18113 with a new guiding task for the user, registering the toy ID of toy 18101, the route definition and the current location of the user, which is initially that of the home of the user.

Returning to Fig. 219, it is seen that toy 18101 comprising a GPS device is in communication with server 18106 via public wireless communication network antenna 18104 and Internet 18105. It may therefore be appreciated that server 18106 is operative to continuously update guiding database 18113 with the user's current location based on GPS device readings received from the toy 18101. It may also be appreciated that the route definition procedure of Fig. 221 may be performed while receiving user route-input from a location other than the home of the user. In such a case, a route definition is retrieved from database 18112 based on the user's current location tracked via the GPS device on toy 18101 and a desired destination received from the user, typically by means of verbal input via the toy 18101.

Reference is now made to Fig. 222, which is a simplified flowchart in the context of Fig. 219 showing the audio warning functionality and the hazard information providing functionality of the system of Fig. 219. While performing a guiding task for its user, toy 18101 detects obstacle 18103, for example, by means of infrared-based obstacle detector 18109. Based on the detector 18109 signal, toy 18101 provides an immediate audio warning for the user. Toy 18101 is typically operative to provide such an audio warning even in a case where wireless communication is temporarily or permanently lost.

Toy 18101 communicates to server 18106 the presence of an unreported obstacle 18103 on route-segment 18107. Server 18106 sends a message to the user suggesting that the user might wish to receive help. Based on the user's response received via the toy 18101, the user may receive help by means of human intervention.

Server 18106 updates route-definition database 18112 based on the newly reported obstacle 18103. This may result in changing one or more route-definitions for future selection of routes. At the same time, server 18106 checks current task database 18113 for users currently traversing a route comprising route-segment 18107 where obstacle 18103 has been reported. The search retrieves users such as the user of toy 18102. Server 18106 retrieves the current location of the user of toy 18102 from database 18113 and/or by means of GPS reading from toy 18102. Based on the current location and the desired destination of the user of toy 18102, server 18106 retrieves from database 18112 an alternative route for the user that avoids obstacle 18103. Server 18106 sends a message to be verbalized for the user via toy 18102 informing the user of the newly reported obstacle and suggesting that the user should take the alternative route.
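The server-side propagation of an obstacle report to other guided users can be sketched as follows. The `guiding_db` shape and the `reroute` callback are hypothetical; `reroute` stands in for a fresh database 18112 lookup toward the same destination.

```python
def report_obstacle(segment, guiding_db, reroute):
    # Hypothetical sketch of the Fig. 222 server-side handling: after an
    # obstacle is reported on a route-segment, reroute every guided user
    # whose current route still contains that segment.  guiding_db maps
    # toy ID -> {"route": [segment endpoints...], "location": (x, y)}.
    affected = []
    for toy_id, task in guiding_db.items():
        if segment in task["route"]:
            task["route"] = reroute(task["location"], task["route"][-1])
            affected.append(toy_id)  # this toy's user gets a verbal message
    return affected
```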

It is appreciated that the functionality of Figs. 219, 220A, 220B, 220C, 221 and 222 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including at least partially verbal user assistance functionality providing an output to the user which assists the user in user functioning; and wherein the at least one interactive toy is connected to a computer network.

It is also appreciated that the functionality of Figs. 219, 220A, 220B, 220C, 221 and 222 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including at least partially verbal user assistance functionality providing an output to the user which assists the user in user functioning; and wherein the user assistance functionality includes a guide dog functionality.

An interactive toy system comprising "point to object - name of object" functionality is now described, in accordance with a preferred embodiment of the present invention.

Reference is now made to Fig. 223, which is a simplified flowchart of a toy system comprising "point to object - name of object" functionality. Turning to Fig. 223 it is seen that a user selects a language to be taught by toy 18513. Selection is performed via the keyboard of computer 18510 or via a vocal command to toy 18513. After selection, computer 18510 downloads from toy server 18511 a phonetic dictionary of the selected language, comprising pronunciations of words. Toy 18513 is equipped with a video camera 18512 positioned on a pointing hand of toy 18513, thus transmitting to computer 18510 via toy 18513 images of objects whereto the hand of toy 18513 points. It is appreciated that the hand of the toy is movable. It is further appreciated that the hand is equipped with a standard distance-measuring device enabling toy 18513 to point toward objects located at a defined distance therefrom. For example, toy 18513 may move its hand randomly until a defined distance from an object is detected. Toy 18513 sends to computer 18510 digital images created by video camera 18512. Computer 18510 applies object recognition techniques known in the art to the images. If an object is recognized in an image, computer 18510 retrieves from the dictionary the pronunciation of the name of the object in the language selected and sends it to toy 18513. Toy 18513 verbalizes the name of the object in the selected language. It is appreciated that the object recognition techniques are enhanced with standard optical character recognition techniques, enabling computer 18510 to recognize letters and symbols. Computer 18510 is thus further provided with a dictionary of commercial symbols and names enabling computer 18510 to recognize objects normally associated with the commercial symbols and names. It is further appreciated that the illustrated teaching functionality operates in conjunction with other toy activities, so that a toy occasionally verbalizes a name of an object during the other activities.
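The recognize-then-look-up step of the point-to-object flow can be sketched in Python. The `recognize` callback and the sample dictionary entries are hypothetical placeholders for the object-recognition routine and phonetic dictionary the text describes.

```python
def name_pointed_object(image, recognize, dictionary):
    # Hypothetical sketch of the Fig. 223 flow: recognize the object in
    # the camera image, then look up its name's pronunciation in the
    # phonetic dictionary of the selected language.  recognize stands in
    # for an off-the-shelf object-recognition routine.
    label = recognize(image)
    if label is None:
        return None  # nothing recognized in this frame
    return dictionary.get(label)  # pronunciation entry, or None if absent

italian = {"dog": "cane", "ball": "palla"}  # toy example dictionary
word = name_pointed_object("frame-001", lambda img: "ball", italian)
```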

In accordance with another preferred embodiment of the present invention, a toy points to objects displayed on a computer monitor. The computer determines the display position whereto the toy points, utilizing standard techniques such as those applied in light pens, and sends the toy the name of the object displayed thereat.

It is appreciated that the functionality of Fig. 223 is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning.

It is also appreciated that the functionality of Fig. 223 is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; and wherein the teaching functionality includes a point to object - name of object teaching functionality.

It is further appreciated that the functionality of Fig. 223 is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; wherein the teaching functionality includes a point to object - name of object teaching functionality; and wherein the point to object - name of object teaching functionality includes selectable language functionality.

An interactive toy system comprising both verbal and body language teaching functionality for teaching a foreign language is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 224, which is a simplified pictorial illustration of an interactive toy system comprising both verbal and body language teaching functionality for teaching a foreign language in accordance with a preferred embodiment of the present invention. Turning to Fig. 224 it is seen that toy 18550 utilizes both body and verbal language in teaching Italian to a user. Toy 18550 retrieves from computer 18551 a word in Italian, namely "Volare", to be taught to the user, and a body gesture or a sequence of body gestures symbolizing the meaning of the word, namely waving of hands as symbolizing "flying" or "to fly". The user guesses the meaning of the Italian word. Toy 18550 picks up the user utterance and sends it to computer 18551. If the utterance contains the correct meaning of the Italian word, toy 18550 confirms in English and retrieves the next word. Otherwise toy 18550 retrieves from the computer another body gesture symbolizing the meaning of the Italian word. After a defined number of failed user attempts, toy 18550 verbalizes the correct meaning of the word in English. It is appreciated that toy 18550 also delivers verbal hints and guiding comments to the user.
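The guess-and-gesture round described above can be sketched as a loop over user utterances. The function, its parameters, and the example guesses are hypothetical illustrations of the flow, not the patent's implementation.

```python
def gesture_word_round(meaning, guesses, max_attempts=3):
    # Hypothetical sketch of a Fig. 224 round: the toy gestures while the
    # user guesses the meaning of the foreign word; after a defined number
    # of failed attempts the toy reveals the meaning in English.
    for attempt, guess in enumerate(guesses[:max_attempts], start=1):
        # the toy would perform the next body gesture here
        if meaning in guess:
            return ("correct", attempt)  # toy confirms in English
    return ("revealed", meaning)  # toy verbalizes the correct meaning
```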

It is appreciated that the functionality of Fig. 224 is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; and wherein the teaching functionality includes both verbal and body language teaching functionality for teaching a foreign language.

An interactive toy system comprising different home and school environment teaching functionalities is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 225, which is a simplified flowchart of a school environment teaching functionality wherein an interactive toy acts as a fellow student to a user in accordance with a preferred embodiment of the present invention. Turning to Fig. 225 it is seen that a plurality of toys, including toys 18610 and 18611, are in communication with a classroom computer 18612. During a lesson the teacher pronounces a word or a phrase defined in advance by the teacher as a mark for a toy to insert a question. A first toy 18610 picks up the keyword and sends it to computer 18612. Computer 18612 identifies the keyword and sends toy 18610 a question prepared in advance by the teacher. It is appreciated that a teacher may utilize such toy-questions as a means to advance the lesson, for example by thought-provoking questions arising from the material taught. A teacher may also apply such toy-questions as a means to raise questions that students might be reluctant to ask. Toy 18610 verbalizes the question. If, within a defined period of time such as 10 seconds, no speech is picked up by a toy, computer 18612 sends a second toy 18611 an answer to the question, the answer prepared in advance by the teacher.
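The trigger-phrase matching at the heart of the classroom flow can be sketched as a lookup against teacher-prepared material. The `lesson_plan` structure and the returned dictionary keys are hypothetical illustrations.

```python
def classroom_step(heard_phrase, lesson_plan):
    # Hypothetical sketch of Fig. 225: a teacher-defined trigger phrase
    # is matched against questions prepared in advance.  lesson_plan maps
    # trigger phrase -> (question, fallback_answer); the fallback is what
    # the second toy verbalizes if no student answers within the time limit.
    entry = lesson_plan.get(heard_phrase)
    if entry is None:
        return None  # not a trigger phrase; the toys stay silent
    question, fallback = entry
    return {"first_toy_asks": question, "second_toy_answers": fallback}
```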

Reference is now made to Fig. 226, which is a simplified flowchart in the context of Fig. 225 showing a home environment teaching functionality of a toy. Classroom computer 18612 sends interactive scripts related to a lesson to the computers of students in the class, including home computer 18613. Computer 18613 receives an acknowledgement signal from toy 18610, indicating that the toy is currently in the home environment. Computer 18613 performs an interactive script related to the lesson. Computer 18613 sends toy 18610 a question related to the lesson. Toy 18610 verbalizes the question. It is appreciated that the question may be phrased as a request of the toy that a user explain or remind the toy of something. It is also appreciated that the question may be phrased so that computer 18613 can determine whether the user answered correctly, for example by relating to a specific detail in the lesson. It is further appreciated that if the user does not answer correctly, the toy may initiate communication with another toy user who was present in class in order that both users find an answer together. In such a case the toys mediate communication between the users.

It is appreciated that the functionality of Figs. 225 and 226 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; and wherein the teaching functionality includes different home and school environment teaching functionalities.

It is also appreciated that the functionality of Figs. 225 and 226 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; wherein the teaching functionality includes different home and school environment teaching functionalities; and wherein the school environment teaching functionality interactively involves at least a plurality of interactive toys.

It is further appreciated that the functionality of Figs. 225 and 226 taken together is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; and wherein the teaching functionality includes fellow student functionality wherein the at least one interactive toy acts as a fellow student to a user.

An interactive toy system comprising teaching functionality actuable by an event in a non-teaching functionality of a toy is now described, in accordance with a preferred embodiment of the present invention. Reference is now made to Fig. 227, which is a simplified flowchart of an interactive toy system comprising teaching functionality actuable by an event in a non-teaching functionality of a toy in accordance with a preferred embodiment of the present invention. Turning to Fig. 227 it is seen that a user purchases chocolate via toy 18700. The purchase may be effected via a web server when a user browses the web via toy 18700 as illustrated hereinabove. It is appreciated that such a purchase may also be effected via a toy in a physical commercial institute, utilizing a value of a user. Computer 18701, which is in communication with toy 18700 and is typically a home computer, a computer in a commercial institute or a toy server, detects that the keyword "chocolate" appears in a database 18702 comprising popular teaching subjects, the database stored on toy server 18703. Computer 18701 sends toy 18700 a message containing a suggestion to the user. Toy 18700 verbalizes the message, suggesting to teach the user about chocolate. If the user agrees, computer 18701 downloads from toy server 18703 an interactive toy lesson about chocolate. Toy 18700 actuates the lesson.
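The event-to-lesson trigger described above reduces to a keyword check against the teaching-subjects database. The function name and message wording below are hypothetical; downloading and actuating the lesson itself is left to the toy server.

```python
def suggest_lesson(purchase_item, teaching_subjects):
    # Hypothetical sketch of Fig. 227: a keyword from a non-teaching event
    # (here, a purchase) is checked against the popular-teaching-subjects
    # database 18702; a match yields a verbal lesson suggestion for the toy.
    if purchase_item in teaching_subjects:
        return f"Would you like to learn about {purchase_item}?"
    return None  # no lesson is suggested for this item
```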

It is appreciated that the functionality of Fig. 227 is particularly appropriate to an interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at the at least one interactive toy and including teaching functionality providing a teaching output to the user which assists the user in learning; and wherein the teaching functionality is automatically actuable by an event in a non-teaching functionality of the at least one interactive toy.

Figs. 228 and 229 are simplified schematic illustrations of screen displays of an interactive toy system providing language teaching functionality in accordance with a preferred embodiment of the present invention.

Figs. 230A-232 are simplified flowcharts of language teaching functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

An interactive toy system comprising a user behavior corrective functionality is now described, in accordance with a preferred embodiment of the present invention.

Figs. 233-236 are simplified partly pictorial partly schematic illustrations of an interactive toy system providing behavior corrective functionality in accordance with a preferred embodiment of the present invention.

Turning to Fig. 233, it is seen that an interactive toy informs its user that the toy has sensed that the user has smoked a cigarette for the third time within a defined time lapse. It is appreciated that the behavior corrective functionality of Fig. 233 involves pre-acquired knowledge of a characteristic of the user obtained via the toy. It is also appreciated that the behavior corrective functionality of Fig. 233 involves currently acquired knowledge of a characteristic of the user obtained via the toy.

Turning to Fig. 234, it is seen that an interactive toy suggests to a user that the user could express himself in a more appropriate language. It is appreciated that the functionality of Fig. 234 includes sensing functionality for sensing the language appropriateness of the user.

Turning to Fig. 236, it is seen that an interactive toy informs its user that the blood pressure of the user is too low and suggests to the user that the toy should call a doctor. It is appreciated that the functionality of Fig. 236 includes sensing functionality for the blood pressure of the user.

Figs. 237 and 238 are simplified flowcharts of toy learning functionality of the system of Figs. 233 to 236.

Figs. 239-241 are simplified block diagram illustrations of test group formation functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention.

Fig. 242 is a simplified flowchart of information obtaining functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention.

Figs. 243 and 244 are simplified block diagram illustrations of information utilization functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention.

Figs. 245 and 246 are simplified flowcharts of information obtaining functionality of an interactive toy system providing a methodology for obtaining and utilizing information in accordance with a preferred embodiment of the present invention.

Fig. 247 is a simplified partly pictorial partly schematic illustration of an interactive toy scheduling system in accordance with a preferred embodiment of the present invention.

Figs. 248A-250 are simplified flowcharts of the scheduling functionality of the system of Fig. 247.

Fig. 251 is a simplified schematic illustration of an interactive toy web browser system in accordance with a preferred embodiment of the present invention.

Figs. 252-255 are simplified flowcharts of the web-browsing functionality of the system of Fig. 251.

Fig. 256 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing teaching functionality in accordance with a preferred embodiment of the present invention.

Fig. 257 is a simplified flowchart of the teaching functionality of the system of Fig. 256.

Fig. 258 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing telephone inquiry functionality in accordance with a preferred embodiment of the present invention.

Figs. 259A-260 are simplified flowcharts of the dialer functionality of the system of Fig. 258.

Fig. 261 is a simplified partly pictorial partly schematic illustration of information retrieval functionality of the system of Fig. 258.

Figs. 262 and 263 are simplified flowcharts of computer equipment upgrade functionality provided by an interactive toy in accordance with a preferred embodiment of the present invention.

An interactive toy system comprising a user-specific, event-driven musical output functionality is now described, in accordance with a preferred embodiment of the present invention.

Fig. 264 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing musical output functionality in accordance with a preferred embodiment of the present invention. Turning to Fig. 264 it is seen that a plurality of interactive toys cooperate to provide various coordinated parts in a musical output. It is appreciated that the interactive toys of Fig. 264 may have at least one known persona.

Fig. 265 is a simplified partly pictorial partly schematic illustration of the noise control functionality of the system of Fig. 264.

Figs. 266-269 are simplified flowcharts of the musical output functionality of the system of Fig. 264. Turning to Fig. 266 it is seen that an interactive toy assists a user in finding a musical piece partially known to the user. It is appreciated that the functionality of Figs. 264 and 266 taken together includes acquiring knowledge of the musical taste and knowledge of the user, which acquired knowledge may be used in conjunction with the musical output functionality of Figs. 264 and 266.

Fig. 270 is a simplified partly pictorial partly schematic illustration of an interactive persona system comprising a three-dimensional artificial person having a pattern of behavior associated with a physician in accordance with a preferred embodiment of the present invention.

Fig. 271 is a simplified partly pictorial partly schematic illustration of the three dimensional artificial person of the interactive persona system of Fig. 270.

Figs. 272 and 273 are simplified flowcharts of the functionality of the interactive persona system of Fig. 270.

Fig. 274 is a simplified partly pictorial partly schematic illustration of an interactive toy web-browsing system providing employment agency functionality in accordance with a preferred embodiment of the present invention.

Fig. 275 is a simplified flowchart of the employment agency functionality of the interactive toy web-browsing system of Fig. 274.

Figs. 276 and 277 are simplified partly pictorial partly schematic illustrations of an interactive persona system comprising a three-dimensional artificial person having an appearance of a historical figure in accordance with a preferred embodiment of the present invention.

Fig. 278 is a block diagram illustration of various collections of historical figures of the interactive persona system of Figs. 276 and 277.

Fig. 279 is a simplified partly pictorial partly schematic illustration of an interactive toy system providing services to a disabled user in accordance with a preferred embodiment of the present invention.

Fig. 280 is a simplified partly pictorial partly schematic illustration of a walking interactive toy of the system of Fig. 279.

Fig. 281 is a simplified flowchart of the functionality of the system of Fig. 279.

Fig. 282 is a simplified schematic illustration of an interactive toy system providing a toy personality cloning functionality in accordance with a preferred embodiment of the present invention.

Figs. 283-288 are simplified schematic illustrations of the toy personality cloning functionality of Fig. 282.

Fig. 289 is a simplified partly pictorial partly schematic illustration of an interactive toy web-browsing system providing communication to potential charity organizations in accordance with a preferred embodiment of the present invention.

Figs. 290 and 291 are simplified flowcharts of the charity communication functionality of the toy web-browsing system of Fig. 289.

Fig. 292 is a simplified partly schematic partly block diagram illustration of an interactive persona system comprising an artificial three-dimensional person having a pattern of behavior of a guide.

Figs. 293A-296 are simplified flowcharts of the functionality of the interactive persona system of Fig. 292.

Figs. 297 and 298 are simplified schematic illustrations of an interactive toy system providing toy-game functionality in accordance with a preferred embodiment of the present invention.

Figs. 299A and 299B are a simplified flowchart of the toy-game functionality of Figs. 297 and 298.

Fig. 300 is a simplified schematic illustration of an interactive toy system providing multi-user game functionality.

Fig. 301 is a simplified schematic illustration of an interactive toy system comprising an interactive toy comprising a lenticular display unit in accordance with a preferred embodiment of the present invention.

Fig. 302 is a simplified flowchart of a point of sale functionality of the interactive toy system of Fig. 301.

Figs. 303A-304 are simplified schematic illustrations of an interactive toy system comprising an inter-toy communication system in accordance with a preferred embodiment of the present invention.

Figs. 305-312 are simplified flowcharts of the inter-toy communication functionality of the interactive toy system of Figs. 303 and 304.

Fig. 313 is a simplified table of a database record of an interactive toy system providing community formation functionality in accordance with a preferred embodiment of the present invention.

Figs. 314-317 are simplified flowcharts of community formation functionality provided by an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 318 is a simplified block-diagram illustration of information storage and utilization of an interactive persona system comprising a three-dimensional artificial person having an appearance and a pattern of behavior associated with a comedian in accordance with a preferred embodiment of the present invention.

Figs. 319A-319D are simplified flowcharts of the functionality of an interactive persona system comprising a three-dimensional artificial person having an appearance and a pattern of behavior associated with a comedian in accordance with a preferred embodiment of the present invention.

Figs. 320-326 are simplified flowcharts of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 327 is a simplified diagrammatic illustration of personal meeting production functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 328A is a simplified flowchart of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 328B is a simplified table of a user database record used in conjunction with the interpersonal interaction communication functionality of Fig. 328A.

Fig. 328C is a simplified flowchart of the interpersonal interaction communication functionality of Fig. 328A.

Figs. 329-332 are simplified flowcharts of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 333 is a simplified table of a user database record of an interactive toy system providing interpersonal interactive communication functionality in accordance with a preferred embodiment of the present invention.

Figs. 334-342 are simplified flowcharts of interpersonal interaction communication functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

Figs. 343A and 343B are simplified schematic illustrations of interactive toy propinquity and relative direction detection functionality of an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 344 is a simplified table of a meeting request database record of an interactive toy system providing interpersonal interactive communication functionality in accordance with a preferred embodiment of the present invention.

Fig. 345 is a simplified schematic illustration of a teaching functionality for an interactive toy system in accordance with a preferred embodiment of the present invention.

Fig. 346 is a simplified schematic illustration of an interactive toy system comprising teaching functionality in accordance with a preferred embodiment of the present invention.

Figs. 347 and 348 are simplified flowcharts of teaching functionality of the interactive toy system of Fig. 346.

Fig. 349 is a simplified flowchart of an interactive toy system comprising virtual classroom teaching functionality in accordance with a preferred embodiment of the present invention.

Fig. 350 is a simplified schematic illustration of an interactive toy system comprising virtual classroom teaching functionality in accordance with a preferred embodiment of the present invention.

Fig. 351 is a simplified schematic illustration of an interactive toy system comprising teaching functionality in accordance with a preferred embodiment of the present invention.

Figs. 352 and 353 are simplified flowcharts of the teaching functionality of the interactive toy system of Fig. 351.

Fig. 354 is a simplified flowchart of the functionality of an interactive toy system providing content which assists a user in teaching.

Figs. 355A-360 are simplified flowcharts of the teaching functionality of the interactive toy system of Figs. 350 and 351.

The term "interactive toy" refers to a toy which receives at least one control input from a toy controlling entity and which includes an electronic device capable of generating, at least partly in response to the control input, at least one output which is discernible by the toy controlling entity. The toy controlling entity may be a human user and may alternatively be a device.
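The control-input/output relationship in this definition can be sketched in code. The following is an illustrative sketch only, in Python; the class and method names are hypothetical and are not part of the disclosure:

```python
# Minimal sketch of the "interactive toy" definition: a toy receives at
# least one control input from a toy controlling entity (a human user or
# a device) and generates, at least partly in response to that input, at
# least one output discernible by the controlling entity.

class InteractiveToy:
    """Hypothetical model of an interactive toy as defined above."""

    def __init__(self):
        self.outputs = []  # record of outputs the toy has generated

    def receive_control_input(self, control_input: str) -> str:
        # The output is generated at least partly in response to the input.
        output = f"toy responds to: {control_input}"
        self.outputs.append(output)
        return output

toy = InteractiveToy()
print(toy.receive_control_input("wave hello"))  # → toy responds to: wave hello
```

Note that nothing in this sketch assumes the controlling entity is human; the same interface serves a device acting as the controlling entity.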

The term "networked interactive toy" means an interactive toy which is connected to the toy controlling entity via a network. The toy therefore is operative to receive control input over the network and may or may not receive additional, locally generated control inputs which are not transmitted over the network.

It is appreciated that the toy system shown and described herein is not useful only for children and may alternatively be used by adult toy users such as executive toy users.

Appropriate commercially available software for speech recognition includes, but is not limited to, Via Voice and Voice Dictation by IBM; Naturally Speaking by Dragon Systems; L&H Voice Xpress by Lernout & Hauspie; and Conversa by Conversa.

Techniques and applications of Speech Recognition suitable for implementation of a preferred embodiment of the present invention are described in the following references:

1. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics and Speech Recognition by Dan Jurafsky, James H. Martin, Nigel Ward and Daniel Jurafsky; Published by Prentice Hall, January 2000.

2. Fundamentals of Speech Recognition by Lawrence Rabiner and Biing-Hwang Juang; Published by Prentice Hall, April 1993.

3. The Dragon NaturallySpeaking Guide: Speech Recognition Made Fast and Simple by Dan Newman and James Baker; Published by Waveside Pub, September 1999.

4. How to Build a Speech Recognition Application by Bruce Balentine, David P. Morgan and William S. Meisel; Published by Enterprise Integration Group, April 1999.

5. Computer Speech: Recognition, Compression, Synthesis by Manfred R. Schroeder; Springer Series in Information Sciences, 35; Published by Springer Verlag, May 1999.

6. Electronic Speech Recognition: Techniques, Technology & Applications; Geoff Bristow (Editor); McGraw-Hill, 1986.

WAP (Wireless Application Protocol) technology for providing Internet services via cellular phones in accordance with a preferred embodiment of the present invention is described in the following references:

1. Professional WAP by Charles Arehart, et al (Wrox Press Inc. - July 2000)

2. Understanding WAP: Wireless Applications, Devices, and Services (Artech House Telecommunications Library) by Marcel Van Der Heijden (Editor), et al (Artech House - July 2000)

3. Programming Applications with the Wireless Application Protocol: The Complete Developer's Guide; Steve Mann (John Wiley & Sons, March 2000).

GSM (Global System for Mobile Communication) technology suitable for the implementation of preferred embodiments of the present invention is described in the following references:

1. GSM Made Simple by George Lamb, et al (Cordero Consulting - June 1997)

2. GSM Networks: Protocols, Terminology, and Implementation (Artech House Mobile Communications Library - January 1999) by Gunnar Heine

3. The GSM System for Mobile Communications by Michel Mouly, Marie-Bernadette Pautet (Telecom - June 1992)

It is appreciated that the term "computer" used in this document refers to any computing device including, but not limited to, a personal computer, a television set-top box used for interactive television, a game playstation, a computing device located onboard an automobile and computing devices embedded in home appliances. It is further appreciated that the computing requirements of an interactive toy may be implemented partly or totally onboard the toy, on a local computer, on any other computer available to the toy via a computer network, or it may be distributed in any manner between such devices.
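The distribution of an interactive toy's computing requirements among the devices named above might be sketched as follows. This is a minimal illustrative sketch only; the dispatch policy, function name and parameters are assumptions made for illustration and are not taken from the patent:

```python
# Hypothetical sketch: route a toy computation onboard the toy, to a
# local computer, or to a networked computer, in rising order of
# capability, as the text contemplates.

def choose_compute_location(task_cost: int,
                            onboard_budget: int,
                            local_available: bool) -> str:
    """Pick where a computation should run for an interactive toy."""
    if task_cost <= onboard_budget:
        return "onboard"            # cheap enough for the toy itself
    if local_available:
        return "local computer"     # offload to a nearby computer
    return "networked computer"     # otherwise, over the network

print(choose_compute_location(1, 5, True))    # → onboard
print(choose_compute_location(10, 5, True))   # → local computer
print(choose_compute_location(10, 5, False))  # → networked computer
```

A real system could of course split a single task among all three tiers at once; the text explicitly allows distribution "in any manner between such devices."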

It is further appreciated that the term Internet used in this document includes any network of computers including fully or partly wireless networks and is not restricted to any particular computer network.

It is further appreciated that interactive toys may be connected to the Internet either by a direct cable connection or by any wireless technique including via a direct RF link with a computer, via a Bluetooth™ connection, via a wireless cellular connection such as provided by the WAP (Wireless Application Protocol) system or via any other available method.

It is appreciated that the software components of the present invention may, if desired, be implemented in ROM (read-only memory) form. The software components may, generally, be implemented in hardware, if desired, using conventional techniques.

It is appreciated that various features of the invention which are, for clarity, described in the contexts of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable subcombination.

It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined only by the claims that follow:

Claims

1. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a methodology for obtaining and utilizing information comprising: employing at least one of said plurality of interactive toys to obtain information via the user; utilizing said information obtained via the user in an application which is not limited to user involvement.
2. A methodology for obtaining and utilizing information according to claim 1 and also comprising: obtaining required permission of at least one of a user and a person legally capable of providing permission in respect of said user.
3. A methodology according to claim 1 and wherein said information is utilized in marketing.
4. A methodology according to claim 3 and wherein at least one of said plurality of interactive toys provides information on purchasing.
5. A methodology according to claim 3 and wherein said information is utilized in advertising.
6. A methodology according to claim 3 and wherein at least one of said information is utilized in designing advertising.
7. A methodology according to claim 3 and wherein at least one of said information is utilized in directing advertising.
8. A methodology according to claim 7 and wherein at least one of said information is utilized in classifying users according to user profiles at least partly derived from said information.
9. A methodology according to claim 8 and wherein said information includes not only information directly provided by the user but also information derived from user behavior sensed by the at least one interactive toy.
10. A methodology according to claim 9 and wherein said information includes information derived from user behavior sensed by the at least one interactive toy, which behavior is non-commercial behavior.
11. A methodology according to claim 1 and wherein said information is obtained at least partially by employing speech recognition.
12. A methodology according to claim 9 and wherein said information is obtained at least partially by employing speech recognition.
13. A methodology according to claim 1 and wherein said information is obtained at least partially by employing speech recognition from disparate cultural groups of users.
14. A methodology according to claim 13 and wherein criteria employed in speech recognition are updated in response to information received indicating the efficacy of said speech recognition.
15. A methodology according to claim 11 and wherein said at least one toy is employed at least partially to prompt said user to speak.
16. A methodology according to claim 15 and wherein said at least one toy is employed at least partially to prompt said user to say certain words.
17. A methodology according to claim 11 and wherein said at least one toy is employed at least partially to develop a language model.
18. A methodology according to claim 1 and wherein said information is utilized at least partially as a diagnostic tool for evaluating performance of at least one of a computer and an interactive toy.
19. A methodology according to claim 1 and wherein said information is utilized at least partially as a diagnostic tool for evaluating performance of at least one user.
20. A methodology according to claim 1 and wherein said information is utilized at least partially as a diagnostic tool for evaluating performance of content employed by said at least one interactive toy.
21. A methodology according to claim 1 and wherein said information is utilized at least partially as a diagnostic tool for evaluating utility of teaching methods.
22. A methodology according to claim 1 and wherein said information is utilized at least partially as a toy design tool.
23. A methodology according to claim 1 and wherein said information is utilized at least partially as a game design tool.
24. A methodology according to claim 1 and wherein said information is utilized at least partially for evaluating changes in performance of at least one user over time.
25. A methodology according to claim 1 and wherein said information is utilized at least partially for evaluating nutrition habits of said at least one user.
26. A methodology according to claim 1 and wherein said information is utilized at least partially as a diagnostic tool for evaluating utility of educational methodologies.
27. A schedule monitoring toy system comprising: an at least partially verbal-input interactive toy operative to learn personal information about a child; and toy content operative to actuate said verbal-input interactive toy to present to the child at least one personalized, verbal scheduling prompt based on at least one item of personal information which the verbal-input interactive toy has learned about the child.
28. A schedule monitoring toy system according to claim 27 and wherein said toy content comprises personalized content which at least partly conforms to at least one personal characteristic of the user, said personal characteristic being learned by the user's toy.
29. A schedule monitoring toy system comprising: a verbal-input interactive toy; a parental input receiver operative to recognize a parent and to receive therefrom at least one parental input regarding at least one desired schedule item; and toy content actuating the verbal-input interactive toy to present to a child a timely verbal presentation of said at least one desired schedule item.
30. A schedule-monitoring toy system comprising: a mobile, verbal-input interactive toy; a scheduler operative to receive an input regarding at least one schedule item; a child locator operative to locate a child within a predetermined area; and a prompter operative, at a time appropriate to said at least one schedule item, to locate the child and to deliver at least one verbal prompt for said at least one schedule item.
31. A schedule-monitoring toy system according to claim 30 and wherein said prompter is operative to physically approach said child.
32. A schedule monitoring toy system comprising: a verbal -input interactive toy operative to perform speech recognition; and toy content actuating the verbal-input interactive toy to present to a child: at least one timely, interactive verbal scheduling prompt; and at least one anthropomorphic response to recognized speech content produced by a child responsive to said prompt.
33. A schedule monitoring toy system comprising: a verbal-input interactive toy; a schedule input receiver operative to receive, from at least one authorized source, information regarding a plurality of schedule items; a free time database operative to receive, from at least one authorized source, information regarding at least one free time activity authorized to be performed by a child during his free time; and toy content actuating the verbal-input interactive toy to present to the child:
a timely verbal presentation of each of said plurality of schedule items; and a verbal presentation, presented at a time not occupied by any of said plurality of schedule items, prompting the child to perform at least one of said free time activities.
34. A follow-me toy system comprising: a mobile toy; and a user-following mechanism tracking the user and guiding the toy to follow the user as the user moves within a working area.
35. A networked diary toy system comprising: a verbal-input interactive toy; a network interface connecting said verbal-input interactive toy to a computer network including at least one networked computer; a diary database storing at least one diary item for an individual user; and verbal content for presenting diary items from the diary database, wherein at least a portion of said verbal content is stored on said at least one networked computer and arrives at said verbal-input interactive toy via said network interface.
36. A speech-responsive networked diary toy system comprising: a toy; a network interface connecting the toy to a computer network including at least one networked computer; a diary database storing at least one diary item for an individual user; a speech-recognition unit residing at least in part in said at least one networked computer and communicating with said toy via said network and said network interface; and toy content actuating the toy to present at least one diary item responsive to user utterances recognized by the speech recognition unit.
37. A supervised networked organizer system comprising: an organizer subsystem operative to perform at least one organizing function involving multiple individual users; and a supervision subsystem storing at least one supervisor identity and automatically providing to each individual supervisor, inputs from the organizer system.
38. A supervised networked organizer system according to claim 37 and wherein said organizer subsystem comprises multiple interactive toys associated with said multiple individual users.
39. A supervised networked organizer system according to claim 37 and wherein at least some of said multiple individual users are children and wherein at least one of said individual supervisors is a parent of at least some of said children.
40. A system according to claim 37 and wherein said organizer subsystem includes override functionality which enables said individual supervisor to override inputs received by said organizer subsystem from at least one of said multiple individual users.
41. A child-messaging toy system comprising: a verbal-input interactive toy including child propinquity indicating functionality; a message database operative to accept at least one message to be delivered to a child whose propinquity to said toy is indicated to exist; and a message delivery controller including: an audible annunciator operative to provide a personalized audible output to said child requesting that said child come into propinquity with said toy; and a message output generator, operative in response to an indication of propinquity of said child to said toy for providing at least one message from said message database to said child.
42. A child-messaging toy system comprising: a verbal-input interactive toy including child propinquity indicating functionality; a timed message database operative to accept at least one time-specific message to be delivered to a child whose propinquity to said toy is indicated to exist at least one predetermined time; and a message delivery controller including: an audible annunciator operative to provide a personalized audible output to said child requesting that said child come into propinquity with said toy; and a message output generator, operative in response to an indication of propinquity of said child to said toy for providing at least one time-specific message from said timed message database to said child.
43. A system according to claim 42 and also comprising a message delivery indication that said time-specific message has not been delivered to said child at said predetermined time.
44. A virtual parenting toy system comprising: a verbal-input interactive toy operative to play at least one game with a child, said verbal-input interactive toy including verbal-input interactive toy content operative to actuate the verbal-input interactive toy to present to the child: at least one verbal prompt to perform at least one task; and at least one verbal offer to play the at least one game with the child once said at least one task is performed; and a compliance monitor operative to accept at least one indication that said at least one task has been performed and in response to said indication, to actuate said at least one game.
45. A virtual parenting toy system comprising: an interactive toy including: a child want indication-recognizing functionality operative to recognize at least one indication of a child want; a child want reporting functionality for providing an output indication of a child want recognized by said child want indication-recognizing functionality; and a child want satisfying functionality operative to satisfy said child want reported by said child want reporting functionality.
46. A virtual parenting toy system according to claim 45 and wherein said child want satisfying functionality is controlled by a child want satisfying input which may be received other than from said child.
47. A system according to claim 46 wherein said child want satisfying functionality includes: advertisement content responsive to the child want indication and offering a plurality of advertised items; and child preference eliciting functionality ascertaining a preference of said child for a given item from among said plurality of advertised items and providing a child preference output; and transactional functionality operative in response to said child preference output for purchasing said given item.
48. A toy system comprising: an interactive toy including: free time indication functionality designating at least one time slot during which a child has free time and may participate in toy interaction; and free time utilization functionality operative in response to an output from said free time indication functionality for providing entertainment to said child during said at least one time slot.
49. A toy system according to claim 48 and wherein said free time indication functionality comprises a schedule input receiver operative to receive schedule information regarding a plurality of schedule items and to compute therefrom said at least one time slot.
50. A toy system according to claim 48 wherein the free time indication functionality is responsive to an overriding parent input for defining said at least one time-slot.
51. A user-location monitoring toy diary comprising: a schedule database storing, for each of a multiplicity of schedule items, a time thereof and coordinates of a location thereof, wherein the multiplicity of locations are represented within a single coordinate system; a user tracker operative to track the current location of the user; and a prompter operative to prompt said user to conform to the schedule if the user's current location does not conform to the stored location of a current schedule item.
52. A schedule-monitoring toy system comprising: a verbal-input interactive toy operative to interact with a user; a schedule database storing the user's schedule; and a schedule reminder actuating the verbal-input interactive toy to present to the user at least one verbal scheduling prompt which includes content which is not related to scheduling.
53. A system according to claim 52 wherein said prompt comprises at least one game.
54. A system according to claim 52 wherein said prompt comprises at least one joke.
55. A system according to claim 52 wherein said prompt offers the user a value credit for compliance with the prompt and stores said credit for the user if the user fulfills a compliance criterion.
56. A system according to claim 52 wherein said prompt comprises content which emotionally prepares the user to cope with an emotionally traumatic item in the schedule database.
57. A computerized guide system for a blind user, the guide comprising: a portable interactive computerized device including: route definition functionality operative to receive route input from a blind user for selecting a user route; hazard detection functionality operative to detect at least one hazard along said user route; and audio warning functionality operative in response to an output from said hazard detection functionality to provide the user with an audio warning regarding presence of the hazard.
58. A guide system according to claim 57 and wherein said interactive device is networked with at least one other such device.
59. A guide system according to claim 58 and wherein said interactive device is operative to provide hazard information to said at least one other such device.
60. A system according to claim 58 wherein said interactive device is operative to broadcast said hazard information in real time.
61. A parental surrogate toy comprising: a toy; a child behavior report receiver; and a toy controller comprising: a behavioral report configuration definer allowing a parent to define at least one parameter of child behavior which is of interest; a child monitor operative to monitor said parameter of child behavior and to provide a report relating to said at least one parameter to said child behavior report receiver.
62. A web browsing system comprising: an interactive toy being connectable to the Internet, said interactive toy including a user interface having web browsing functionality.
63. A web browsing system according to claim 62 and also comprising a computer which serves as an intermediary between said interactive toy and the Internet.
64. A web browsing system according to claim 62 and wherein said user interface also has non-web browsing functionality.
65. A web browsing system according to claim 62 and wherein said user interface provides said web browsing functionality within the context of a game.
66. A web browsing system according to claim 65 and wherein in the context of said game said web browsing functionality provides an answer to a question posed in said game.
67. A web browsing system according to claim 65 and wherein in the context of said game said web browsing functionality provides non-rational web browsing.
68. A web browsing system according to claim 62 and wherein said web browsing functionality produces content which is subsequently employed by said toy.
69. A web browsing system according to claim 68 and wherein said content is added to a stored user profile.
70. A web browsing system according to claim 62 and wherein said user interface also includes interrogation functionality for obtaining information from other interactive toys networked therewith.
71. A web browsing system according to claim 62 and wherein said user interface includes a voice interactive functionality.
72. A web browsing system according to claim 62 and wherein at least one user characteristic ascertained from earlier interaction between the toy and a user is employed as an input in said web browsing functionality.
73. A web browsing system according to claim 72 and wherein said at least one user characteristic is employed by said web browsing functionality for matching the user with an activity offering functionality.
74. A web browsing system according to claim 73 and wherein said activity offering functionality is an employment agency functionality.
75. A knowledge management system comprising: an interactive toy being connectable to the Internet, said interactive toy including a user interface having information management functionality.
76. A knowledge management system according to claim 75 and wherein said information management functionality includes at least one of information retrieval functionality, information synthesis functionality and information filtering functionality.
77. A knowledge management system according to claim 75 and wherein said user interface includes a voice interactive functionality.
78. A knowledge management system according to claim 75 and wherein said user interface includes a telephone dialer.
79. A knowledge management system according to claim 75 and wherein said user interface includes a telephone inquiry functionality.
80. A knowledge management system according to claim 75 and wherein said user interface includes a download to diary functionality.
81. A knowledge management system according to claim 75 and wherein said information management functionality includes matching functionality operative to match potential donors with potential charity recipients.
82. A knowledge management system according to claim 81 and wherein said matching functionality employs user profile information collected by said toy.
83. A knowledge management system according to claim 75 and wherein said information management functionality includes matching functionality operative to match potential volunteers with potential charity organizations.
84. A knowledge management system according to claim 75 and wherein said information management functionality includes user status determination functionality operative to sense a wellness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the wellness status of the user.
85. A knowledge management system according to claim 75 and wherein said information management functionality includes user status determination functionality operative to sense a happiness status of a user and help functionality operative to place the user into contact with functionalities equipped to enhance the happiness status of the user.
86. A knowledge management system according to claim 85 and wherein said user status determination functionality comprises voice responsive functionality.
87. A knowledge management system according to claim 75 and wherein said information management functionality includes matching functionality operative to match possessions of potential donors with potential charity recipients.
88. An interactive persona system comprising: a three-dimensional artificial person including: a computer; and a voice responsive interactive functionality employing said computer, said three-dimensional artificial person having a pattern of behavior associated with a defined persona and being capable of interacting with a user in a manner which mimics behavior characteristic of said persona.
89. An interactive persona system according to claim 88 and wherein said three-dimensional artificial person is locomotive.
90. An interactive persona system according to claim 88 and wherein said voice responsive interactive functionality employs artificial intelligence.
91. An interactive persona system according to claim 88 and wherein said three-dimensional artificial person has at least one of an appearance and voice which is characteristic of said persona.
92. An interactive persona system according to claim 88 and wherein said pattern of behavior is at least partially programmable.
93. An interactive persona system according to claim 88 and wherein said pattern of behavior is at least partially programmable by a user.
94. An interactive persona system according to claim 88 and wherein said pattern of behavior is at least partially programmable other than by a user.
95. An interactive persona system according to claim 88 and wherein said pattern of behavior is at least partially remotely programmable.
96. An interactive persona system according to claim 88 and wherein said pattern of behavior is at least partially controllable via a computer network.
97. An interactive persona system according to claim 88 and wherein said pattern of behavior is at least partially controllable via a computer network in real time.
98. An interactive persona system according to claim 88 and wherein said pattern of behavior is that of a teacher.
99. An interactive persona system according to claim 98 and wherein the persona is of a known non-teacher.
100. An interactive persona system according to claim 88 and wherein said pattern of behavior is that of a coach.
101. An interactive persona system according to claim 98 and wherein the persona is of a known coach.
102. An interactive persona system according to claim 88 and wherein said pattern of behavior is that of a guide.
103. An interactive persona system according to claim 98 and wherein the persona is of a known guide.
104. An interactive persona system according to claim 88 and wherein said pattern of behavior is determined at least in part by at least one user characteristic known to the artificial person.
105. An interactive persona system according to claim 102 and wherein said pattern of behavior is determined at least in part by at least one user characteristic known to the artificial person.
106. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an inter-toy communication system comprising: at least one interactive toy operative for interaction with a plurality of users, wherein the interaction of said at least one interactive toy with at least one of said plurality of users is affected by the interaction of said at least one interactive toy with another one of said plurality of users.
107. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an inter-toy communication system comprising: at least one interactive toy operative for interaction with a plurality of users, wherein the interaction of said at least one interactive toy with at least two of said plurality of users is dependent on knowledge of the toy of which user it is interacting with and characteristics of said user known to the toy.
108. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an inter-toy communication system comprising: a plurality of interactive toys operative for interaction with at least one user, wherein the interaction of one of said plurality of interactive toys with said at least one user is affected by the interaction of another of said plurality of toys with said at least one user.
109. In an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, a multi-toy communication system comprising: at least one first interactive toy operative for communication with said computer network; at least one second interactive toy operative for communication with said computer network via said at least one first interactive toy.
110. In an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, a multi-toy location system comprising: location functionality operative to sense at least predetermined propinquity between at least two of said plurality of interactive toys.
111. A multi-toy location system according to claim 110 and also comprising: propinquity notification functionality operative in response to an output from said location functionality indicating said sensed at least predetermined propinquity for notifying at least one of said at least two of said plurality of interactive toys of at least said predetermined propinquity.
112. A multi-toy location system according to claim 110 and wherein said location functionality includes toy voice recognition functionality.
113. In an interactive toy environment comprising a plurality of interactive toys, at least one of which being normally in interactive communication via a computer with a computer network, said computer including a toy communication functionality comprising: a toy recognition functionality enabling said computer to recognize the identity of a toy which is not normally in interactive communication therewith, when said toy comes into communication propinquity therewith; and a communication establishing functionality operative following recognition of the identity of a toy which is not normally in interactive communication therewith, when said toy comes into communication propinquity therewith for establishing interactive communication therewith.
114. A toy communication functionality according to claim 113 and wherein said communication establishing functionality is operative in response to an authorization received from a user of said at least one toy which is normally in interactive communication with said computer network via said computer.
115. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a multi-toy coordinated activity system comprising: a plurality of interactive toys operative for interaction via a computer network; and a coordinated activity functionality operative via said computer network to cause said plurality of interactive toys to coordinate their actions in a coordinated activity.
116. A multi-toy coordinated activity system according to claim 115 and wherein said plurality of interactive toys are located at disparate locations.
117. A multi-toy coordinated activity system according to claim 115 and wherein said coordinated activity functionality causes said plurality of interactive toys to communicate with each other at least partially not in real time.
118. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one toy and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via said computer network to cause at least some of said plurality of interactive toys to communicate with each other at least partially not in real time.
119. An interactive toy communication system according to claim 118 and wherein said communications functionality includes a text message to voice conversion functionality.
120. An interactive toy communication system according to claim 118 and wherein said communication functionality includes a message to voice conversion functionality, which provides a vocal output having characteristics of at least one of said plurality of interactive toys.
121. An interactive toy communication system according to claim 118 and wherein said communications functionality includes an e-mail communication functionality.
122. An interactive toy communication system according to claim 118 and wherein at least some of said plurality of interactive toys have an e-mail address which is distinct from that of a user thereof.
123. An interactive toy communication system according to claim 121 and wherein said e-mail communication functionality enables transmission of movement instructions to at least one of said plurality of interactive toys.
124. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate with at least one user via a telephone link.
125. An interactive toy communication system according to claim 124 and wherein said communications functionality comprises an interactive voice response computer operative to enable the user to communicate by voice with said at least one of said plurality of interactive toys.
126. An interactive toy communication system according to claim 124 and wherein said communications functionality enables a user to provide instructions to at least one of said plurality of interactive toys to carry out physical functions.
127. An interactive toy communication system according to claim 124 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate with at least one user via at least another one of said plurality of interactive toys and a telephone link.
128. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate with at least one user via at least another one of said interactive toys.
129. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate motions of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
130. An interactive toy communication system according to claim 129 and wherein said communications functionality employs software instructions to said first one of said plurality of interactive toys for transmission to said another of said plurality of interactive toys.
131. An interactive toy communication system according to claim 129 and wherein said communications functionality employs information regarding sensed motion of said first one of said plurality of interactive toys for transmission to said another of said plurality of interactive toys.
132. An interactive toy communication system according to claim 129 and wherein said another of said plurality of interactive toys generally replicates the motion of said first one of said plurality of interactive toys.
133. An interactive toy communication system according to claim 130 and wherein said another of said plurality of interactive toys generally replicates the motion of said first one of said plurality of interactive toys.
134. An interactive toy communication system according to claim 131 and wherein said another of said plurality of interactive toys generally replicates the motion of said first one of said plurality of interactive toys.
135. An interactive toy communication system according to claim 129 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate motions and speech of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
136. An interactive toy communication system according to claim 135 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate synchronized motion and speech of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
137. An interactive toy communication system according to claim 130 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate motions and speech of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
138. An interactive toy communication system according to claim 137 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate synchronized motion and speech of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
139. An interactive toy communication system according to claim 131 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate motions and speech of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
140. An interactive toy communication system according to claim 139 and wherein said communications functionality is operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate synchronized motion and speech of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys.
141. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication system providing communication between at least one of multiple toys and at least one user, the system comprising: a plurality of interactive toys operative for interaction via a computer network; a communications functionality operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate with at least one user via at least another one of said interactive toys.
142. In an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, an integrated toy-game functionality comprising: a game which may be played by a user; at least one interactive toy containing game-specific functionality which participates in playing said game.
143. A toy-game functionality according to claim 142 and wherein said game-specific functionality enables the interactive toy to play said game as a player.
144. A toy-game functionality according to claim 142 and wherein said game-specific functionality enables the interactive toy to assist said user in playing said game.
145. A toy-game functionality according to claim 142 and wherein said game-specific functionality enables the interactive toy to be employed by the user as a user interface in playing said game.
146. A toy-game functionality according to claim 142 and wherein said game-specific functionality enables the interactive toy to have voice interaction with said user in the course of playing said game.
147. A toy-game functionality according to claim 143 and wherein said game-specific functionality enables the interactive toy to have voice interaction with said user in the course of playing said game.
148. A toy-game functionality according to claim 144 and wherein said game-specific functionality enables the interactive toy to have voice interaction with said user in the course of playing said game.
149. A toy-game functionality according to claim 145 and wherein said game-specific functionality enables the interactive toy to have voice interaction with said user in the course of playing said game.
150. A toy-game functionality according to claim 142 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
151. A toy-game functionality according to claim 143 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
152. A toy-game functionality according to claim 144 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
153. A toy-game functionality according to claim 145 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
154. A toy-game functionality according to claim 146 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
155. A toy-game functionality according to claim 147 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
156. A toy-game functionality according to claim 148 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
157. A toy-game functionality according to claim 149 and wherein said game-specific functionality enables the interactive toy to be responsive to at least one current characteristic of the user as sensed by the interactive toy.
158. A toy-game functionality according to claim 142 and wherein said game is a multi-user game which may be played over a network.
159. A toy-game functionality according to claim 158 and wherein said game-specific functionality is operative to mediate between at least two users playing said game.
160. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an interpersonal interaction communication system providing communication between multiple users via multiple toys, the system comprising: a plurality of interactive toys operative for interaction via a computer network, at least some of said plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and a communications functionality operative at least partially via said computer network to cause at least one of said plurality of interactive toys to communicate content with at least another of said plurality of interactive toys, said content being operative to produce interaction between respective users thereof.
161. An interpersonal interaction communication system according to claim 160 and wherein said content is operative to produce personal interaction between respective users thereof.
162. An interpersonal interaction communication system according to claim 160 and wherein said content is operative to produce a personal meeting between respective users thereof.
163. An interpersonal interaction communication system according to claim 160 and wherein said content is operative to produce pseudo-accidental personal meetings between respective users thereof.
164. An interpersonal interaction communication system according to claim 160 and wherein said content is operative in the context of a game.
165. An interpersonal interaction communication system according to claim 160 and wherein said content is operative to produce interactions between interactive toys which are in physical propinquity therebetween.
166. An interpersonal interaction communication system according to claim 160 and wherein at least some of said interactive toys have a persona which shares at least one personal characteristic with an identifiable person.
167. An interpersonal interaction communication system according to claim 166 and wherein said identifiable person is the user of a given interactive toy.
168. An interpersonal interaction communication system according to claim 160 and wherein said content is operative to produce conversations between respective users thereof and employs at least some personal information about said respective users based on accumulated past interactions therewith.
169. An interpersonal interaction communication system according to claim 160 and wherein said content is operative to produce personal meetings between respective users thereof at predetermined locations.
170. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a toy cloning functionality comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; transferring at least a portion of the interactive toy personality to at least one clone.
171. A toy cloning functionality according to claim 170 and wherein said interactive toy personality comprises a toy life history.
172. A toy cloning functionality according to claim 171 and wherein said toy life history is stored in a database.
173. A toy cloning functionality according to claim 171 and wherein said toy personality is stored in a database.
174. A toy cloning functionality according to claim 170 and wherein said at least one clone comprises a toy.
175. A toy cloning functionality according to claim 174 and wherein said toy has a persona which is prima facie incompatible with said toy personality.
176. A toy cloning functionality according to claim 170 and wherein said at least one clone comprises an animated virtual character.
177. A toy cloning functionality according to claim 170 and wherein following said transferring, said interactive toy personality continues to develop generally identically both in said interactive toy and in said clone.
178. A toy cloning functionality according to claim 170 and wherein following said transferring, said interactive toy personality continues to develop at least partially independently both in said interactive toy and in said clone.
179. A toy cloning functionality according to claim 170 and also comprising transferring at least one clone personality from said at least one clone to the interactive toy.
180. A toy cloning functionality according to claim 170 and wherein said interactive toy personality incorporates features based on multiple toy life histories.
181. A toy cloning functionality according to claim 170 and also comprising: associating at least one physical feature of said interactive toy with said clone.
182. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a toy personality functionality comprising: developing a plurality of interactive toy personalities based on interactions between an interactive toy and at least one of another interactive toy and a user; and endowing at least one interactive toy with at least two of said plurality of interactive toy personalities and with a mechanism for exhibiting at least one selectable personality at a given time.
183. A toy personality functionality according to claim 182 and wherein said at least one interactive toy exhibits at least one selectable personality in accordance with a toy-perceived personality of a corresponding user.
184. An interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at said at least one interactive toy and including user assistance functionality providing an output to said user which assists said user in user functioning.
185. An interactive toy system according to claim 184 and wherein said user assistance functionality is at least partially mechanical.
186. An interactive toy system according to claim 184 and wherein said user assistance functionality is at least partially verbal.
187. An interactive toy system according to claim 184 and wherein said interactive toy functionality includes tooth brushing functionality.
188. An interactive toy system according to claim 184 and wherein said at least one interactive toy is connected to a computer network.
189. An interactive toy system according to claim 184 and wherein said user assistance functionality is at least partially visible.
190. An interactive toy system according to claim 184 and wherein said user assistance functionality includes a guide dog functionality.
191. An interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at said at least one interactive toy and including teaching functionality providing a teaching output to said user which assists said user in learning.
192. An interactive toy system according to claim 191 and wherein said teaching functionality includes a point to object - name of object teaching functionality.
193. An interactive toy system according to claim 192 and wherein said point to object - name of object teaching functionality includes selectable language functionality.
194. An interactive toy system according to claim 191 and wherein said teaching functionality includes different home and school environment teaching functionalities.
195. An interactive toy system according to claim 194 and wherein said school environment teaching functionality interactively involves at least a plurality of interactive toys.
196. An interactive toy system according to claim 191 and wherein said teaching functionality includes both verbal and body language teaching functionality for teaching a foreign language.
197. An interactive toy system according to claim 191 and wherein said teaching functionality is automatically actuable by an event in a non-teaching functionality of said at least one interactive toy.
198. An interactive toy system according to claim 191 and wherein said teaching functionality includes virtual classroom teaching functionality.
199. An interactive toy system according to claim 191 and wherein said teaching functionality includes fellow student functionality wherein the at least one interactive toy acts as a fellow student to a user.
200. An interactive toy system according to claim 191 and wherein said teaching functionality includes behavior corrective functionality at least partially involving pre-acquired knowledge of at least one characteristic of the user obtained by at least one interactive toy.
201. An interactive toy system according to claim 200 and wherein said behavior corrective functionality also at least partially involves currently acquired knowledge of said at least one characteristic of the user obtained by at least one interactive toy.
202. An interactive toy system according to claim 200 and wherein said at least one interactive toy includes sensing functionality for sensing at least one of the following user parameters: breath constituents; blood pressure; breathing activity; heart activity; and language.
203. An interactive toy system comprising: at least one interactive toy; an interactive toy functionality at least partially resident at said at least one interactive toy and including user-specific event driven functionality providing an output to said user which is dependent on pre-acquired knowledge of at least one characteristic of the user obtained by at least one interactive toy.
204. An interactive toy system according to claim 203 and wherein said user-specific event driven functionality includes musical output functionality.
205. An interactive toy system according to claim 203 and wherein said at least one interactive toy comprises a plurality of interactive toys which cooperate to provide various coordinated parts in a musical output.
206. An interactive toy system according to claim 204 and wherein said at least one interactive toy has at least one known persona.
207. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a system for obtaining and utilizing information comprising: at least one of said plurality of interactive toys, employed to obtain information via a user; said information obtained via the user and utilized in an application which is not limited to user involvement.
208. A schedule monitoring toy methodology comprising: learning personal information about a child by means of an at least partially verbal-input interactive toy; and actuating said verbal-input interactive toy by means of toy content so as to present to the child at least one personalized, verbal scheduling prompt based on at least one item of personal information which the verbal-input interactive toy has learned about the child.
209. A schedule monitoring toy methodology comprising: utilizing a verbal-input interactive toy; recognizing a parent by means of a parental input receiver; receiving from said receiver at least one parental input regarding at least one desired schedule item; and actuating the verbal-input interactive toy by toy content so as to present to a child a timely verbal presentation of said at least one desired schedule item.
210. A schedule-monitoring toy methodology comprising: verbally inputting a mobile, interactive toy; receiving an input from a scheduler regarding at least one schedule item; locating a child by means of a child locator within a predetermined area; and locating the child by means of a prompter at a time appropriate to said at least one schedule item, and delivering at least one verbal prompt from the prompter for said at least one schedule item to the child.
211. A schedule monitoring toy methodology comprising: performing speech recognition by means of a verbal-input interactive toy; and actuating the verbal-input interactive toy by means of toy content so as to present to a child: at least one timely, interactive verbal scheduling prompt; and at least one anthropomorphic response to recognized speech content produced by a child responsive to said prompt.
212. A schedule monitoring toy methodology comprising: activating a verbal-input interactive toy; receiving from at least one authorized source, information regarding a plurality of schedule items so as to input said information to a schedule input receiver; receiving from at least one authorized source, information regarding at least one free time activity authorized to be performed by a child during his free time, so as to input said information to a free time database; and actuating the verbal-input interactive toy by means of toy content so as to present to the child: a timely verbal presentation of each of said plurality of schedule items; and a verbal presentation, presented at a time not occupied by any of said plurality of schedule items, so as to prompt the child to perform at least one of said free time activities.
213. A follow-me toy methodology comprising: activating a mobile toy; tracking the user by means of a user-following mechanism; and guiding the toy to follow the user as the user moves within a working area.
214. A networked diary toy methodology comprising: activating a verbal-input interactive toy; connecting, by means of a network interface, said verbal-input interactive toy to a computer network including at least one networked computer; storing at least one diary item for an individual user in a diary database; and presenting diary items from the diary database by means of verbal content, wherein at least a portion of said verbal content is stored on said at least one networked computer and arrives at said verbal-input interactive toy via said network interface.
215. A speech-responsive networked diary toy system comprising: activating a toy; connecting the toy to a computer network including at least one networked computer by means of a network interface; storing at least one diary item for an individual user in a diary database; communicating with said toy by means of a speech-recognition unit residing at least in part in said at least one networked computer and via said network and said network interface; and actuating the toy by toy content so as to present at least one diary item responsive to user utterances recognized by the speech recognition unit.
216. A supervised networked organizer methodology comprising: performing at least one organizing function involving multiple individual users by means of an organizer subsystem; and storing at least one supervisor identity and automatically providing to each individual supervisor, inputs from the organizer system by means of a supervision subsystem.
217. A child-messaging toy methodology comprising: indicating child propinquity by means of a child propinquity functionality in a verbal-input interactive toy; accepting at least one message to be delivered to a child whose propinquity to said toy is indicated to exist from a message database; and controlling message delivery by means of a message controller including: providing a personalized audible output to said child requesting that said child come into propinquity with said toy from an audible annunciator; and providing at least one message from said message database to said child from a message output generator in response to an indication of propinquity of said child to said toy.
218. A child-messaging toy methodology comprising: indicating child propinquity by means of a child propinquity functionality in a verbal-input interactive toy; accepting at least one time-specific message to be delivered to a child whose propinquity to said toy is indicated to exist at at least one predetermined time from a timed message database; controlling message delivery by means of a message controller including: providing a personalized audible output to said child requesting that said child come into propinquity with said toy by means of an audible annunciator; and providing at least one time-specific message from said timed message database to said child from a message output generator, in response to an indication of propinquity of said child to said toy.
219. A virtual parenting toy methodology comprising: playing at least one game with a child together with a verbal-input interactive toy; actuating the verbal-input interactive toy by means of verbal-input interactive toy content so as to present to the child said verbal-input interactive toy content including: at least one verbal prompt to perform at least one task; and at least one verbal offer to play the at least one game with the child once said at least one task is performed; and accepting at least one indication that said at least one task has been performed and, in response to said indication, actuating said at least one game by means of a compliance monitor.
220. A virtual parenting toy methodology comprising: activating an interactive toy including: recognizing at least one indication of a child want by means of a child want indication-recognizing functionality; providing an output indication of a child want recognized by said child want indication-recognizing functionality by a child want reporting functionality; and satisfying said child want reported by said child want reporting functionality by means of a child want satisfying functionality.
221. A toy methodology system comprising: activating an interactive toy including: designating at least one time slot during which a child has free time and may participate in a toy interaction, by means of a free time indication functionality; and providing entertainment to said child during said at least one time slot by a free time utilization functionality in response to an output from said free time indication functionality.
222. A user-location monitoring toy diary methodology comprising: storing, for each of a multiplicity of schedule items, a time thereof and coordinates of a location thereof, wherein the multiplicity of locations are represented within a single coordinate system of a schedule database; tracking a current location of the user by a user tracker; and prompting said user to conform to the schedule if the user's current location does not conform to the stored location of a current schedule item by means of a prompter.
223. A schedule-monitoring toy methodology comprising: interacting with a user by means of a verbal-input interactive toy; storing the user's schedule in a schedule database; and actuating the verbal-input interactive toy so as to present to the user at least one verbal scheduling prompt which includes content which is not related to scheduling by a schedule reminder.
224. A computerized guide methodology for a blind user, the guide comprising: activating a portable interactive computerized device including: receiving a route input from a blind user for selecting a user route from a route definition functionality; detecting at least one hazard along said user route by a hazard detection functionality; and providing the user with an audio warning regarding presence of the hazard in response to an output from said hazard detection functionality by means of an audio warning functionality.
225. A parental surrogate toy methodology comprising: activating a toy; receiving a child behavior report by means of a child behavior report receiver; and controlling said child by a toy controller comprising: allowing a parent to define at least one parameter of child behavior which is of interest by a behavioral report configuration definer; and monitoring said parameter of child behavior by a child monitor so as to provide a report relating to said at least one parameter to said child behavior report receiver.
226. A web browsing methodology comprising: connecting an interactive toy to the Internet; and web-browsing by means of a user interface on said interactive toy.
227. A knowledge management methodology comprising: connecting an interactive toy to the Internet; and managing information by an information management functionality on said interactive toy.
228. An interactive persona methodology comprising: activating a three-dimensional artificial person including a computer; employing said computer and said three-dimensional artificial person having a pattern of behavior associated with a defined persona by means of a voice responsive interactive functionality; and interacting with a user in a manner which mimics behavior characteristic of said persona.
229. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an inter-toy communication methodology comprising: interacting with a plurality of users by means of at least one interactive toy; and wherein said interacting with a plurality of users by means of at least one interactive toy is affected by the interaction of said at least one interactive toy with another one of said plurality of users.
230. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an inter-toy communication methodology comprising: interacting with a plurality of users by at least one interactive toy, and wherein the interacting of said at least one interactive toy with at least two of said plurality of users is dependent on knowledge of the toy of which user it is interacting with and characteristics of said user known to the toy.
231. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an inter-toy communication methodology comprising: interacting with at least one user by a plurality of interactive toys, wherein the interacting of one of said plurality of interactive toys with said at least one user is affected by the interaction of another of said plurality of toys with said at least one user.
232. In an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, a multi-toy communication methodology comprising: communicating of at least one first interactive toy with said computer network; and communicating of at least one second interactive toy with said computer network via said at least one first interactive toy.
233. In an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, a multi-toy location methodology comprising: sensing at least predetermined propinquity between at least two of said plurality of interactive toys by means of a location functionality.
234. In an interactive toy environment, comprising a plurality of interactive toys, at least one of which is normally in interactive communication via a computer with a computer network, said computer including a toy communication functionality, a toy communication methodology comprising: enabling said computer to recognize the identity of a toy which is not normally in interactive communication therewith, when said toy comes into communication propinquity therewith by means of a toy recognition functionality; and establishing interactive communication with a communication establishing functionality following recognition of the identity of a toy which is not normally in interactive communication therewith, when said toy comes into communication propinquity therewith.
235. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a multi-toy coordinated activity methodology comprising: interacting of a plurality of interactive toys via a computer network; and causing said plurality of interactive toys to coordinate their actions in a coordinated activity by means of a coordinated activity functionality.
236. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of the plurality of toys and at least one toy and at least one user, the methodology comprising: interacting of the plurality of interactive toys via a computer network; causing at least some of said plurality of interactive toys to communicate with each other at least partially not in real time by means of a communications functionality operative at least partially via said computer network.
237. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology comprising: interacting of the plurality of interactive toys via a computer network; causing at least one of said plurality of interactive toys to communicate with at least one user via a telephone link by means of a communications functionality operative at least partially via said computer network.
238. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology comprising: interacting of the plurality of interactive toys via a computer network; causing at least one of said plurality of interactive toys to communicate with at least one user via at least another one of said interactive toys by means of a communications functionality, wherein said functionality is operative at least partially via said computer network.
239. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology comprising: interacting of a plurality of interactive toys via a computer network; causing at least one of said plurality of interactive toys to communicate motions of at least one of said plurality of interactive toys to at least another of said plurality of interactive toys by means of a communications functionality, wherein said functionality is operative at least partially via said computer network.
240. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a communication methodology providing communication between at least one of multiple toys and at least one user, the methodology comprising: interacting of a plurality of interactive toys via a computer network; causing at least one of said plurality of interactive toys to communicate with at least one user via at least another one of said interactive toys by means of a communications functionality, wherein said functionality is operative at least partially via said computer network.
241. In an interactive toy environment comprising a plurality of interactive toys interconnected via a computer network, an integrated toy-game methodology comprising: playing of a game by a user; and participating in playing said game by at least one interactive toy containing game-specific functionality.
242. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, an interpersonal interaction communication methodology providing communication between multiple users via multiple toys, the methodology comprising: interacting of a plurality of interactive toys via a computer network; at least some of said plurality of interactive toys having a functionality of forming personal profiles of users thereof based on accumulated past interactions therewith; and causing at least one of said plurality of interactive toys to communicate content with at least another of said plurality of interactive toys by means of a communications functionality, wherein said functionality is operative at least partially via said computer network; and said content being operative to produce interaction between respective users thereof.
243. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a toy cloning methodology comprising: developing an interactive toy personality based on interactions between an interactive toy and at least one of another interactive toy and a user; and transferring at least a portion of the interactive toy personality to at least one clone.
244. In an interactive toy environment comprising a plurality of interactive toys interconnected via a network, a toy personality functionality comprising: developing a plurality of interactive toy personalities based on interactions between an interactive toy and at least one of another interactive toy and a user; and endowing at least one interactive toy with at least two of said plurality of interactive toy personalities and with a mechanism for exhibiting at least one selectable personality at a given time.
245. An interactive toy methodology comprising: activating at least one interactive toy; providing an output to a user which assists said user in user functioning by means of an interactive toy functionality at least partially resident at said at least one interactive toy.
246. An interactive toy methodology comprising: activating at least one interactive toy; providing a teaching output to a user which assists said user in learning by an interactive toy functionality at least partially resident at said at least one interactive toy.
247. An interactive toy system comprising: activating at least one interactive toy; providing an output to a user which is dependent on pre-acquired knowledge of at least one characteristic of the user obtained by the at least one interactive toy, and wherein said output is provided by an interactive toy functionality at least partially resident at said at least one interactive toy; and driving said output in respect of a user-specific event.

ABSTRACT
In an interactive toy environment, in which a plurality of interactive toys are interconnected via a computer network and in which interactive toys interact with one or more users, an inter-toy communication system in which the interaction of a toy with its user is affected by the interaction of either that toy or another toy with another user. The interaction of a toy with its user is personalized and depends on knowledge of the characteristics of both the toy and its user. Interactive toys have real time conversations with users. Networked interactive toys are further able to communicate with computers on the network so that, if authorized, they are aware of the activities of other toys and of their users. Networked interactive toys may thus utilize information from any computer on the network. Interactive toy applications making use of these features are also provided.
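The inter-toy arrangement described in the abstract can be sketched in a few lines of code. This is a minimal illustrative model only, not the patented implementation: the class and method names (`ToyNetwork`, `InteractiveToy`, `publish`, `greet`) and the example users are hypothetical, and the "network" is reduced to an in-memory hub. It shows the central idea that a toy's interaction with its own user is affected by knowledge another toy acquired from a different user.

```python
class ToyNetwork:
    """Shared hub (stand-in for the computer network): stores facts
    that each toy reports about its user, keyed by user name."""
    def __init__(self):
        self.facts = {}

    def publish(self, user, fact):
        # A toy shares something it learned about its user.
        self.facts.setdefault(user, []).append(fact)

    def lookup(self, user):
        # Any authorized toy may retrieve what is known about a user.
        return self.facts.get(user, [])


class InteractiveToy:
    def __init__(self, name, network):
        self.name = name
        self.network = network

    def observe(self, user, fact):
        # Toy acquires knowledge of a user characteristic and shares it.
        self.network.publish(user, fact)

    def greet(self, user, friend=None):
        # Output is personalized by pre-acquired knowledge, possibly
        # obtained by a *different* toy interacting with another user.
        parts = [f"Hello {user}!"]
        if friend:
            for fact in self.network.lookup(friend):
                parts.append(f"I heard {friend} {fact}.")
        return " ".join(parts)


net = ToyNetwork()
teddy = InteractiveToy("Teddy", net)
robo = InteractiveToy("Robo", net)

# Robo interacts with its user Dana; Teddy's later interaction with
# its own user Avi is affected by that interaction.
robo.observe("Dana", "learned a new song today")
print(teddy.greet("Avi", friend="Dana"))
# → Hello Avi! I heard Dana learned a new song today.
```

A real system in the spirit of the claims would replace the in-memory hub with computers on a network, add authorization checks before `lookup` (the abstract stresses that awareness of other toys' activities requires authorization), and drive output through speech rather than text.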
PCT/IL2001/000268 2000-03-24 2001-03-20 Interactive toy applications WO2001070361A2 (en)

Priority Applications (126)

Application Number Priority Date Filing Date Title
US19201100P true 2000-03-24 2000-03-24
US19201400P true 2000-03-24 2000-03-24
US19201300P true 2000-03-24 2000-03-24
US19201200P true 2000-03-24 2000-03-24
US60/192,011 2000-03-24
US60/192,012 2000-03-24
US60/192,014 2000-03-24
US60/192,013 2000-03-24
US19369900P true 2000-03-31 2000-03-31
US19370400P true 2000-03-31 2000-03-31
US19370200P true 2000-03-31 2000-03-31
US19369700P true 2000-03-31 2000-03-31
US19370300P true 2000-03-31 2000-03-31
US60/193,702 2000-03-31
US60/193,703 2000-03-31
US60/193,699 2000-03-31
US60/193,704 2000-03-31
US60/193,697 2000-03-31
US19586500P true 2000-04-07 2000-04-07
US19586200P true 2000-04-07 2000-04-07
US19586400P true 2000-04-07 2000-04-07
US19586100P true 2000-04-07 2000-04-07
US19586300P true 2000-04-07 2000-04-07
US19586600P true 2000-04-07 2000-04-07
US60/195,862 2000-04-07
US60/195,866 2000-04-07
US60/195,865 2000-04-07
US60/195,864 2000-04-07
US60/195,863 2000-04-07
US60/195,861 2000-04-07
US19622700P true 2000-04-10 2000-04-10
US60/196,227 2000-04-10
US19757600P true 2000-04-17 2000-04-17
US19757800P true 2000-04-17 2000-04-17
US19757300P true 2000-04-17 2000-04-17
US19757900P true 2000-04-17 2000-04-17
US19757700P true 2000-04-17 2000-04-17
US60/197,578 2000-04-17
US60/197,573 2000-04-17
US60/197,577 2000-04-17
US60/197,576 2000-04-17
US60/197,579 2000-04-17
US20051300P true 2000-04-28 2000-04-28
US20063900P true 2000-04-28 2000-04-28
US20064000P true 2000-04-28 2000-04-28
US20050800P true 2000-04-28 2000-04-28
US20064100P true 2000-04-28 2000-04-28
US20064700P true 2000-04-28 2000-04-28
US60/200,641 2000-04-28
US60/200,513 2000-04-28
US60/200,640 2000-04-28
US60/200,508 2000-04-28
US60/200,639 2000-04-28
US60/200,647 2000-04-28
US20324400P true 2000-05-08 2000-05-08
US20317700P true 2000-05-08 2000-05-08
US20318200P true 2000-05-08 2000-05-08
US20317500P true 2000-05-08 2000-05-08
US60/203,182 2000-05-08
US60/203,244 2000-05-08
US60/203,175 2000-05-08
US60/203,177 2000-05-08
US20420000P true 2000-05-15 2000-05-15
US20420100P true 2000-05-15 2000-05-15
US60/204,200 2000-05-15
US60/204,201 2000-05-15
US20712800P true 2000-05-25 2000-05-25
US20712600P true 2000-05-25 2000-05-25
US60/207,126 2000-05-25
US60/207,128 2000-05-25
US20810500P true 2000-05-26 2000-05-26
US60/208,105 2000-05-26
US20839100P true 2000-05-30 2000-05-30
US20839200P true 2000-05-30 2000-05-30
US20839000P true 2000-05-30 2000-05-30
US60/208,390 2000-05-30
US60/208,391 2000-05-30
US60/208,392 2000-05-30
US20947100P true 2000-06-05 2000-06-05
US60/209,471 2000-06-05
US21044500P true 2000-06-08 2000-06-08
US21044300P true 2000-06-08 2000-06-08
US60/210,443 2000-06-08
US60/210,445 2000-06-08
US21269600P true 2000-06-19 2000-06-19
US60/212,696 2000-06-19
US21536000P true 2000-06-30 2000-06-30
US60/215,360 2000-06-30
US21623700P true 2000-07-05 2000-07-05
US21623800P true 2000-07-05 2000-07-05
US60/216,237 2000-07-05
US60/216,238 2000-07-05
US21735700P true 2000-07-12 2000-07-12
US60/217,357 2000-07-12
US21923400P true 2000-07-18 2000-07-18
US60/219,234 2000-07-18
US22027600P true 2000-07-24 2000-07-24
US60/220,276 2000-07-24
US22193300P true 2000-07-31 2000-07-31
US60/221,933 2000-07-31
US22387700P true 2000-08-08 2000-08-08
US60/223,877 2000-08-08
US22711200P true 2000-08-22 2000-08-22
US60/227,112 2000-08-22
US22937100P true 2000-08-30 2000-08-30
US60/229,371 2000-08-30
US22964800P true 2000-08-31 2000-08-31
US60/229,648 2000-08-31
US23110500P true 2000-09-08 2000-09-08
US23110300P true 2000-09-08 2000-09-08
US60/231,103 2000-09-08
US60/231,105 2000-09-08
US23488300P true 2000-09-25 2000-09-25
US23489500P true 2000-09-25 2000-09-25
US60/234,895 2000-09-25
US60/234,883 2000-09-25
US23932900P true 2000-10-10 2000-10-10
US60/239,329 2000-10-10
US25336200P true 2000-11-27 2000-11-27
US60/253,362 2000-11-27
US25033200P true 2000-11-29 2000-11-29
US60/250,332 2000-11-29
US25469900P true 2000-12-11 2000-12-11
US60/254,699 2000-12-11
US26735001P true 2001-02-08 2001-02-08
US60/267,350 2001-02-08

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU44498/01A AU4449801A (en) 2000-03-24 2001-03-20 Interactive toy applications

Publications (2)

Publication Number Publication Date
WO2001070361A2 true WO2001070361A2 (en) 2001-09-27
WO2001070361A3 WO2001070361A3 (en) 2002-08-08

Family

ID=27587048

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2001/000268 WO2001070361A2 (en) 2000-03-24 2001-03-20 Interactive toy applications

Country Status (1)

Country Link
WO (1) WO2001070361A2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003096171A1 (en) * 2002-05-14 2003-11-20 Philips Intellectual Property & Standards Gmbh Dialog control for an electric apparatus
WO2011078796A1 (en) * 2009-12-21 2011-06-30 National University Of Singapore Tele-puppetry platform
WO2012000927A1 (en) * 2010-07-02 2012-01-05 Aldebaran Robotics Humanoid game-playing robot, method and system for using said robot
US8134061B2 (en) 2006-04-21 2012-03-13 Vergence Entertainment Llc System for musically interacting avatars
US8324492B2 (en) 2006-04-21 2012-12-04 Vergence Entertainment Llc Musically interacting devices
GB2508347A (en) * 2012-11-28 2014-06-04 Paul Nathan Location-Aware Doll
GB2532141A (en) * 2014-11-04 2016-05-11 Mooredoll Inc Method and device of community interaction with toy as the center
US9396437B2 (en) 2013-11-11 2016-07-19 Mera Software Services, Inc. Interface apparatus and method for providing interaction of a user with network entities
US9814993B2 (en) 2013-11-11 2017-11-14 Mera Software Services, Inc. Interactive toy plaything having wireless communication of interaction-related information with remote entities

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4846693A (en) * 1987-01-08 1989-07-11 Smith Engineering Video based instructional and entertainment system using animated figure
US4857030A (en) * 1987-02-06 1989-08-15 Coleco Industries, Inc. Conversing dolls
US5636994A (en) * 1995-11-09 1997-06-10 Tong; Vincent M. K. Interactive computer controlled doll
US5752880A (en) * 1995-11-20 1998-05-19 Creator Ltd. Interactive doll
US5769269A (en) * 1994-04-28 1998-06-23 Peters; Steven A. Vending system
US6012961A (en) * 1997-05-14 2000-01-11 Design Lab, Llc Electronic toy including a reprogrammable data storage device
US6368177B1 (en) * 1995-11-20 2002-04-09 Creator, Ltd. Method for using a toy to conduct sales over a network

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4846693A (en) * 1987-01-08 1989-07-11 Smith Engineering Video based instructional and entertainment system using animated figure
US4857030A (en) * 1987-02-06 1989-08-15 Coleco Industries, Inc. Conversing dolls
US5769269A (en) * 1994-04-28 1998-06-23 Peters; Steven A. Vending system
US5636994A (en) * 1995-11-09 1997-06-10 Tong; Vincent M. K. Interactive computer controlled doll
US5752880A (en) * 1995-11-20 1998-05-19 Creator Ltd. Interactive doll
US6368177B1 (en) * 1995-11-20 2002-04-09 Creator, Ltd. Method for using a toy to conduct sales over a network
US6012961A (en) * 1997-05-14 2000-01-11 Design Lab, Llc Electronic toy including a reprogrammable data storage device

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003096171A1 (en) * 2002-05-14 2003-11-20 Philips Intellectual Property & Standards Gmbh Dialog control for an electric apparatus
US8134061B2 (en) 2006-04-21 2012-03-13 Vergence Entertainment Llc System for musically interacting avatars
US8324492B2 (en) 2006-04-21 2012-12-04 Vergence Entertainment Llc Musically interacting devices
WO2011078796A1 (en) * 2009-12-21 2011-06-30 National University Of Singapore Tele-puppetry platform
WO2012000927A1 (en) * 2010-07-02 2012-01-05 Aldebaran Robotics Humanoid game-playing robot, method and system for using said robot
FR2962048A1 (en) * 2010-07-02 2012-01-06 Aldebaran Robotics S A Humanoid robot player, method and system for using the same
CN103079657A (en) * 2010-07-02 2013-05-01 奥尔德巴伦机器人公司 Humanoid game-playing robot, method and system for using said robot
US9950421B2 (en) 2010-07-02 2018-04-24 Softbank Robotics Europe Humanoid game-playing robot, method and system for using said robot
GB2508347A (en) * 2012-11-28 2014-06-04 Paul Nathan Location-Aware Doll
US9396437B2 (en) 2013-11-11 2016-07-19 Mera Software Services, Inc. Interface apparatus and method for providing interaction of a user with network entities
US9691018B2 (en) 2013-11-11 2017-06-27 Mera Software Services, Inc. Interface apparatus and method for providing interaction of a user with network entities
US9814993B2 (en) 2013-11-11 2017-11-14 Mera Software Services, Inc. Interactive toy plaything having wireless communication of interaction-related information with remote entities
GB2532141A (en) * 2014-11-04 2016-05-11 Mooredoll Inc Method and device of community interaction with toy as the center

Also Published As

Publication number Publication date
WO2001070361A3 (en) 2002-08-08

Similar Documents

Publication Publication Date Title
Long The witness of preaching
US10086302B2 (en) Doll companion integrating child self-directed execution of applications with cell phone communication, education, entertainment, alert and monitoring systems
Simmons Whoever tells the best story wins: How to use your own stories to communicate with power and impact
Kottler On being a therapist
Noë Strange tools: Art and human nature
Kriete et al. The morning meeting book
Fernyhough The voices within: The history and science of how we talk to ourselves
Argyle The social psychology of everyday life
US10512850B2 (en) Three way multidirectional interactive toy
US20170206095A1 (en) Virtual agent
O'Connor et al. Introducing NLP: Psychological skills for understanding and influencing people
Prizant et al. Uniquely human: A different way of seeing autism
Miller Amy, Wendy, and Beth: Learning language in South Baltimore
US9039482B2 (en) Interactive toy apparatus and method of using same
Borg Persuasion 4th edn: The art of influencing people
Lawrence-Lightfoot The good high school: Portraits of character and culture
Albrecht Brain power: Learn to improve your thinking skills
Dunn Shaping the Spiritual Life of Students: A Guide for Youth Workers, Pastors, Teachers Campus Ministers
Tovani I read it, but I don't get it: Comprehension strategies for adolescent readers
Pope Doing school: How we are creating a generation of stressed out, materialistic, and miseducated students
Gutstein et al. Relationship development intervention with children, adolescents and adults: Social and emotional development activities for Asperger syndrome, autism, PDD and NLD
Mayer Personal intelligence: The power of personality and how it shapes our lives
Kalb Beckett in performance
Neisser The perceived self: Ecological and interpersonal sources of self knowledge
Argyle The psychology of interpersonal behaviour

Legal Events

Date Code Title Description
AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1)EPC DATED 04/04/03

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP